Since its first volume in 1960, Advances in Computers has
presented detailed coverage of innovations in computer hardware,
software, theory, design, and applications. It has also provided
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles usually allow.
As a result, many articles have become standard references that
continue to be of significant, lasting value in this rapidly
expanding field.
This book is intended to make recent results on the derivation of higher order numerical schemes for random ordinary differential equations (RODEs) available to a broader readership, and to familiarize readers with RODEs themselves as well as the closely associated theory of random dynamical systems. In addition, it demonstrates how RODEs are being used in the biological sciences, where non-Gaussian and bounded noise are often more realistic than the Gaussian white noise in stochastic ordinary differential equations (SODEs). RODEs are used in many important applications and play a fundamental role in the theory of random dynamical systems. They can be analyzed pathwise with deterministic calculus, but require treatment beyond that of classical ODE theory due to the lack of smoothness in their time variable. Although classical numerical schemes for ODEs can be used pathwise for RODEs, they rarely attain their traditional order, since the solutions of RODEs do not have sufficient smoothness to admit Taylor expansions in the usual sense. However, Taylor-like expansions can be derived for RODEs through iterated application of the appropriate chain rule in integral form; these represent the starting point for the systematic derivation of consistent higher order numerical schemes for RODEs. The book is directed at a wide range of readers in applied and computational mathematics and related areas, as well as readers interested in the applications of mathematical models involving random effects, in particular in the biological sciences. The level of this book is suitable for graduate students in applied mathematics and related areas, computational sciences, and systems biology. A basic knowledge of ordinary differential equations and numerical analysis is required.
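For orientation, a RODE has the pathwise form (our notation, not necessarily the book's):

    \frac{dx}{dt} = f\big(x(t), \omega_t\big),

where \omega_t is a given stochastic process, for example an Ornstein-Uhlenbeck process. For each fixed sample path the equation is an ordinary, non-autonomous ODE solvable with deterministic calculus, but t \mapsto \omega_t is typically only Hölder continuous, so the right-hand side is not smooth in t and classical Taylor expansions of the solution break down, which is why the Taylor-like expansions mentioned above are needed.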
This book presents a comprehensive study of the tools and techniques available for performing network forensics. It also reviews various aspects of network forensics, along with related technologies and their limitations, helping security practitioners and researchers better understand the problem, the current solution space, and the scope of future research into detecting and investigating network intrusions efficiently. Forensic computing is rapidly gaining importance, since the amount of crime involving digital systems is steadily increasing, yet the area is still underdeveloped and poses many technical and legal challenges. The rapid development of the Internet over the past decade appears to have facilitated an increase in online attacks. Many factors embolden attackers: the speed with which an attack can be carried out, the anonymity the medium provides, the nature of a medium in which digital information can be stolen without actually being removed, the increased availability of potential victims, and the global impact of the attacks.

Forensic analysis is performed at two different levels: computer forensics and network forensics. Computer forensics deals with the collection and analysis of data from computer systems, networks, communication streams, and storage media in a manner admissible in a court of law. Network forensics deals with the capture, recording, and analysis of network events in order to discover evidential information about the source of security attacks. Network forensics is not another term for network security; it is an extended phase of network security, in that the data for forensic analysis are collected from security products such as firewalls and intrusion detection systems, and the results of this data analysis are utilized for investigating the attacks. Network forensics generally refers to the collection and analysis of network data such as network traffic, firewall logs, IDS logs, etc. Technically, it is a member of the existing and expanding field of digital forensics, and has been defined as "the use of scientifically proven techniques to collect, fuse, identify, examine, correlate, analyze, and document digital evidence from multiple, actively processing and transmitting digital sources for the purpose of uncovering facts related to the planned intent, or measured success, of unauthorized activities meant to disrupt, corrupt, and/or compromise system components, as well as providing information to assist in response to or recovery from these activities."

Network forensics plays a significant role in the security of today's organizations. It helps uncover the details of external attacks so that similar future attacks can be thwarted; it is essential for investigating insider abuse, which constitutes the second costliest type of attack within organizations; and law enforcement requires it for crimes in which a computer or digital system is either the target of a crime or used as a tool in carrying one out. Network security protects the system against attack, while network forensics focuses on recording evidence of the attack. Network security products are generalized, look for possibly harmful behaviors, and perform this monitoring continuously throughout the day.
Network forensics, by contrast, involves post-mortem investigation of the attack and is initiated after crime notification. Many tools assist in capturing data transferred over networks so that an attack, or the malicious intent behind an intrusion, can be investigated, and various network forensic frameworks have been proposed in the literature.
This book questions the relevance of computation to the physical universe. Our theories deliver computational descriptions, but the gaps and discontinuities in our grasp suggest a need for continued discourse between researchers from different disciplines, and this book is unique in its focus on the mathematical theory of incomputability and its relevance for the real world. The core of the book consists of thirteen chapters in five parts on extended models of computation; the search for natural examples of incomputable objects; mind, matter, and computation; the nature of information, complexity, and randomness; and the mathematics of emergence and morphogenesis. This book will be of interest to researchers in the areas of theoretical computer science, mathematical logic, and philosophy.
In this book the editors have gathered a number of contributions by researchers working on problems of Cognitive Technology (CT). The present collection initiates explorations of the human mind via the technologies the mind produces. These explorations take as their point of departure the question: what happens when humans produce new technologies? Two interdependent perspectives from which such production can be approached are adopted:
- how and why constructs that have their origins in human mental life are embodied in physical environments when people fabricate their habitat, even to the point of those constructs becoming that very habitat; and
- how and why these fabricated habitats affect, and feed back into, human mental life.
The aim of the CT research programme is to determine, in general, which technologies, and in particular, which interactive computer-based technologies, are humane with respect to the cognitive development and evolutionary adaptation of their end users. But what does it really mean to be humane in a technological world? To shed light on this central issue, other pertinent questions are raised, e.g.:
- Why are human minds externalised, i.e., what purpose does the process of externalisation serve?
- What can we learn about the human mind by studying how it externalises itself?
- How does the use of externalised mental constructs (the objects we call 'tools') change people fundamentally?
- To what extent does human interaction with technology serve as an amplification of human cognition, and to what extent does it lead to an atrophy of the human mind?
The book calls for a reflection on what a tool is. Strong parallels between CT and environmentalism are drawn: both are seen as trends originating in our need to understand how we manipulate, by means of the tools we have created, our natural habitat, consisting on the one hand of the cognitive environment which generates thought and determines action, and on the other of the physical environment in which thought and action are realised. Both trends endeavour to protect the human habitat from the unwanted or uncontrolled impact of technology, and both are ultimately concerned with the ethics and aesthetics of tool design and tool use. Among the topics selected by the contributors to the book, the following themes emerge (the list is not exhaustive): using technology to empower the cognitively impaired; the ethics versus the aesthetics of technology; the externalisation of emotive and affective life and its special dialectic ('mirror') effects; creativity enhancement, cognitive space, and problem tractability; the externalisation of sensory life and mental imagery; the engineering and modelling aspects of externalised life; externalised communication channels and inner dialogue; externalised learning protocols; and relevance analysis as a theoretical framework for cognitive technology.
This book reports on the latest advances and applications of chaotic systems. It consists of 25 contributed chapters by experts specializing in the topics addressed. The chapters cover a broad range of topics in chaotic systems, such as chaos, hyperchaos, jerk systems, hyperjerk systems, conservative and dissipative systems, circulant chaotic systems, multi-scroll chaotic systems, finance chaotic systems, highly chaotic systems, chaos control, chaos synchronization, circuit realization, and applications of chaos theory in secure communications, mobile robots, memristors, cellular neural networks, etc. Special importance is given to chapters offering practical solutions, modeling, and novel control methods for recent research problems in chaos theory. The book will serve as a reference for graduate students and researchers with a basic knowledge of chaos theory and control systems. The design procedures for the chaotic systems are illustrated using MATLAB.
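As a minimal illustration of the kind of simulation such chapters build on, here is a sketch in Python (rather than the book's MATLAB) that integrates the classical Lorenz system with a fixed-step fourth-order Runge-Kutta scheme; the parameter values are the standard chaotic ones, not necessarily those used in the book:

    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # Classical Lorenz equations, a standard dissipative chaotic system.
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def rk4_step(f, s, dt):
        # One fourth-order Runge-Kutta step.
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    state = np.array([1.0, 1.0, 1.0])
    for _ in range(5000):  # 50 time units at dt = 0.01
        state = rk4_step(lorenz, state, 0.01)
    print(state)  # a point near the chaotic attractor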
The development of social technologies has brought about a new era of political planning and government interaction. In addition to reducing the costs of city resource management, ICT and social media can be used in emergency situations as a mechanism for citizen engagement, to facilitate public administration communication, and more. In spite of these advantages, the application of technologies by governments and the public sector has also sparked debate over cybersecurity, owing to the vulnerabilities and risks that can befall different stakeholders. It is necessary to review the most recent research on the implementation of ICTs in the public sector with the aim of understanding both the strengths and the vulnerabilities that the management models can entail. Special Applications of ICTs in Digital Government and the Public Sector: Emerging Research and Opportunities is a collection of innovative research on the methods and applications of ICT implementation in the public sector that allows readers to understand how ICTs have forced public administrations to reform both their workflow and their means of interacting with citizens. Highlighting topics including e-government, emergency communications, and urban planning, this book is ideally designed for government officials, public administrators, public managers, policymakers, public consultants, professionals, academicians, students, and researchers seeking current research on the digital communication channels between elected officials and the citizens they represent.
Useful to healthcare providers, severity indices indicate which patients are most at risk of infection, as well as the intensity of illness while in the hospital. "Text Mining Techniques for Healthcare Provider Quality Determination: Methods for Rank Comparisons" discusses the general practice of defining a patient severity index for risk adjustment and for comparing patient outcomes to assess quality factors. This Premier Reference Source examines the consequences of patient severity models and investigates the general assumptions required to perform standard severity adjustment.
From finance to artificial intelligence, genetic algorithms are a powerful tool with a wide array of applications. But you don't need an exotic new language or framework to get started; you can learn about genetic algorithms in a language you're already familiar with. Join us for an in-depth look at the algorithms, techniques, and methods that go into writing a genetic algorithm. From introductory problems to real-world applications, you'll learn the underlying principles of problem solving using genetic algorithms. Evolutionary algorithms are a unique and often overlooked subset of machine learning and artificial intelligence. Because of this, most of the available resources are outdated or too academic in nature, and none of them are made with Elixir programmers in mind. Start from the ground up with genetic algorithms in a language you are familiar with. Discover the power of genetic algorithms through simple solutions to challenging problems. Use Elixir features to write genetic algorithms that are concise and idiomatic. Learn the complete life cycle of solving a problem using genetic algorithms. Understand the different techniques and fine-tuning required to solve a wide array of problems. Plan, test, analyze, and visualize your genetic algorithms with real-world applications. Open your eyes to a unique and powerful field, without having to learn a new language or framework. What You Need: macOS, Windows, or a Linux distribution, with an up-to-date Elixir installation.
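The life cycle referred to above (evaluate, select, cross over, mutate, repeat) fits in a few lines. The book works in Elixir, so this Python version on the classic OneMax problem is only a language-agnostic sketch of the same idea:

    import random

    def fitness(chromosome):
        # OneMax: count the 1 bits; a standard introductory GA problem.
        return sum(chromosome)

    def select(population):
        # Tournament selection: the fitter of two random individuals.
        a, b = random.sample(population, 2)
        return max(a, b, key=fitness)

    def crossover(p1, p2):
        # Single-point crossover.
        point = random.randint(1, len(p1) - 1)
        return p1[:point] + p2[point:]

    def mutate(chromosome, rate=0.01):
        # Flip each gene independently with a small probability.
        return [1 - g if random.random() < rate else g for g in chromosome]

    population = [[random.randint(0, 1) for _ in range(32)] for _ in range(100)]
    for _ in range(100):  # one hundred generations
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(len(population))]
    print(fitness(max(population, key=fitness)))  # approaches 32 as the population converges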
Collected together in this book are ten state-of-the-art expository articles on the most important topics in optimization, written by leading experts in the field. The book therefore provides a primary reference for those performing research in some area of optimization, or for those who have some basic knowledge of optimization techniques but wish to learn the most up-to-date and efficient algorithms for particular classes of problems. The first sections of each chapter are expository and therefore accessible to master's level graduate students. However, the chapters also contain advanced material on current topics of interest to researchers. For instance, there are chapters describing the polynomial-time linear programming algorithms of Khachian and Karmarkar, and the techniques used to solve combinatorial and integer programming problems an order of magnitude larger than was possible just a few years ago. Overall, the book presents a comprehensive yet lively and up-to-date discussion of the state of the art in optimization.
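For context, both of the named algorithms solve linear programs, which in standard form read (standard notation, not a quotation from the book):

    \min_{x \in \mathbb{R}^n} \; c^{\top} x \quad \text{subject to} \quad A x = b, \; x \ge 0.

Khachian's ellipsoid method and Karmarkar's interior-point method both solve such problems in time polynomial in the bit size of the input, in contrast to the exponential worst case of the simplex method.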
This book constitutes the refereed post-conference proceedings of the 10th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2016, held in Dongying, China, in October 2016. The 55 revised papers presented were carefully reviewed and selected from 128 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including intelligent sensing, cloud computing, key technologies of the Internet of Things, precision agriculture, animal husbandry information technology (including Internet + modern animal husbandry, livestock big data platforms and cloud computing applications, intelligent breeding equipment, and precision production models), and water product networking and big data (including fishery IoT, intelligent aquaculture facilities, and big data applications).
As computers have infiltrated virtually every facet of our lives, so has computer science influenced nearly every academic subject in science, engineering, medicine, social science, the arts, and humanities. Michael Knee offers a selective guide to the major resources and tools central to the computer industry: teaching institutions, research institutes and laboratories, manufacturers, standardization organizations, professional associations and societies, and publishers. He begins with a discussion of the three subject classification systems most commonly used to describe, index, and manage computer science information: the Association for Computing Machinery, Inspec, and the Library of Congress. An annotated bibliography of over 500 items follows, grouped by material type and featuring a mix of classic works and current sources.
This book shows cognitive scientists in training how mathematics, computer science, and science can be usefully and seamlessly intertwined. A follow-up to the first two volumes on mathematics for cognitive scientists, it covers the mathematics and computational tools needed to understand how to compute the terms in the Fourier series expansions that solve the cable equation. The latter is derived from first principles by going back to cellular biology and the relevant biophysics. A detailed discussion of ion movement through cellular membranes is included, along with an explanation of how the equations governing such ion movement lead to the standard transient cable equation. The cable model is then solved using separation of variables, with an explanation of why Fourier series converge and a description of MATLAB tools for computing the solutions. Finally, the standard Hodgkin-Huxley model of an excitable neuron is developed and solved using MATLAB.
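For orientation, the transient cable equation referred to above is commonly written as (our notation; the book's symbols may differ):

    \lambda^2 \frac{\partial^2 V}{\partial x^2} = \tau \frac{\partial V}{\partial t} + V,

where V(x, t) is the deviation of the membrane potential from rest, \lambda is the space constant, and \tau is the membrane time constant. Separation of variables, V(x, t) = X(x) T(t), together with the boundary conditions, produces exactly the Fourier series expansions the book teaches readers to compute.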
This book gathers threads that have evolved across different mathematical disciplines into a seamless narrative. It treats condition as a central element in understanding the performance, regarding both stability and complexity, of numerical algorithms. While the role of condition was shaped over the last half-century, there has so far been no monograph treating the subject in a uniform and systematic way. The book places special emphasis on the probabilistic analysis of numerical algorithms via the analysis of the corresponding condition. The level of exposition increases through the book, starting in the context of linear algebra at an undergraduate level and reaching, in its third part, the recent developments and partial solutions for Smale's 17th problem, which can be explained within a graduate course. The middle part contains a condition-based course on linear programming that fills a gap between the current elementary expositions of the subject based on the simplex method and those focusing on convex programming.
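As a one-formula anchor for what "condition" means here (the standard linear-algebra instance, ours rather than a quotation from the book): for an invertible matrix A, the condition number with respect to inversion is

    \kappa(A) = \|A\| \, \|A^{-1}\|,

and it governs the worst-case amplification of relative errors when solving A x = b:

    \frac{\|\delta x\|}{\|x\|} \le \kappa(A) \, \frac{\|\delta b\|}{\|b\|}.

Probabilistic analyses of the kind the book emphasizes study the distribution of \kappa(A) when the input A is drawn at random.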
A groundbreaking treatise by one of the great mathematicians of our age, who outlines a style of thinking by which great ideas are conceived. What inspires and spurs on a great idea? Can we train ourselves to think in a way that will enable world-changing understandings and insights to emerge? Richard Hamming said we can. He first inspired a generation of engineers, scientists, and researchers in 1986 with "You and Your Research," an electrifying sermon on why some scientists do great work, why most don't, why he did, and why you can, and should, too. The Art of Doing Science and Engineering is the full expression of what "You and Your Research" outlined. It is a book about thinking; more specifically, about a style of thinking by which great ideas are conceived. The book is filled with stories of great people performing mighty deeds, but they are not meant simply to be admired. Instead, they are to be aspired to, learned from, and surpassed. Hamming consistently returns to Shannon's information theory, Einstein's theory of relativity, Grace Hopper's work on high-level programming, Kaiser's work on digital filters, and his own work on error-correcting codes. He also recounts a number of his spectacular failures as clear examples of what to avoid. Originally published in 1996 and adapted from a course that Hamming taught at the US Naval Postgraduate School, this edition includes an all-new foreword by the designer, engineer, and founder of Dynamicland, Bret Victor, plus more than 70 redrawn graphs and charts. The Art of Doing Science and Engineering is a reminder that the capacities for learning and creativity are accessible to everyone. Hamming was as much a teacher as a scientist, and having spent a lifetime forming and confirming a theory of great people and great ideas, he prepares the next generation for even greater distinction.
With the proliferation of Software-as-a-Service (SaaS) offerings, it is becoming increasingly important for individual SaaS providers to operate their services at a low cost. This book investigates SaaS from the perspective of the provider and shows how operational costs can be reduced by using "multi-tenancy," a technique for consolidating a large number of customers onto a small number of servers. Specifically, the book addresses multi-tenancy at the database level, focusing on in-memory column databases, which are the backbone of many important new enterprise applications. Efficiently implementing multi-tenancy in a farm of databases poses two fundamental challenges: (i) workload modeling and (ii) data placement. The first involves estimating the (shared) resource consumption of multi-tenancy on a single in-memory database server. The second consists of assigning tenants to servers in a way that minimizes the number of required servers (and thus costs) based on the assumed workload model; this step also entails replicating tenants for performance and high availability. The book presents novel solutions to both problems.
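The data placement step described above is, at its core, a bin-packing problem. Here is a minimal first-fit-decreasing sketch in Python; the single-number load model and the capacity value are illustrative assumptions, far simpler than the book's shared-resource workload model:

    def place_tenants(tenant_loads, server_capacity):
        # First-fit decreasing: place each tenant (largest first) on the
        # first server with room, opening a new server only when none fits.
        servers = []  # each server is a list of tenant loads
        for load in sorted(tenant_loads, reverse=True):
            for server in servers:
                if sum(server) + load <= server_capacity:
                    server.append(load)
                    break
            else:
                servers.append([load])
        return servers

    # Illustrative loads (e.g., requests/sec per tenant), capacity 100.
    assignment = place_tenants([40, 10, 60, 30, 25, 55], 100)
    print(len(assignment), assignment)  # number of servers and the packing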
Learn to Create and Write Your Own Apps
Do you have a great idea for an app or a game? Would you like to make your dream a reality? Do you need the tools and skills to start making your own apps? When you purchase Swift Programming Guide: Create a Fully Functioning App in a Day, you'll learn how to make your own apps and programs right away! These fun and easy tips transform the dreaded chore of learning programming code into a fun hobby. You'll be proud to show off your creations to your friends, coworkers, and family! Would you like to know more about: Playgrounds? Classes and Methods? Arrays and For Loops? Creating Your First iOS App? Storyboards and Interface Builders? This helpful book explains how to use Xcode and Apple's new coding language, Swift, to create amazing new products. It takes you step-by-step through the process of writing your first app! Download Swift Programming Guide: Create a Fully Functioning App in a Day now, and start making your own apps TODAY!
This book opens the door to the new, interesting, and ambitious world of reversible and quantum computing research, and presents the state of the art required to travel around that world safely. Top universities, companies, and government institutions around the world are racing to develop new methodologies, algorithms, and circuits in reversible logic, quantum logic, reversible and quantum computing, and nanotechnologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single volume, along with some new proposals, and sequential reversible logic circuitry is discussed for the first time in book form. Reversible logic plays an important role in quantum computing, as any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing; to this end, a new implementation of wavelet and multiwavelet transforms using quantum computing is presented. Researchers in academia or industry and graduate students who work in logic synthesis, quantum computing, nanotechnology, and low-power VLSI circuit design will be interested in this book.
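As a tiny concrete instance of reversible logic (our illustration, not taken from the book): the Toffoli gate, a standard universal reversible gate, permutes the 3-bit state space and is its own inverse, so no information is destroyed. A quick check in Python:

    def toffoli(a, b, c):
        # CCNOT: flip the target bit c iff both control bits a and b are 1.
        return (a, b, c ^ (a & b))

    states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    images = [toffoli(*s) for s in states]
    assert sorted(images) == states  # a bijection: the gate loses no information
    assert all(toffoli(*toffoli(*s)) == s for s in states)  # self-inverse
    print(images)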
The papers in this volume represent the most timely and advanced contributions to the 2014 Joint Applied Statistics Symposium of the International Chinese Statistical Association (ICSA) and the Korean International Statistical Society (KISS), held in Portland, Oregon. The contributions cover new developments in statistical modeling and clinical research, including model development, model checking, and innovative clinical trial design and analysis. Each paper was peer-reviewed by at least two referees and an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including North America, Asia, and Europe, and offered 3 keynote speeches, 7 short courses, 76 parallel scientific sessions, student paper sessions, and social events.
Evolutionary algorithms constitute a class of well-known algorithms designed on the basis of the Darwinian theory of evolution and the Mendelian theory of heredity. They are partly random and partly deterministic in nature, which makes their performance on complex nonlinear problems challenging to predict and control. Recently, the study of evolutionary dynamics has focused not only on traditional investigations but also on understanding and analyzing new principles, with the intention of controlling and utilizing their properties and performance in more effective real-world applications. This book, based on many years of the authors' intensive research, proposes novel ideas for advancing evolutionary dynamics towards new phenomena, including many new topics, even the dynamics of equivalent social networks. It covers more advanced complex networks and incorporates them with coupled map lattices (CMLs), which are usually used for the simulation and analysis of spatiotemporal complex systems, based on the observation that chaos in CMLs can be controlled, and so can evolutionary dynamics. The chapter authors are, to the best of our knowledge, the originators of the ideas mentioned above and researchers on evolutionary algorithms, chaotic dynamics, and complex networks, and their contributions will benefit readers interested in modern scientific research on these subjects.
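A coupled map lattice of the kind mentioned above couples a local chaotic map, commonly the logistic map, across lattice sites. This Python sketch uses illustrative parameter choices, not ones taken from the book:

    import numpy as np

    def logistic(x, r=4.0):
        # Fully chaotic logistic map at r = 4.
        return r * x * (1.0 - x)

    def cml_step(lattice, eps=0.3):
        # Diffusively coupled map lattice: each site mixes its own chaotic
        # update with its neighbors' (periodic boundary conditions).
        f = logistic(lattice)
        return (1 - eps) * f + 0.5 * eps * (np.roll(f, 1) + np.roll(f, -1))

    lattice = np.random.default_rng(0).random(64)
    for _ in range(1000):
        lattice = cml_step(lattice)
    print(lattice[:8])  # a snapshot of the spatiotemporal pattern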
This book provides formal and informal definitions and taxonomies for self-aware computing systems, and explains how self-aware computing relates to many existing subfields of computer science, especially software engineering. It describes architectures and algorithms for self-aware systems as well as the benefits and pitfalls of self-awareness, and reviews much of the latest relevant research across a wide array of disciplines, including open research challenges. The chapters of this book are organized into five parts: Introduction, System Architectures, Methods and Algorithms, Applications and Case Studies, and Outlook. Part I offers an introduction that defines self-aware computing systems from multiple perspectives, and establishes a formal definition, a taxonomy and a set of reference scenarios that help to unify the remaining chapters. Next, Part II explores architectures for self-aware computing systems, such as generic concepts and notations that allow a wide range of self-aware system architectures to be described and compared with both isolated and interacting systems. It also reviews the current state of reference architectures, architectural frameworks, and languages for self-aware systems. Part III focuses on methods and algorithms for self-aware computing systems by addressing issues pertaining to system design, like modeling, synthesis and verification. It also examines topics such as adaptation, benchmarks and metrics. Part IV then presents applications and case studies in various domains including cloud computing, data centers, cyber-physical systems, and the degree to which self-aware computing approaches have been adopted within those domains. Lastly, Part V surveys open challenges and future research directions for self-aware computing systems. It can be used as a handbook for professionals and researchers working in areas related to self-aware computing, and can also serve as an advanced textbook for lecturers and postgraduate students studying subjects like advanced software engineering, autonomic computing, self-adaptive systems, and data-center resource management. Each chapter is largely self-contained, and offers plenty of references for anyone wishing to pursue the topic more deeply.