In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of neural information processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely to organize behavior so as to ensure survival, and an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, and the embedding of neural systems in the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation and the use of various methods of processing incoming information are all interconnected, the authors base the framework not only on neurobiology and evolutionary theory but also on systems and signal theory. The most important message of the book is that brains have evolved as a whole, and that a description of the parts, although necessary, lets one miss the wood for the trees.
This book presents a proof of universal computation in the Game of Life cellular automaton by using a Turing machine construction. It provides an introduction including background information and an extended review of the literature for Turing machines, counter machines and the relevant patterns in Conway's Game of Life, so that the subject matter is accessible to non-specialists. The book contains a description of the author's Turing machine in Conway's Game of Life, including an unlimited storage tape provided by growing stack structures, and it also presents a fast universal Turing machine designed to allow the working to be demonstrated in a convenient period of time.
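As a companion to the cellular-automaton material, the following is a minimal sketch (not taken from the book) of one generation of Conway's Game of Life in Python with NumPy, using the standard birth-on-3 / survive-on-2-or-3 rules; the glider pattern and board size are illustrative choices.

```python
import numpy as np

def life_step(grid):
    """Apply one generation of Conway's Game of Life to a 2D 0/1 array."""
    # Count the eight neighbours of every cell by summing shifted copies
    # (np.roll gives wrap-around, i.e. toroidal, boundaries).
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth: dead cell with exactly 3 neighbours; survival: live cell with 2 or 3.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# A glider on a small toroidal board, stepped a few generations.
board = np.zeros((8, 8), dtype=int)
board[1, 2] = board[2, 3] = board[3, 1] = board[3, 2] = board[3, 3] = 1
for _ in range(4):
    board = life_step(board)
print(board)
```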
This book focuses on the development of 3D design and implementation methodologies for Tree-based FPGA architecture. It also stresses the need for new and augmented 3D CAD tools to support such designs, i.e. design for 3D, in order to manufacture high-performance 3D integrated circuits and reconfigurable FPGA-based systems. This book was written as a text that covers the foundations of 3D integrated system design and FPGA architecture design. It was written for use in an elective or core course at the graduate level in the fields of Electrical Engineering, Computer Engineering and doctoral research programs. No previous background in 3D integration is required; nevertheless, a fundamental understanding of 2D CMOS VLSI design is required. It is assumed that the reader has taken the core curriculum in Electrical Engineering or Computer Engineering, with courses like CMOS VLSI design, Digital System Design and Microelectronic Circuits being the most important. It is accessible for self-study by senior students and professionals alike.
Elucidating the spatial and temporal dynamics of how things connect has become one of the most important areas of research in the 21st century. Network science now pervades nearly every science domain, resulting in new discoveries in a host of dynamic social and natural systems, including: how neurons connect and communicate in the brain, how information percolates within and among social networks, the evolution of science research through co-authorship networks, the spread of epidemics and many other complex phenomena. Over the past decade, advances in computational power have put the tools of network analysis in the hands of increasing numbers of scientists, enabling more explorations of our world than ever before possible. Information science, social sciences, systems biology, ecosystems ecology, neuroscience and physics all benefit from this movement, which combines graph theory with data sciences to develop and validate theories about the world around us. This book brings together cutting-edge research from the network science field and includes diverse and interdisciplinary topics such as: modeling the structure of urban systems, behavior in social networks, education and learning, data network architecture, structure and dynamics of organizations, crime and terrorism, as well as network topology, modularity and community detection.
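To make the "graph theory plus data science" combination concrete, here is a brief, hedged Python sketch (not from the book) that uses the NetworkX library to detect communities in Zachary's karate club graph via greedy modularity maximization; the dataset and functions are standard NetworkX offerings, not material from this volume.

```python
import networkx as nx
from networkx.algorithms import community

# A classic small social network bundled with NetworkX.
G = nx.karate_club_graph()

# Greedy modularity maximization: repeatedly merge the pair of communities
# that yields the largest increase in modularity.
communities = community.greedy_modularity_communities(G)

print(f"{G.number_of_nodes()} nodes, {G.number_of_edges()} edges")
for i, c in enumerate(communities):
    print(f"community {i}: {sorted(c)}")
print("modularity:", community.modularity(G, communities))
```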
The purpose of law is to protect society from harm by declaring what conduct is criminal and prescribing the punishment to be imposed for such conduct. The pervasiveness of the internet and its anonymous nature make cyberspace a lawless frontier where anarchy prevails. Historically, economic value has been assigned to visible and tangible assets. With the increasing appreciation that intangible data disseminated through an intangible medium can possess economic value, cybercrime is increasingly recognized as an attack on economic assets. Cybercrime, Digital Forensics and Jurisdiction disseminates knowledge for everyone involved in understanding and preventing cybercrime - business entities, private citizens, and government agencies. The book is firmly rooted in the law, demonstrating that a viable strategy to confront cybercrime must be international in scope.
The authors give a clear and systematic introduction to mathematical and computational modelling and to simulation as a universal methodology. The book deals with classes of models and with the variety of ways of describing them, but it is always also concerned with how concrete simulation results can be obtained from models. After a compact refresher on the required mathematical apparatus, the concepts are put into practice using scenarios from areas such as "Playing, deciding, planning" and "Physics in the computer".
This volume is the first of the new series Advances in Dynamics and Delays. It offers the latest advances in the analysis and control of dynamical systems with delays, which arise in many real-world problems. The contributions in this series are a collection across various disciplines, encompassing engineering, physics, biology, and economics, and some are extensions of work presented at IFAC (International Federation of Automatic Control) conferences since 2011. The series is organized into five parts covering the main themes of the contributions: Stability Analysis and Control Design; Networks and Graphs; Time Delay and Sampled-Data Systems; Computational and Software Tools; and Applications. This volume will be a good reference point for researchers and PhD students in the field of delay systems, and for those wanting to learn more about the field; it will also be a resource for control engineers, who will find innovative control methodologies for relevant applications, from both theory and numerical analysis perspectives.
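To illustrate what a time-delay system looks like numerically, here is a small, hedged Python sketch (not drawn from the volume) that integrates the delayed logistic equation dx/dt = r·x(t)·(1 − x(t − τ)) with a simple fixed-step Euler scheme and a history buffer; the parameter values are illustrative only.

```python
import numpy as np

# Delayed logistic equation: dx/dt = r * x(t) * (1 - x(t - tau)).
# For r*tau large enough the equilibrium x = 1 loses stability and the
# solution oscillates, a classic delay-induced effect.
r, tau = 1.8, 1.0          # illustrative parameters
dt, t_end = 0.001, 40.0
steps = int(t_end / dt)
lag = int(tau / dt)        # number of steps corresponding to the delay

x = np.empty(steps + 1)
x[:lag + 1] = 0.5          # constant history on [-tau, 0]

for n in range(lag, steps):
    x_delayed = x[n - lag]
    x[n + 1] = x[n] + dt * r * x[n] * (1.0 - x_delayed)

print("min/max of x over the last quarter of the run:",
      x[3 * steps // 4:].min(), x[3 * steps // 4:].max())
```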
This book is intended for students of computational systems biology with only a limited background in mathematics. Typical books on systems biology merely mention algorithmic approaches, but without offering a deeper understanding. On the other hand, mathematical books are typically unreadable for computational biologists. The authors of the present book have worked hard to fill this gap. The result is not a book on systems biology, but on computational methods in systems biology. This book originated from courses taught by the authors at Freie Universität Berlin. The guiding idea of the courses was to convey those mathematical insights that are indispensable for systems biology, teaching the necessary mathematical prerequisites by means of many illustrative examples and without any theorems. The three chapters cover the mathematical modelling of biochemical and physiological processes, numerical simulation of the dynamics of biological networks and identification of model parameters by means of comparisons with real data. Throughout the text, the strengths and weaknesses of numerical algorithms with respect to various systems biological issues are discussed. Web addresses for downloading the corresponding software are also included.
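As a flavour of the kind of numerical simulation discussed, here is a brief, hedged Python sketch (not from the book) that integrates the mass-action ODEs of the Michaelis-Menten reaction scheme E + S <-> ES -> E + P with SciPy's solve_ivp; the rate constants and initial concentrations are arbitrary illustrative values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mass-action ODEs for E + S <-> ES -> E + P.
k1, k_1, k2 = 1.0, 0.5, 0.3    # illustrative rate constants

def rhs(t, y):
    e, s, es, p = y
    v_bind, v_unbind, v_cat = k1 * e * s, k_1 * es, k2 * es
    return [
        -v_bind + v_unbind + v_cat,   # dE/dt
        -v_bind + v_unbind,           # dS/dt
         v_bind - v_unbind - v_cat,   # dES/dt
         v_cat,                       # dP/dt
    ]

y0 = [1.0, 10.0, 0.0, 0.0]            # initial concentrations (arbitrary units)
sol = solve_ivp(rhs, (0.0, 50.0), y0, t_eval=np.linspace(0.0, 50.0, 11))

for t, p in zip(sol.t, sol.y[3]):
    print(f"t = {t:5.1f}   product = {p:6.3f}")
```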
This book deals with the growing challenges of using assistive robots in our everyday activities, along with providing intelligent assistive services. The applications presented mainly concern healthcare and wellness, such as helping elderly people, assisting dependent persons, habitat monitoring in smart environments, well-being, security, etc. These applications also reveal new challenges regarding control theory, mechanical design, mechatronics, portability, acceptability, scalability, security, etc.
These contributions, written by the foremost international researchers and practitioners of Genetic Programming (GP), explore the synergy between theoretical and empirical results on real-world problems, producing a comprehensive view of the state of the art in GP. Topics in this volume include: gene expression regulation, novel genetic models for glaucoma, inheritable epigenetics, combinators in genetic programming, sequential symbolic regression, system dynamics, sliding window symbolic regression, large feature problems, alignment in the error space, HUMIE winners, Boolean multiplexer function, and highly distributed genetic programming systems. Application areas include chemical process control, circuit design, financial data mining and bioinformatics. Readers will discover large-scale, real-world applications of GP to a variety of problem domains via in-depth presentations of the latest and most significant results.
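For readers new to the field, here is a deliberately simplified Python sketch (not from the book, and not full genetic programming) of symbolic regression: a mutation-only hill climber over small expression trees that tries to recover y = x**2 + x from samples. Real GP adds a population, crossover and selection; all constants and names here are illustrative.

```python
import random

# Toy symbolic regression over expression trees, illustration only.
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}

def random_tree(depth=2):
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.7 else random.choice([1.0, 2.0, 3.0])
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def squared_error(tree, xs, ys):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys))

def mutate(tree):
    # Either descend into a random child or replace the subtree entirely.
    if isinstance(tree, tuple) and random.random() < 0.7:
        op, left, right = tree
        if random.random() < 0.5:
            return (op, mutate(left), right)
        return (op, left, mutate(right))
    return random_tree(depth=2)

random.seed(1)
xs = [i / 10.0 for i in range(-20, 21)]
ys = [x ** 2 + x for x in xs]

best = random_tree()
best_err = squared_error(best, xs, ys)
for _ in range(5000):
    candidate = mutate(best)
    err = squared_error(candidate, xs, ys)
    if err <= best_err:
        best, best_err = candidate, err

print("best expression:", best)
print("squared error:  ", round(best_err, 6))
```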
Through a series of step-by-step tutorials and numerous hands-on exercises, this book aims to equip the reader with both a good understanding of the importance of space in the abstract world of engineers and the ability to create a model of a product in virtual space – a skill essential for any designer or engineer who needs to present ideas concerning a particular product within a professional environment. The exercises progress logically from the simple to the more complex; while SolidWorks or NX is the software used, the underlying philosophy is applicable to all modeling software. In each case, the explanation covers the entire procedure from the basic idea and production capabilities through to the real model; the conversion from 3D model to 2D manufacturing drawing is also clearly explained. Topics covered include modeling of prismatic, axisymmetric, symmetric and sophisticated shapes; digitization of physical models using modeling software; creation of a CAD model starting from a physical model; free form surface modeling; modeling of product assemblies following bottom-up and top-down principles; and the presentation of a product in accordance with the rules of technical documentation. This book, which includes more than 500 figures, will be ideal for students wishing to gain a sound grasp of space modeling techniques. Academics and professionals will find it to be an excellent teaching and research aid, and an easy-to-use guide.
The essays in this book look at the question of whether physics can be based on information, or – as John Wheeler phrased it – whether we can get “It from Bit”. They are based on the prize-winning essays submitted to the FQXi essay competition of the same name, which drew over 180 entries. The eighteen contributions address topics as diverse as quantum foundations, entropy conservation, nonlinear logic and countable spacetime. Together they provide stimulating reading for all physics aficionados interested in the possible role(s) of information in the laws of nature. The Foundational Questions Institute, FQXi, catalyzes, supports, and disseminates research on questions at the foundations of physics and cosmology, particularly new frontiers and innovative ideas integral to a deep understanding of reality, but unlikely to be supported by conventional funding sources.
This Festschrift in honour of Paul Deheuvels’ 65th birthday compiles recent research results in the area between mathematical statistics and probability theory with a special emphasis on limit theorems. The book brings together contributions from invited international experts to provide an up-to-date survey of the field. Written in textbook style, this collection of original material addresses researchers, PhD and advanced Master students with a solid grasp of mathematical statistics and probability theory.
This book focuses on organization and mechanisms of expert decision-making support using modern information and communication technologies, as well as information analysis and collective intelligence technologies (electronic expertise or simply e-expertise). Chapter 1 (E-Expertise) discusses the role of e-expertise in decision-making processes. The procedures of e-expertise are classified, their benefits and shortcomings are identified and the efficiency conditions are considered. Chapter 2 (Expert Technologies and Principles) provides a comprehensive overview of modern expert technologies. A special emphasis is placed on the specifics of e-expertise. Moreover, the authors study the feasibility and reasonability of employing well-known methods and approaches in e-expertise. Chapter 3 (E-Expertise: Organization and Technologies) describes some examples of up-to-date technologies to perform e-expertise. Chapter 4 (Trust Networks and Competence Networks) deals with the problems of expert finding and grouping by information and communication technologies. Chapter 5 (Active Expertise) treats the problem of expertise stability against any strategic manipulation by experts or coordinators pursuing individual goals. The book addresses a wide range of readers interested in management, decision-making and expert activity in political, economic, social and industrial spheres.
The book offers an original view of channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader to gradually gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of the information sought.
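As a pointer to the underlying machinery, here is a short, hedged Python sketch (not one of the book's constructions) of a classical (7,4) Hamming block code: encoding with a generator matrix and single-error correction via the syndrome of the parity-check matrix, with all arithmetic modulo 2.

```python
import numpy as np

# Systematic (7,4) Hamming code: G = [I_4 | P], H = [P^T | I_3], all mod 2.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def encode(message_bits):
    return (np.array(message_bits) @ G) % 2

def correct(received):
    syndrome = (H @ received) % 2
    if syndrome.any():
        # The column of H that matches the syndrome marks the flipped bit.
        for pos in range(H.shape[1]):
            if np.array_equal(H[:, pos], syndrome):
                received = received.copy()
                received[pos] ^= 1
                break
    return received[:4]  # systematic code: first 4 bits are the message

codeword = encode([1, 0, 1, 1])
noisy = codeword.copy()
noisy[5] ^= 1                      # flip one bit in the channel
print("decoded message:", correct(noisy))
```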
Heritage is everywhere, and an understanding of our past is increasingly critical to the understanding of our contemporary cultural context and place in global society. Visual Heritage in the Digital Age presents the state-of-the-art in the application of digital technologies to heritage studies, with the chapters collectively demonstrating the ways in which current developments are liberating the study, conservation and management of the past. Digital approaches to heritage have developed significantly over recent decades in terms of both the quantity and range of applications. However, rather than merely improving and enriching the ways in which we understand and engage with the past, this technology is enabling us to do this in entirely new ways. The chapters contained within this volume present a broad range of technologies for capturing data (such as high-definition laser scanning survey and geophysical survey), modelling (including GIS, data fusion, agent-based modelling), and engaging with heritage through novel digital interfaces (mobile technologies and the use of multi-touch interfaces in public spaces). The case studies presented include sites, landscapes and buildings from across Europe, North and Central America, and collections relating to the ancient civilisations of the Middle East and North Africa. The chronological span is immense, extending from the end of the last ice age through to the twentieth century. These case studies reveal new ways of approaching heritage using digital tools, whether from the perspective of interrogating historical textual data, or through the applications of complexity theory and the modelling of agents and behaviours. Beyond the data itself, Visual Heritage in the Digital Age also presents fresh ways of thinking about digital heritage. It explores more theoretical perspectives concerning the role of digital data and the challenges that are presented in terms of its management and preservation.
Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability and their ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in these fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book concerns issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro-electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule interpolation, subjective-weights-based meta-learning in multi-criteria decision making, swarm-based heuristics for area exploration, and knowledge-driven adaptive product representations. The last part addresses different problems, issues and methods of applied mathematics. This includes perturbation estimates for invariant subspaces of Hessenberg matrices, uncertainty and nonlinearity modelling by probabilistic metric spaces, and comparison and visualization of the DNA of six primates.
This book offers a comprehensive analysis of the social choice literature and shows, by applying fuzzy sets, how the use of fuzzy preferences, rather than strict ones, may affect the social choice theorems. To do this, the book explores the presupposition of rationality within the fuzzy framework and shows that the two conditions for rationality, completeness and transitivity, do exist with fuzzy preferences. Specifically, this book examines: the conditions under which a maximal set exists; Arrow's theorem; the Gibbard-Satterthwaite theorem; and the median voter theorem. After showing that a non-empty maximal set does exist for fuzzy preference relations, the book goes on to demonstrate the existence of a fuzzy aggregation rule satisfying all five Arrowian conditions, including non-dictatorship. While the Gibbard-Satterthwaite theorem only considers individual fuzzy preferences, this work shows that both individuals and groups can choose alternatives to various degrees, resulting in a social choice that can be both strategy-proof and non-dictatorial. Moreover, the median voter theorem is shown to hold under strict fuzzy preferences but not under weak fuzzy preferences. By providing a standard model of fuzzy social choice and by drawing the necessary connections between the major theorems, this book fills an important gap in the current literature and encourages future empirical research in the field.
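To make the rationality conditions concrete, here is a small, hedged Python sketch (not the book's formalism) that checks a fuzzy preference relation R(a, b) in [0, 1] over three alternatives for completeness (R(a, b) + R(b, a) >= 1, one common convention) and max-min transitivity (R(a, c) >= min(R(a, b), R(b, c))); definitions of both conditions vary across the literature, so treat these as one illustrative choice.

```python
import itertools

# A fuzzy (weak) preference relation over alternatives {x, y, z}:
# R[a][b] is the degree to which a is at least as good as b.
R = {
    'x': {'x': 1.0, 'y': 0.8, 'z': 0.7},
    'y': {'x': 0.4, 'y': 1.0, 'z': 0.7},
    'z': {'x': 0.3, 'y': 0.3, 'z': 1.0},
}
alternatives = list(R)

def is_complete(R):
    # Connectedness: for every pair, the two degrees sum to at least 1.
    return all(R[a][b] + R[b][a] >= 1.0
               for a, b in itertools.combinations(alternatives, 2))

def is_max_min_transitive(R):
    return all(R[a][c] >= min(R[a][b], R[b][c])
               for a, b, c in itertools.permutations(alternatives, 3))

print("complete:           ", is_complete(R))
print("max-min transitive: ", is_max_min_transitive(R))
```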
This volume offers newcomers and experienced practitioners a guide to the state of the art in the fast-evolving field of biometric recognition. It focuses on the emerging strategies for performing biometric recognition under uncontrolled data acquisition conditions. The mainstream research work in this field is presented in an organized manner, so the reader can easily follow the trends that best suit his or her interests in this growing field. The book chapters cover the recent advances in less controlled / covert data acquisition frameworks, segmentation of poor quality biometric data, biometric data quality assessment, normalization of poor quality biometric data, contactless biometric recognition strategies, biometric recognition robustness, data resolution, illumination, distance, pose, motion, occlusions, multispectral biometric recognition, multimodal biometrics, fusion at different levels, and high confidence automatic surveillance.
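As a toy illustration of "fusion at different levels" (not taken from the book), here is a hedged Python sketch of score-level fusion: min-max normalization of two matchers' similarity scores followed by a weighted sum; the scores, weights and threshold are invented for the example.

```python
import numpy as np

# Raw similarity scores from two hypothetical matchers (e.g. face and iris)
# for the same set of probe/gallery comparisons -- invented numbers.
face_scores = np.array([0.62, 0.15, 0.88, 0.40, 0.71])
iris_scores = np.array([310., 120., 450., 200., 390.])   # different scale

def min_max_normalize(scores):
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo)

# Weighted-sum fusion after bringing both matchers to a common [0, 1] scale.
w_face, w_iris = 0.6, 0.4                                # illustrative weights
fused = w_face * min_max_normalize(face_scores) + \
        w_iris * min_max_normalize(iris_scores)

threshold = 0.5                                          # illustrative threshold
for f, i, s in zip(face_scores, iris_scores, fused):
    decision = "accept" if s >= threshold else "reject"
    print(f"face={f:.2f} iris={i:.0f} fused={s:.3f} -> {decision}")
```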
Cryptography has been employed in war and diplomacy from the time of Julius Caesar. In our Internet age, cryptography's most widespread application may be for commerce, from protecting the security of electronic transfers to guarding communication from industrial espionage. This accessible introduction for undergraduates explains the cryptographic protocols for achieving privacy of communication and the use of digital signatures for certifying the validity, integrity, and origin of a message, document, or program. Rather than offering a how-to on configuring web browsers and e-mail programs, the author provides a guide to the principles and elementary mathematics underlying modern cryptography, giving readers a look under the hood for security techniques and the reasons they are thought to be secure.
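In the spirit of the "elementary mathematics under the hood", here is a short, hedged Python sketch (not the book's exposition) of textbook RSA signing and verification with deliberately tiny primes and no padding; real systems use large keys and padding schemes, so this is purely illustrative. Requires Python 3.8+ for the modular inverse via pow.

```python
# Textbook RSA with toy parameters -- illustration only, never use in practice.
p, q = 61, 53
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse (Python 3.8+)

def sign(message_int, d, n):
    # The "signature" is the message raised to the private exponent mod n.
    return pow(message_int, d, n)

def verify(message_int, signature, e, n):
    # Anyone can check the signature with the public key (e, n).
    return pow(signature, e, n) == message_int

m = 2025 % n                  # a small integer "message" reduced mod n
sig = sign(m, d, n)
print("signature:", sig)
print("valid?   ", verify(m, sig, e, n))
print("tampered?", verify((m + 1) % n, sig, e, n))
```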
This book presents the latest scientific research related to the field of robotics. It covers topics such as biomedicine, energy efficiency, home automation and robotics. The book is written by technical experts and researchers from academia and industry working on robotics applications, and can be used as supplementary material for courses related to robotics and domotics.
Computer simulation-based education and training is a multi-billion dollar industry. With the increased complexity of organizational decision making, projected demand for computer simulation-based decision aids is on the rise. The objective of this book is to systematically enhance our understanding of, and gain insights into, the general process by which human-facilitated ILEs are effectively designed and used in improving users' decision making in dynamic tasks. The book is divided into four major parts. Part I serves as an introduction to the subject of "decision making in dynamic tasks", its importance and its complexity. Part II provides background material, drawing upon the relevant literature, for the development of an integrated process model of the effectiveness of human-facilitated ILEs in improving decision making in dynamic tasks. Part III focuses on the design, development and application of the Fish Bank ILE, in laboratory experiments, to gather empirical evidence for the validity of the process model. Finally, Part IV presents a comprehensive analysis of the gathered data to provide a powerful basis for understanding important phenomena of training with human-facilitated simulation-based learning environments, thereby helping to derive the critical lessons to be learned. The book provides the reader with both a comprehensive understanding of the phenomena encountered in decision making with human-facilitated ILEs and a unique way of studying the effects of these phenomena on people's ability to make better decisions in complex, dynamic tasks. It is intended to be of use to managers and practitioners, researchers and students of dynamic decision making. The background material of Part II provides a solid base for understanding and organizing the existing experimental research literature and approaches.
This volume presents the proceedings of the 11th International Conference on the Design of Cooperative Systems (COOP 2014). The conference is a venue for multidisciplinary research contributing to the design, assessment and analysis of cooperative systems and their integration in organizations, public venues, and everyday life. COOP emerged from the European tradition of Computer Supported Cooperative Work (CSCW) and Cognitive Ergonomics as practiced in France. These proceedings are a collection of 28 papers reflecting the variety of research activities in the field, as well as an increasing interest in investigating the use and design of ICT in all aspects of everyday life and society, and not merely in the workplace. The papers represent a variety of research topics, from healthcare to sustainable mobility to disaster response, in settings from all over the world. For the first time, the proceedings include papers presented in an Early-Career Researchers Track which was organized in order to give young researchers the opportunity to discuss their work with an international community. This collection of papers provides a picture of new developments and classic topics of research around cooperative systems, based on the principle that a deep knowledge of cooperative practices is a key to understanding technology impacts and producing quality designs. The articles presented will appeal to researchers and practitioners alike, as they combine an understanding of the nature of work with the possibilities offered by novel digital technologies.
With the onset of massive cosmological data collection through media such as the Sloan Digital Sky Survey (SDSS), galaxy classification has been accomplished for the most part with the help of citizen science communities like Galaxy Zoo. Seeking the wisdom of the crowd for such Big Data processing has proved extremely beneficial. However, an analysis of one of the Galaxy Zoo morphological classification data sets has shown that a significant majority of all classified galaxies are labelled as "Uncertain". This book reports on how to use data mining, more specifically clustering, to identify galaxies about which the public has shown some degree of uncertainty as to whether they belong to one morphology type or another. The book shows the importance of transitions between different data mining techniques in an insightful workflow. It demonstrates that clustering makes it possible to identify discriminating features in the analysed data sets, adopting a novel feature selection algorithm called Incremental Feature Selection (IFS). The book shows the use of state-of-the-art classification techniques, Random Forests and Support Vector Machines, to validate the acquired results. It is concluded that a vast majority of these galaxies are, in fact, of spiral morphology, with a small subset potentially consisting of stars, elliptical galaxies or galaxies of other morphological variants.
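As a generic illustration of the clustering-then-classification workflow (not the book's pipeline or data), here is a hedged Python sketch using scikit-learn: K-means clusters synthetic feature vectors, and a Random Forest is cross-validated on the resulting cluster labels; every dataset, feature and parameter here is invented.

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for "uncertain" galaxy feature vectors (invented data).
X, _ = make_blobs(n_samples=600, n_features=8, centers=3, random_state=0)

# Step 1: unsupervised clustering proposes candidate morphology groups.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
cluster_labels = kmeans.fit_predict(X)

# Step 2: a supervised classifier checks how separable those groups are;
# high cross-validated accuracy suggests the clusters are well defined.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(forest, X, cluster_labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```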
This book introduces a novel transcoding algorithm for real-time video applications, designed to overcome inter-operability problems between MPEG-2 and H.264/AVC. The new algorithm achieves a 92.8% reduction in the transcoding run time at the price of an acceptable Peak Signal-to-Noise Ratio (PSNR) degradation, enabling readers to use it for real-time video applications. The algorithm described is evaluated through simulation and experimental results. In addition, the authors present a hardware implementation of the new algorithm using Field Programmable Gate Arrays (FPGA) and Application-Specific Integrated Circuits (ASIC).
• Describes a novel transcoding algorithm for real-time video applications, designed to overcome inter-operability problems between MPEG-2 and H.264/AVC;
• Implements the presented algorithm using Field Programmable Gate Array (FPGA) and Application-Specific Integrated Circuit (ASIC) technology;
• Demonstrates the solution to real problems, with verification through simulation and experimental results.
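Since the blurb quantifies quality loss in terms of PSNR, here is a brief, hedged Python sketch (unrelated to the book's implementation) of how PSNR between a reference frame and a transcoded frame is conventionally computed for 8-bit video; the random frames stand in for real video data.

```python
import numpy as np

def psnr(reference, test, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB between two same-sized 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical frames
    return 10.0 * np.log10(max_value ** 2 / mse)

# Stand-in frames: a random "reference" and a slightly perturbed "transcoded" copy.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(720, 1280), dtype=np.uint8)
noise = rng.normal(0.0, 2.0, size=reference.shape)
transcoded = np.clip(reference + noise, 0, 255).astype(np.uint8)

print(f"PSNR: {psnr(reference, transcoded):.2f} dB")
```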