A computer that thinks like a person has long been the dream of computer designers. The author uses his 35 years of computer design experience to describe the mechanisms of a thinking computer. These mechanisms include recall, recognition, learning, doing procedures, speech, vision, attention, intelligence, and consciousness. Included are experiments that demonstrate the mechanisms described. The experiments use software that the reader can download from the internet and run on his or her personal computer (PC). The software includes a large engram file containing knowledge we use on a daily basis. Additional experiments allow the reader to write and run new engrams. The computer architecture of the human brain is first described. Standard methods of computer design are next used to convert the architecture into thinking computer implementations spanning a range of performance levels. Lastly, the operation of a thinking computer is presented.
Artificial neural networks (ANNs) are computer-based systems designed to simulate the learning process of neurons in the human brain. ANNs have attracted great interest during the last decade as predictive models and pattern recognition tools. Artificial neural networks possess the ability to "learn" from a set of experimental data without actual knowledge of the physical and chemical laws that govern the system. ANNs are therefore well suited to data treatment, especially where systems exhibit non-linearities and complex behaviour. This book describes the application of artificial neural networks to the modelling of water and wastewater treatment processes.
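To give a flavour of the "learning from data" idea described above, here is a minimal sketch (not taken from the book): a single perceptron, the simplest trainable unit, learns the logical AND function from input/output examples alone, with no explicit rule programmed in.

```python
# Minimal illustration (not from the book): a perceptron learns AND
# purely from examples, using the classic perceptron learning rule.

def train_perceptron(samples, lr=0.1, epochs=20):
    w = [0.0, 0.0]  # weights
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # nudge weights and bias toward the target output
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, *x) for x, _ in data])  # reproduces the AND targets [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this training loop finds a correct weight vector; real ANN applications like those in the book use multi-layer networks and continuous activations, but the underlying idea of fitting weights to data is the same.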
Gain a new perspective on how the brain works and inspires new avenues for design in computer science and engineering. This unique book is the first of its kind to introduce human memory and basic cognition in terms of physical circuits, beginning with the possibilities of ferroelectric behavior of neural membranes, moving to the logical properties of neural pulses recognized as solitons, and finally exploring the architecture of cognition itself. It encourages invention via the methodical study of brain theory, including electrically reversible neurons, neural networks, associative memory systems within the brain, neural state machines within associative memory, and reversible computers in general. These models use standard analog and digital circuits that, in contrast to models that include non-physical components, may be applied directly toward the goal of constructing a machine with artificial intelligence based on patterns of the brain. Writing from the circuits and systems perspective, the author reaches across specialized disciplines including neuroscience, psychology, and physics to achieve uncommon coverage of: neural membranes; neural pulses and neural memory; circuits and systems for memorizing and recalling; dendritic processing and human learning; artificial learning in artificial neural networks; the asset of reversibility in man and machine; electrically reversible nanoprocessors; reversible arithmetic; Hamiltonian circuit finders; and quantum versus classical computing. Each chapter introduces and develops new material and ends with exercises for readers to put their skills into practice. Appendices are provided for non-experts who want a quick overview of brain anatomy, brain psychology, and brain scanning.
The nature of this book, with its summaries of major bodies of knowledge, makes it a most valuable reference for professionals, researchers, and students with career goals in artificial intelligence, intelligent systems, neural networks, computer architecture, and neuroscience. A solutions manual is available for instructors; to obtain a copy please email the editorial department at [email protected].
Much research focuses on the question of how information is processed in nervous systems, from the level of individual ionic channels to large-scale neuronal networks, and from "simple" animals such as sea slugs and flies to cats and primates. New interdisciplinary methodologies combine a bottom-up experimental methodology with the more top-down-driven computational and modeling approach. This book serves as a handbook of computational methods and techniques for modeling the functional properties of single and groups of nerve cells. The contributors highlight several key trends: (1) the tightening link between analytical/numerical models and the associated experimental data, (2) the broadening of modeling methods, at both the subcellular level and the level of large neuronal networks that incorporate real biophysical properties of neurons as well as the statistical properties of spike trains, and (3) the organization of the data gained by physical emulation of nervous system components through the use of very large-scale integration (VLSI) circuit technology. The field of neuroscience has grown dramatically since the first edition of this book was published nine years ago. Half of the chapters of the second edition are completely new; the remaining ones have all been thoroughly revised. Many chapters provide an opportunity for interactive tutorials and simulation programs. They can be accessed via Christof Koch's website. Contributors: Larry F. Abbott, Paul R. Adams, Hagai Agmon-Snir, James M. Bower, Robert E. Burke, Erik de Schutter, Alain Destexhe, Rodney Douglas, Bard Ermentrout, Fabrizio Gabbiani, David Hansel, Michael Hines, Christof Koch, Misha Mahowald, Zachary F. Mainen, Eve Marder, Michael V. Mascagni, Alexander D. Protopapas, Wilfrid Rall, John Rinzel, Idan Segev, Terrence J. Sejnowski, Shihab Shamma, Arthur S. Sherman, Paul Smolen, Haim Sompolinsky, Michael Vanier, Walter M. Yamada.
This textbook provides a thorough introduction to the field of learning from experimental data and soft computing. Support vector machines (SVM) and neural networks (NN) are the mathematical structures, or models, that underlie learning, while fuzzy logic systems (FLS) enable us to embed structured human knowledge into workable algorithms. The book assumes that it is not only useful, but necessary, to treat SVM, NN, and FLS as parts of a connected whole. Throughout, the theory and algorithms are illustrated by practical examples, as well as by problem sets and simulated experiments. This approach enables the reader to develop SVM, NN, and FLS in addition to understanding them. The book also presents three case studies: on NN-based control, financial time series analysis, and computer graphics. A solutions manual and all of the MATLAB programs needed for the simulated experiments are available.
Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation. This book presents the complete spectrum of current research in pulsed neural networks and includes the most important work from many of the key scientists in the field. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book. Contributors: Peter S. Burge, Stephen R. Deiss, Rodney J. Douglas, John G. Elias, Wulfram Gerstner, Alister Hamilton, David Horn, Axel Jahnke, Richard Kempter, Wolfgang Maass, Alessandro Mortara, Alan F. Murray, David P. M. Northmore, Irit Opher, Kostas A. Papathanasiou, Michael Recce, Barry J. P. Rising, Ulrich Roth, Tim Schoenauer, Terrence J. Sejnowski, John Shawe-Taylor, Max R. van Daalen, J. Leo van Hemmen, Philippe Venier, Hermann Wagner, Adrian M. Whatley, Anthony M. Zador
A highly readable, non-mathematical introduction to neural networks: computer models that help us to understand how we perceive, think, feel, and act. How does the brain work? How do billions of neurons bring about ideas, sensations, emotions, and actions? Why do children learn faster than elderly people? What can go wrong in perception, thinking, learning, and acting? Scientists now use computer models to help us to understand the most private and human experiences. In The Mind Within the Net, Manfred Spitzer shows how these models can fundamentally change how we think about learning, creativity, thinking, and acting, as well as such matters as schools, retirement homes, politics, and mental disorders. Neurophysiology has told us a lot about how neurons work; neural network theory is about how neurons work together to process information. In this highly readable book, Spitzer provides a basic, nonmathematical introduction to neural networks and their clinical applications. Part I explains the fundamental theory of neural networks and how neural network models work. Part II covers the principles of network functioning and how computer simulations of neural networks have profound consequences for our understanding of how the brain works. Part III covers applications of network models (e.g., to knowledge representation, language, and mental disorders such as schizophrenia and Alzheimer's disease) that shed new light on normal and abnormal states of mind. Finally, Spitzer concludes with his thoughts on the ramifications of neural networks for the understanding of neuropsychology and human nature.
Surprising tales from the scientists who first learned how to use computers to understand the workings of the human brain. Since World War II, a group of scientists has been attempting to understand the human nervous system and to build computer systems that emulate the brain's abilities. Many of the early workers in this field of neural networks came from cybernetics; others came from neuroscience, physics, electrical engineering, mathematics, psychology, even economics. In this collection of interviews, those who helped to shape the field share their childhood memories, their influences, how they became interested in neural networks, and what they see as its future. The subjects tell stories that have been told, referred to, whispered about, and imagined throughout the history of the field. Together, the interviews form a Rashomon-like web of reality. Some of the mythic people responsible for the foundations of modern brain theory and cybernetics, such as Norbert Wiener, Warren McCulloch, and Frank Rosenblatt, appear prominently in the recollections. The interviewees agree about some things and disagree about more. Together, they tell the story of how science is actually done, including the false starts, and the Darwinian struggle for jobs, resources, and reputation. Although some of the interviews contain technical material, there is no actual mathematics in the book. Contributors: James A. Anderson, Michael Arbib, Gail Carpenter, Leon Cooper, Jack Cowan, Walter Freeman, Stephen Grossberg, Robert Hecht-Nielsen, Geoffrey Hinton, Teuvo Kohonen, Bart Kosko, Jerome Lettvin, Carver Mead, David Rumelhart, Terry Sejnowski, Paul Werbos, Bernard Widrow
Since its founding in 1989 by Terrence Sejnowski, Neural Computation has become the leading journal in the field. Foundations of Neural Computation collects, by topic, the most significant papers that have appeared in the journal over the past nine years. This volume of Foundations of Neural Computation, on unsupervised learning algorithms, focuses on neural network learning algorithms that do not require an explicit teacher. The goal of unsupervised learning is to extract an efficient internal representation of the statistical structure implicit in the inputs. These algorithms provide insights into the development of the cerebral cortex and implicit learning in humans. They are also of interest to engineers working in areas such as computer vision and speech recognition who seek efficient representations of raw input data.
This book provides a comprehensive introduction to the computational material that forms the underpinnings of the currently evolving set of brain models. It is now clear that the brain is unlikely to be understood without recourse to computational theories. The theme of An Introduction to Natural Computation is that ideas from diverse areas such as neuroscience, information theory, and optimization theory have recently been extended in ways that make them useful for describing the brain's programs. The book stresses the broad spectrum of learning models, ranging from neural network learning through reinforcement learning to genetic learning, and situates the various models in their appropriate neural context. To write about models of the brain before the brain is fully understood is a delicate matter. Very detailed models of the neural circuitry risk losing track of the task the brain is trying to solve. At the other extreme, models that represent cognitive constructs can be so abstract that they lose all relationship to neurobiology. An Introduction to Natural Computation takes the middle ground and stresses the computational task while staying near the neurobiology.
This book is the companion volume to "Rethinking Innateness: A Connectionist Perspective on Development" (The MIT Press, 1996), which proposed a new theoretical framework to answer the question "What does it mean to say that a behavior is innate?" The new work provides concrete illustrations -- in the form of computer simulations -- of properties of connectionist models that are particularly relevant to cognitive development. This enables the reader to pursue in depth some of the practical and empirical issues raised in the first book. The authors' larger goal is to demonstrate the usefulness of neural network modeling as a research methodology. The book comes with a complete software package, including demonstration projects, for running neural network simulations on both Macintosh and Windows 95. It also contains a series of exercises in the use of the neural network simulator provided with the book. The software is also available to run on a variety of UNIX platforms.
This book is an outgrowth of the workshop on Neural Adaptive Control Technology, NACT I, held in 1995 in Glasgow. Selected workshop participants were asked to substantially expand and revise their contributions to make them into full papers. The workshop was organised in connection with a three-year European Union funded Basic Research Project in the ESPRIT framework, called NACT, a collaboration between Daimler-Benz (Germany) and the University of Glasgow (Scotland). A major aim of the NACT project is to develop a systematic engineering procedure for designing neural controllers for nonlinear dynamic systems. The techniques developed are being evaluated on concrete industrial problems from Daimler-Benz. In the book, emphasis is placed on the development of a sound theory of neural adaptive control for nonlinear control systems, firmly anchored in the engineering context of industrial practice. The contributors are therefore both renowned academics and practitioners from major industrial users of neurocontrol.
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. The author introduces the basic principles of pattern recognition and then goes on to describe techniques for modelling probability density functions, and discusses the properties and relative merits of the multi-layer perceptron and radial basis function network models. This book is designed with graduate students in mind and throughout the text it motivates the use of various forms of error functions and reviews the principal algorithms for error function minimization. Bishop also covers the fundamental topics of data processing, feature extraction, and prior knowledge and concludes with an extensive treatment of Bayesian techniques and their applications to neural networks.
Neural networks are an exciting technology of growing importance in real industrial situations, particularly in control and systems. This book aims to give a detailed appreciation of the use of neural nets in these applications; it is aimed particularly at those with a control or systems background who wish to gain an insight into the technology in the context of real applications. The book introduces a wide variety of network types, including Kohonen nets, n-tuple nets and radial basis function networks, as well as the more usual multi-layer perceptron back-propagation networks. It begins by describing the basic principles and some essential design features, then goes on to examine in depth several application studies illustrating a range of advanced approaches to the topic.
Stephen Grossberg and his colleagues at Boston University's Center for Adaptive Systems are producing some of the most exciting research in the neural network approach to making computers "think." Packed with real-time computer simulations and rigorous demonstrations of these phenomena, this book includes results on vision, speech, cognitive information processing, adaptive pattern recognition, adaptive robotics, conditioning and attention, cognitive-emotional interactions, and decision making under risk.
With Math for Deep Learning, you'll learn the essential mathematics used by, and as a background for, deep learning. You'll work through Python examples to learn key deep learning-related topics in probability, statistics, linear algebra, differential calculus, and matrix calculus, as well as how to implement data flow in a neural network, backpropagation, and gradient descent. You'll also use Python to work through the mathematics that underlies those algorithms and even build a fully functional neural network. In addition, you'll find coverage of gradient descent, including variations commonly used by the deep learning community: SGD, Adam, RMSprop, and Adagrad/Adadelta.
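As a taste of the optimizers the blurb mentions, here is a minimal sketch (not taken from the book) comparing plain gradient descent with an Adam-style update on a toy one-dimensional objective; the function, learning rates, and step counts are illustrative choices, not the book's.

```python
import math

# Toy objective f(w) = (w - 3)^2, whose minimum is at w = 3.
def grad(w):
    # derivative of (w - 3)^2
    return 2.0 * (w - 3.0)

def sgd(w, lr=0.1, steps=100):
    # plain gradient descent: step against the gradient
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w, lr=0.1, steps=200, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: running estimates of the gradient's first and second moments,
    # with bias correction, scale each step adaptively
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)  # bias-corrected first moment
        v_hat = v / (1 - b2 ** t)  # bias-corrected second moment
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

print(sgd(0.0))   # approaches the minimum at w = 3
print(adam(0.0))  # likewise approaches w = 3
```

RMSprop and Adagrad/Adadelta differ mainly in how the second-moment term `v` is accumulated; deep learning libraries apply the same update rules element-wise to whole weight tensors.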
This book provides new research on artificial neural networks (ANNs). Topics discussed include the application of ANNs in chemistry and chemical engineering fields; the application of ANNs in the prediction of biodiesel fuel properties from fatty acid constituents; the use of ANNs for solar radiation estimation; the use of in silico methods to design and evaluate skin UV filters; a practical model based on the multilayer perceptron neural network (MLP) approach to predict the milling tool flank wear in a regular cut, as well as entry cut and exit cut, of a milling tool; parameter extraction of small-signal and noise models of microwave transistors based on ANNs; and the application of ANNs to deep-learning and predictive analysis in semantic TCM telemedicine systems.
In this book, the authors present topical research in the study of the architectures and modelling in neurocomputing. Topics discussed include a brain-computer interface for analysing investment behaviour and market stability; neural-based image segmentation architecture with execution on a GPU; EEG montages for the diagnosis of Alzheimer's disease; design and training of neural architectures using extreme learning machines and the systematic comparison of single and multiple hidden-layer neural networks.
Past research has shown that swarm intelligence techniques and fuzzy logic are two useful tools for solving practical engineering problems. This book examines how each of these tools can be utilised for improving the performance of another. Also discussed herein is the capability of swarm intelligence optimisation techniques to obtain the optimal fuzzy systems parameters. The above-mentioned topics are followed by tackling practical problems in pattern recognition, multi-objective benchmarks, and space allocation. In each practical problem, the comparison results with other heuristic methods are provided. Also, a review on some of the past and ongoing research is presented.
Thermodynamic analysis of the refrigeration system is very complex because of the thermodynamic properties equations of working fluids, involving the solution of complex differential equations. This book provides an alternative simple approach based on artificial neural networks (ANNs) and determines the thermodynamic properties of refrigerants.
Human beings the world over are eager to form social bonds, and suffer grievously when these bonds are disrupted. Social connections contribute to our sense of meaning and feelings of vitality, on the one hand, and, at times, to our anguish and despair on the other. It is not surprising that the mechanisms underlying human connections have long interested researchers from diverse disciplines including social psychology, developmental psychology, communication studies, sociology, and neuroscience. Yet there is too little dialogue among these disciplines and too little integration of insights and findings. This fifth book in the Herzliya Series on Personality and Social Psychology aims to rectify that situation by providing a comprehensive survey of cutting-edge theory and research on social connections. The volume contains 21 chapters organised into four main sections: Brain (focusing on the neural underpinnings of social connections and the hormonal processes that contribute to forming connections); Infancy and Development (focusing especially on child-parent relationships); Dyadic Relationship (focusing especially on romantic and marital relationships); and Group (considering both evolutionary and physiological bases of group processes). The integrative perspectives presented here are thought-provoking reading for anyone interested in the social nature of the human mind.
Neural Networks, Second Edition provides a complete introduction to neural networks. It describes what they are, what they can do, and how they do it. While some scientific background is assumed, the reader is not expected to have any prior knowledge of neural networks. These networks are explained and discussed by means of examples, so that by the end of the book the reader will have a good overall knowledge of developments right up to current work in the field.
* Updated and expanded second edition
* Main networks covered are: feedforward networks such as the multi-layered perceptron; Boolean networks such as the WISARD; feedback networks such as the Hopfield network; statistical networks such as the Boltzmann machine and radial basis function networks; and self-organising networks such as Kohonen's self-organising maps. Other networks are referred to throughout the text for historical interest and alternative architectures.
* The applications discussed will appeal to student engineers and computer scientists interested in character recognition, intelligent control and threshold logic.
* The final chapter looks at ways of implementing a neural network, including electronic and optical systems.
This book is suitable for undergraduates from Computer Science and Electrical Engineering courses who are taking a one-module course on neural networks, and for researchers and computer science professionals who need a quick introduction to the subject. PHIL PICTON is Professor of Intelligent Computer Systems at University College Northampton. Prior to this he was a lecturer at the Open University where he contributed to distance learning courses on control engineering, electronics, mechatronics and artificial intelligence. His research interests include pattern recognition, intelligent control and logic design.