This is the first comprehensive book on information geometry, written by the founder of the field. It begins with an elementary introduction to dualistic geometry and proceeds to a wide range of applications covering information science, engineering, and neuroscience. It consists of four parts, which can largely be read independently. A manifold with a divergence function is introduced first, leading directly to the dualistic structure at the heart of information geometry; this part (Part I) can be understood without any knowledge of differential geometry. An intuitive explanation of modern differential geometry follows in Part II, although most of the book remains accessible without it. Part III presents, concisely, the information geometry of statistical inference, including time series analysis and semiparametric estimation (the Neyman-Scott problem). Applications addressed in Part IV include current topics in machine learning, signal processing, optimization, and neural networks. The book is interdisciplinary, connecting mathematics, information science, physics, and neuroscience, and inviting readers to a new world of information and geometry. It is highly recommended to graduate students and researchers seeking new mathematical methods and tools useful in their own fields.
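As a sketch of the construction this blurb alludes to (these are the standard formulas under one common sign convention, not text quoted from the book): a divergence function $D[\xi : \xi']$ on a manifold induces a Riemannian metric and a pair of affine connections via

$$g_{ij} = -\partial_i \partial'_j D[\xi:\xi']\Big|_{\xi'=\xi}, \qquad \Gamma_{ij,k} = -\partial_i \partial_j \partial'_k D[\xi:\xi']\Big|_{\xi'=\xi}, \qquad \Gamma^{*}_{ij,k} = -\partial'_i \partial'_j \partial_k D[\xi:\xi']\Big|_{\xi'=\xi},$$

where $\partial_i$ differentiates the first argument and $\partial'_j$ the second. The two connections are dual with respect to $g$, in the sense that $\partial_k g_{ij} = \Gamma_{ki,j} + \Gamma^{*}_{kj,i}$; this pair $(g, \Gamma, \Gamma^{*})$ is the dualistic structure described above.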
From the reviews: "In this Lecture Note volume the author describes his differential-geometric approach to parametric statistical problems, summarizing the results he had published in a series of papers over the preceding five years. The author provides a geometric framework for a special class of test and estimation procedures for curved exponential families. ... The material and ideas presented in this volume are important, and it is recommended to everybody interested in the connection between statistics and geometry ..." (Metrika) "More than a hundred references are given, showing the growing interest in differential geometry with respect to statistics. The book can only be strongly recommended to a geodesist, since it offers many new insights into statistics on familiar ground." (Manuscripta Geodaetica)
This is an exciting time. The study of neural networks is enjoying a great renaissance, both in computational neuroscience - the development of information-processing models of living brains - and in neural computing - the use of neurally inspired concepts in the construction of "intelligent" machines. Thus the title of this volume, Dynamic Interactions in Neural Networks: Models and Data, can be given two interpretations. We present models and data on the dynamic interactions occurring in the brain, and we also exhibit the dynamic interactions between research in computational neuroscience and in neural computing, as scientists seek common principles that may guide us in understanding our own brains and in designing artificial neural networks. In fact, the book title has yet a third interpretation: it is based on the U.S.-Japan Seminar on "Competition and Cooperation in Neural Nets," which we organized at the University of Southern California, Los Angeles, May 18-22, 1987, and is thus a record of the interaction of scientists on both sides of the Pacific in advancing the frontiers of this dynamic, reborn field. The book focuses on three major aspects of neural network function: learning, perception, and action. More specifically, the chapters are grouped under three headings: "Development and Learning in Adaptive Networks," "Visual Function," and "Motor Control and the Cerebellum."
The human brain, with its hundred billion or more neurons, is both one of the most complex systems known to man and one of the most important. The last decade has seen an explosion of experimental research on the brain, but little theory of neural networks beyond the study of electrical properties of membranes and small neural circuits. Nonetheless, a number of workers in Japan, the United States, and elsewhere have begun to contribute to a theory that provides techniques of mathematical analysis and computer simulation for exploring the properties of neural systems containing immense numbers of neurons. Recently, it has gradually been recognized that rather independent studies of the dynamics of pattern recognition, pattern formation, motor control, self-organization, etc., in neural systems do in fact make use of common methods. We find that a "competition and cooperation" type of interaction plays a fundamental role in parallel information processing in the brain. The present volume brings together 23 papers presented at a U.S.-Japan Joint Seminar on "Competition and Cooperation in Neural Nets," which was designed to catalyze better integration of theory and experiment in these areas. It was held in Kyoto, Japan, February 15-19, 1982, under the joint sponsorship of the U.S. National Science Foundation and the Japan Society for the Promotion of Science. Participants included brain theorists, neurophysiologists, mathematicians, computer scientists, and physicists. There are seven papers from the U.S.
Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential-geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the $\alpha$-connections. The duality between the $\alpha$-connection and the $(-\alpha)$-connection, together with the metric, plays an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems that may have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems from a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundations of information geometry, including preliminaries from differential geometry, the geometry of manifolds of probability distributions, and the general theory of dual affine connections. The second half of the text provides an overview of many areas of application, such as statistics, linear systems, information theory, quantum mechanics, convex analysis, neural networks, and affine differential geometry. The book can serve as a suitable text for a topics course for advanced undergraduates and graduate students.
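For readers unfamiliar with the objects named in this blurb, the standard definitions can be sketched as follows (these are the usual textbook formulas, not an excerpt): for a parametric family $p(x;\theta)$ with log-likelihood $\ell(x;\theta) = \log p(x;\theta)$, the Fisher metric and the $\alpha$-connections are

$$g_{ij}(\theta) = \mathrm{E}_\theta\!\left[\partial_i \ell\, \partial_j \ell\right], \qquad \Gamma^{(\alpha)}_{ij,k}(\theta) = \mathrm{E}_\theta\!\left[\left(\partial_i \partial_j \ell + \frac{1-\alpha}{2}\, \partial_i \ell\, \partial_j \ell\right) \partial_k \ell\right],$$

and the duality mentioned above is the relation $\partial_k g_{ij} = \Gamma^{(\alpha)}_{ki,j} + \Gamma^{(-\alpha)}_{kj,i}$, i.e. the $\alpha$- and $(-\alpha)$-connections are conjugate with respect to the Fisher metric.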