Showing 1 - 12 of 12 matches in All Departments

Kalman Filtering Under Information Theoretic Criteria (1st ed. 2023)
Badong Chen, Lujuan Dang, Nanning Zheng, Jose C. Principe
R3,272 Discovery Miles 32 720 Ships in 10 - 15 working days

This book provides several efficient Kalman filters (linear and nonlinear) derived under information theoretic criteria. They achieve excellent performance in complicated non-Gaussian noise with low computational complexity and have great practical application potential. The book combines all these perspectives and results in a single resource for students and practitioners in relevant application fields. Each chapter starts with a brief review of fundamentals, presents the material with a focus on the most important properties, and comparatively evaluates the models, discussing free parameters and their effect on the results. Proofs are provided at the end of each chapter. The book is geared to senior undergraduates with a basic understanding of linear algebra, signal processing and statistics, as well as graduate students or practitioners with experience in Kalman filtering.
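The information theoretic criteria this blurb refers to are commonly built on correntropy, a kernel-based similarity measure (the maximum correntropy Kalman filter is one well-known example from this literature). As a minimal illustrative sketch, not code from the book, a sample estimate of correntropy with a Gaussian kernel can look like:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[k_sigma(X - Y)] with a
    Gaussian kernel. Unlike squared error, large outliers contribute almost
    nothing, which is what makes it robust to non-Gaussian noise."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))

x = np.array([0.0, 1.0, 2.0, 3.0])
print(correntropy(x, x))        # identical signals give the maximum, 1.0
print(correntropy(x, x + 0.1))  # a small perturbation barely lowers it
```

Replacing the quadratic cost of the classical Kalman filter with a criterion like this is the general idea behind the filters the book develops.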

System Parameter Identification - Information Criteria and Algorithms (Hardcover, New)
Badong Chen, Yu Zhu, Jinchun Hu, Jose C. Principe
R2,730 Discovery Miles 27 300 Ships in 12 - 17 working days

Recently, criterion functions based on information theoretic measures (entropy, mutual information, information divergence) have attracted attention and become an emerging area of study in the signal processing and system identification domains. This book presents a systematic framework for system identification and information processing, investigating system identification from an information theory point of view. The book is divided into six chapters, which cover the information needed to understand the theory and application of system parameter identification. The authors' research provides a base for the book, but it also incorporates results from the latest international research publications.
  • Named a 2013 Notable Computer Book for Information Systems by Computing Reviews
  • One of the first books to present system parameter identification with information theoretic criteria, so readers can track the latest developments
  • Contains numerous illustrative examples to help the reader grasp basic methods
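One of the measures named above, mutual information, has a simple plug-in estimator that conveys the flavor of such criteria. The following is an illustrative sketch only (a generic binned estimator, not one of the book's algorithms):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of mutual information I(X; Y) in nats:
    bin the joint sample, form marginals, and sum p(x,y) log p(x,y)/(p(x)p(y))."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                       # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)    # marginal of X
    py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
    nz = pxy > 0                           # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
# a dependent pair carries far more mutual information than an independent one
print(mutual_information(x, x) > mutual_information(x, rng.normal(size=5000)))  # True
```

A criterion of this kind can, for instance, score candidate model parameters by how much information the model output shares with the observed output.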

Theory of Information and its Value (Hardcover, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,325 Discovery Miles 33 250 Ships in 12 - 17 working days

This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on theory and provides methods, techniques, and concepts toward utilizing critical applications. Unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning, artificial intelligence algorithms, and increased computational resources, the need for comprehending information is essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics and computer science who are specializing in information theory, data analysis, or machine learning.

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Hardcover, Edition.)
Jose C. Principe
R5,342 Discovery Miles 53 420 Ships in 12 - 17 working days

This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory with an emphasis on Renyi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL). In the course of these studies, we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.
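The information potential mentioned in this preface has a simple nonparametric estimator: the Parzen-window estimate of E[p(X)], a pairwise sum of Gaussian kernel evaluations, from which Renyi's quadratic entropy follows as a negative log. A minimal sketch under those standard ITL definitions (illustrative, not code from the book):

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential
    V(X) = E[p(X)]: the mean of Gaussian kernel evaluations over all
    sample pairs, using the convolved kernel bandwidth sigma*sqrt(2)."""
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]                 # all pairwise differences
    k = np.exp(-d**2 / (4.0 * sigma**2))        # Gaussian kernel, variance 2*sigma^2
    k /= np.sqrt(4.0 * np.pi * sigma**2)        # kernel normalization
    return float(k.mean())

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy H2(X) = -log V(X)."""
    return float(-np.log(information_potential(x, sigma)))

rng = np.random.default_rng(0)
tight = rng.normal(0.0, 0.1, 200)   # concentrated sample
wide = rng.normal(0.0, 2.0, 200)    # spread-out sample
# a concentrated sample has higher potential, hence lower entropy
print(information_potential(tight) > information_potential(wide))  # True
```

Minimizing such an entropy estimate of the error signal (rather than its variance) is the kind of nonparametric cost function the book builds adaptive systems around.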

Brain-Computer Interfaces - An international assessment of research and development trends (Hardcover, 2008 ed.)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,826 Discovery Miles 28 260 Ships in 10 - 15 working days

"We have come to know that our ability to survive and grow as a nation to a very large degree depends upon our scientific progress. Moreover, it is not enough simply to keep abreast of the rest of the world in scientific matters. We must maintain our leadership." President Harry Truman spoke those words in 1950, in the aftermath of World War II and in the midst of the Cold War. Indeed, the scientific and engineering leadership of the United States and its allies in the twentieth century played key roles in the successful outcomes of both World War II and the Cold War, sparing the world the twin horrors of fascism and totalitarian communism, and fueling the economic prosperity that followed. Today, as the United States and its allies once again find themselves at war, President Truman's words ring as true as they did a half-century ago. The goal set out in the Truman Administration of maintaining leadership in science has remained the policy of the U.S. Government to this day: Dr. John Marburger, the Director of the Office of Science and Technology Policy (OSTP) in the Executive Office of the President, made remarks to that effect during his confirmation hearings in October 2001. The United States needs metrics for measuring its success in meeting this goal of maintaining leadership in science and technology. That is one of the reasons that the National Science Foundation (NSF) and many other agencies of the U.S.

Theory of Information and its Value (Paperback, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,312 Discovery Miles 33 120 Ships in 10 - 15 working days

This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on theory and provides methods, techniques, and concepts toward utilizing critical applications. Unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning, artificial intelligence algorithms, and increased computational resources, the need for comprehending information is essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics and computer science who are specializing in information theory, data analysis, or machine learning.

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Paperback, 2010 ed.)
Jose C. Principe
R5,503 Discovery Miles 55 030 Ships in 10 - 15 working days

This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory with an emphasis on Renyi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL). In the course of these studies, we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.

Brain-Computer Interfaces - An international assessment of research and development trends (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,796 Discovery Miles 27 960 Ships in 10 - 15 working days

"We have come to know that our ability to survive and grow as a nation to a very large degree depends upon our scientific progress. Moreover, it is not enough simply to keep abreast of the rest of the world in scientific matters. We must maintain our leadership." President Harry Truman spoke those words in 1950, in the aftermath of World War II and in the midst of the Cold War. Indeed, the scientific and engineering leadership of the United States and its allies in the twentieth century played key roles in the successful outcomes of both World War II and the Cold War, sparing the world the twin horrors of fascism and totalitarian communism, and fueling the economic prosperity that followed. Today, as the United States and its allies once again find themselves at war, President Truman's words ring as true as they did a half-century ago. The goal set out in the Truman Administration of maintaining leadership in science has remained the policy of the U.S. Government to this day: Dr. John Marburger, the Director of the Office of Science and Technology Policy (OSTP) in the Executive Office of the President, made remarks to that effect during his confirmation hearings in October 2001. The United States needs metrics for measuring its success in meeting this goal of maintaining leadership in science and technology. That is one of the reasons that the National Science Foundation (NSF) and many other agencies of the U.S.

Brain-Machine Interface Engineering (Paperback)
Justin Sanchez, Jose C. Principe
R1,233 Discovery Miles 12 330 Ships in 10 - 15 working days

Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel directly linking the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems

Independent Component Analysis and Blind Signal Separation - 6th International Conference, ICA 2006, Charleston, SC, USA, March 5-8, 2006, Proceedings (Paperback, 2006 ed.)
Justinian Rosca, Deniz Erdogmus, Jose C. Principe, Simon Haykin
R2,997 Discovery Miles 29 970 Ships in 10 - 15 working days

This book constitutes the refereed proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation, ICA 2006, held in Charleston, SC, USA, in March 2006.

The 120 revised papers presented were carefully reviewed and selected from 183 submissions. The papers are organized in topical sections on algorithms and architectures, applications, medical applications, speech and signal processing, theory, and visual and sensory processing.

Adaptive Learning Methods for Nonlinear System Modeling (Paperback)
Danilo Comminiello, Jose C. Principe
R3,522 R3,168 Discovery Miles 31 680 Save R354 (10%) Ships in 12 - 17 working days

Adaptive Learning Methods for Nonlinear System Modeling presents some of the recent advances in adaptive algorithms and machine learning methods designed for nonlinear system modeling and identification. Real-life problems always entail a certain degree of nonlinearity, which makes linear models a non-optimal choice. This book mainly focuses on those methodologies for nonlinear modeling that involve adaptive learning approaches to process data coming from an unknown nonlinear system. By learning from available data, such methods aim to estimate the nonlinearity introduced by the unknown system. In particular, the methods presented in this book are based on online learning approaches, which process the data example-by-example and allow the modeling of even complex nonlinearities, e.g., those showing time-varying and dynamic behaviors. Possible fields of application of such algorithms include distributed sensor networks, wireless communications, channel identification, predictive maintenance, wind prediction, network security, vehicular networks, active noise control, information forensics and security, tracking control in mobile robots, power systems, and nonlinear modeling in big data, among many others. This book serves as a crucial resource for researchers, PhD and post-graduate students working in the areas of machine learning, signal processing, adaptive filtering, nonlinear control, system identification, cooperative systems, and computational intelligence. It may also be of interest to industry practitioners working with a wide variety of nonlinear systems.
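One widely cited online kernel method in this area is the kernel least-mean-square (KLMS) filter, which processes data example-by-example in exactly the sense the blurb describes: each incoming sample becomes a kernel centre and its weight is just the step size times the prediction error. A minimal hedged sketch (a generic toy implementation, not code from the book):

```python
import numpy as np

class KLMS:
    """Minimal kernel least-mean-square filter: an online learner for
    nonlinear modeling. Prediction is a weighted sum of Gaussian kernel
    evaluations over all stored centres."""
    def __init__(self, step=0.5, sigma=0.5):
        self.step, self.sigma = step, sigma
        self.centres, self.weights = [], []

    def predict(self, u):
        if not self.centres:
            return 0.0
        c = np.array(self.centres)
        k = np.exp(-np.sum((c - u) ** 2, axis=1) / (2.0 * self.sigma ** 2))
        return float(np.dot(self.weights, k))

    def update(self, u, d):
        e = d - self.predict(u)                  # a-priori error on the new sample
        self.centres.append(np.atleast_1d(np.asarray(u, dtype=float)))
        self.weights.append(self.step * e)       # new weight = step * error
        return e

# learn y = sin(x) from a stream of 300 noiseless samples
rng = np.random.default_rng(0)
f = KLMS()
for _ in range(300):
    x = rng.uniform(-3.0, 3.0)
    f.update([x], np.sin(x))
```

The growing set of centres is the known cost of this approach; the sparsification and regularization techniques covered by books like this one exist largely to keep that growth in check.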

Advances in Self-Organizing Maps - 9th International Workshop, WSOM 2012 Santiago, Chile, December 12-14, 2012 Proceedings (Paperback, 2013 ed.)
Pablo A. Estevez, Jose C. Principe, Pablo Zegers
R5,453 Discovery Miles 54 530 Ships in 10 - 15 working days

Self-organizing maps (SOMs) were developed by Teuvo Kohonen in the early eighties. Since then, more than 10,000 works have been based on SOMs. SOMs are unsupervised neural networks useful for clustering and visualization. Many SOM applications have been developed in engineering, science, and other fields. This book contains refereed papers presented at the 9th Workshop on Self-Organizing Maps (WSOM 2012), held at the Universidad de Chile, Santiago, Chile, on December 12-14, 2012. The workshop brought together researchers and practitioners in the field of self-organizing systems. Among the book chapters are excellent examples of the use of SOMs in agriculture, computer science, data visualization, health systems, economics, engineering, social sciences, text and image analysis, and time series analysis. Other chapters present the latest theoretical work on SOMs as well as Learning Vector Quantization (LVQ) methods.
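The core of Kohonen's algorithm fits in a few lines: find the best-matching unit for each sample, then pull that unit and its grid neighbours toward the sample with a shrinking neighbourhood. A minimal 1-D toy sketch of that idea (illustrative only, not code from the proceedings):

```python
import numpy as np

def train_som(data, grid=8, epochs=20, seed=0):
    """Minimal 1-D self-organizing map. For each sample: pick the
    best-matching unit (BMU), then move the BMU and its grid neighbours
    toward the sample, weighted by a Gaussian neighbourhood whose radius
    and learning rate both decay over training."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(data.min(), data.max(), grid)         # codebook weights
    idx = np.arange(grid)
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)                     # decaying learning rate
        radius = max(grid / 2 * (1.0 - t / epochs), 0.5)  # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.abs(w - x))                # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * radius ** 2))
            w += lr * h * (x - w)                         # neighbourhood update
    return np.sort(w)

rng = np.random.default_rng(1)
data = rng.uniform(-2.0, 2.0, 200)
w = train_som(data)
# after training, the 8 units spread out to quantize the interval [-2, 2]
```

The same recipe generalizes to 2-D grids and high-dimensional data, which is what makes SOMs useful for the clustering and visualization applications the workshop covers.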
