
Showing 1 - 12 of 12 matches in All Departments

Kalman Filtering Under Information Theoretic Criteria (1st ed. 2023)
Badong Chen, Lujuan Dang, Nanning Zheng, Jose C. Principe
R3,470 Discovery Miles 34 700 Ships in 10 - 15 working days

This book provides several efficient Kalman filters (linear or nonlinear) derived under information theoretic criteria. They achieve excellent performance in complicated non-Gaussian noise with low computational complexity and have great potential for practical application. The book combines all these perspectives and results in a single resource for students and practitioners in the relevant application fields. Each chapter starts with a brief review of fundamentals, presents the material with a focus on the most important properties, and comparatively evaluates the models, discussing free parameters and their effect on the results. Proofs are provided at the end of each chapter. The book is geared to senior undergraduates with a basic understanding of linear algebra, signal processing, and statistics, as well as graduate students or practitioners with experience in Kalman filtering.
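For reference, the information-theoretic filters described above modify the classical linear Kalman filter's update step. Below is a minimal NumPy sketch of that classical baseline (the constant-velocity tracking model, noise covariances, and measurements are illustrative assumptions, not examples from the book):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the standard linear Kalman filter."""
    # Predict: propagate state and covariance through the dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement z.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy problem: track 1-D position/velocity from noisy position readings.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity model
H = np.array([[1.0, 0.0]])                   # observe position only
Q = 0.01 * np.eye(2)                         # process noise
R = np.array([[0.5]])                        # measurement noise

x, P = np.zeros(2), np.eye(2)
for z in [1.1, 2.0, 2.9, 4.2, 5.1]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```

The book's filters replace the implicit minimum-mean-square-error criterion of this update with information-theoretic cost functions, which is what yields robustness to non-Gaussian noise.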

Theory of Information and its Value (Paperback, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,514 Discovery Miles 35 140 Ships in 10 - 15 working days

This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on theory and provides methods, techniques, and concepts toward utilizing critical applications. Unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning, artificial intelligence algorithms, and increased computational resources, the need for comprehending information is essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics and computer science who are specializing in information theory, data analysis, or machine learning.

Adaptive Learning Methods for Nonlinear System Modeling (Paperback)
Danilo Comminiello, Jose C. Principe
R3,665 R3,376 Discovery Miles 33 760 Save R289 (8%) Ships in 12 - 17 working days

Adaptive Learning Methods for Nonlinear System Modeling presents some of the recent advances in adaptive algorithms and machine learning methods designed for nonlinear system modeling and identification. Real-life problems always entail a certain degree of nonlinearity, which makes linear models a suboptimal choice. This book focuses on methodologies for nonlinear modeling that use adaptive learning approaches to process data coming from an unknown nonlinear system. By learning from available data, such methods aim to estimate the nonlinearity introduced by the unknown system. In particular, the methods presented in this book are based on online learning approaches, which process the data example by example and can model even complex nonlinearities, e.g., time-varying and dynamic behaviors. Possible fields of application of such algorithms include distributed sensor networks, wireless communications, channel identification, predictive maintenance, wind prediction, network security, vehicular networks, active noise control, information forensics and security, tracking control in mobile robots, power systems, and nonlinear modeling in big data, among many others. This book serves as a crucial resource for researchers, PhD and postgraduate students working in the areas of machine learning, signal processing, adaptive filtering, nonlinear control, system identification, cooperative systems, and computational intelligence. This book may also be of interest to industry practitioners working with a wide variety of nonlinear systems.
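As one concrete illustration of the example-by-example (online) adaptation the description refers to, here is a sketch of the classic least-mean-squares (LMS) algorithm identifying an unknown, mildly nonlinear system. The system, filter length, and step size are made-up values for illustration, not examples from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown system to identify: a short FIR filter followed by a mild nonlinearity.
def unknown_system(u):
    lin = 0.6 * u[0] - 0.3 * u[1] + 0.1 * u[2]
    return np.tanh(lin)

taps = 3
w = np.zeros(taps)         # adaptive filter weights
mu = 0.1                   # LMS step size
u_hist = np.zeros(taps)    # most recent inputs, newest first
errors = []

for _ in range(2000):
    u = rng.uniform(-1.0, 1.0)
    u_hist = np.roll(u_hist, 1)
    u_hist[0] = u
    d = unknown_system(u_hist)     # desired (system) output
    y = w @ u_hist                 # current model output
    e = d - y                      # instantaneous error
    w += mu * e * u_hist           # LMS weight update, one example at a time
    errors.append(e * e)
```

The squared error shrinks as the weights adapt; the book's methods go further, handling nonlinearities that a linear adaptive filter like this one cannot capture.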

Theory of Information and its Value (Hardcover, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,546 Discovery Miles 35 460 Ships in 10 - 15 working days


Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Paperback, 2010 ed.)
Jose C. Principe
R5,839 Discovery Miles 58 390 Ships in 10 - 15 working days

This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory with an emphasis on Renyi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL). In the course of these studies, we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.
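The information potential mentioned in the description (the expected value of the PDF, the α = 2 case) has a standard nonparametric estimator built from pairwise Gaussian kernel evaluations. The sketch below is a minimal Parzen-window version; the kernel width and the test samples are arbitrary choices for illustration:

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen estimate of V = E[p(X)] for 1-D samples.

    By the Gaussian convolution property, averaging kernels of width
    sqrt(2)*sigma over all sample pairs estimates the integral of p(x)^2.
    """
    x = np.asarray(x, dtype=float)
    diffs = x[:, None] - x[None, :]              # all pairwise differences
    s2 = 2.0 * sigma ** 2                        # variance of the wider kernel
    kernels = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return kernels.mean()                        # (1/N^2) * sum over pairs

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy: H2 = -log(V)."""
    return -np.log(information_potential(x, sigma))

rng = np.random.default_rng(0)
tight = rng.normal(0.0, 0.5, 500)    # concentrated sample
spread = rng.normal(0.0, 3.0, 500)   # dispersed sample
# A concentrated PDF yields a larger information potential,
# hence a smaller quadratic entropy.
```

In ITL these quantities replace mean and covariance as the statistics driving adaptation, which is the shift in viewpoint the description highlights.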

Brain-Computer Interfaces - An international assessment of research and development trends (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,966 Discovery Miles 29 660 Ships in 10 - 15 working days

"We have come to know that our ability to survive and grow as a nation to a very large degree depends upon our scientific progress. Moreover, it is not enough simply to keep abreast of the rest of the world in scientific matters. We must maintain our leadership." President Harry Truman spoke those words in 1950, in the aftermath of World War II and in the midst of the Cold War. Indeed, the scientific and engineering leadership of the United States and its allies in the twentieth century played key roles in the successful outcomes of both World War II and the Cold War, sparing the world the twin horrors of fascism and totalitarian communism, and fueling the economic prosperity that followed. Today, as the United States and its allies once again find themselves at war, President Truman's words ring as true as they did a half-century ago. The goal set out in the Truman Administration of maintaining leadership in science has remained the policy of the U.S. Government to this day: Dr. John Marburger, the Director of the Office of Science and Technology Policy (OSTP) in the Executive Office of the President, made remarks to that effect during his confirmation hearings in October 2001. The United States needs metrics for measuring its success in meeting this goal of maintaining leadership in science and technology. That is one of the reasons that the National Science Foundation (NSF) and many other agencies of the U.S.

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Hardcover)
Jose C. Principe
R6,105 Discovery Miles 61 050 Ships in 10 - 15 working days


Brain-Computer Interfaces - An international assessment of research and development trends (Hardcover, 2008 ed.)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,997 Discovery Miles 29 970 Ships in 10 - 15 working days


Brain-Machine Interface Engineering (Paperback)
Justin Sanchez, Jose C. Principe
R1,306 Discovery Miles 13 060 Ships in 10 - 15 working days

Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel directly linking the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems

Independent Component Analysis and Blind Signal Separation - 6th International Conference, ICA 2006, Charleston, SC, USA, March 5-8, 2006, Proceedings (Paperback, 2006 ed.)
Justinian Rosca, Deniz Erdogmus, Jose C. Principe, Simon Haykin
R3,182 Discovery Miles 31 820 Ships in 10 - 15 working days

This book constitutes the refereed proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation, ICA 2006, held in Charleston, SC, USA, in March 2006.

The 120 revised papers presented were carefully reviewed and selected from 183 submissions. The papers are organized in topical sections on algorithms and architectures, applications, medical applications, speech and signal processing, theory, and visual and sensory processing.

Advances in Self-Organizing Maps - 9th International Workshop, WSOM 2012 Santiago, Chile, December 12-14, 2012 Proceedings (Paperback, 2013 ed.)
Pablo A. Estevez, Jose C. Principe, Pablo Zegers
R5,785 Discovery Miles 57 850 Ships in 10 - 15 working days

Self-organizing maps (SOMs) were developed by Teuvo Kohonen in the early eighties. Since then more than 10,000 works have been based on SOMs. SOMs are unsupervised neural networks useful for clustering and visualization purposes. Many SOM applications have been developed in engineering, science, and other fields. This book contains refereed papers presented at the 9th Workshop on Self-Organizing Maps (WSOM 2012) held at the Universidad de Chile, Santiago, Chile, on December 12-14, 2012. The workshop brought together researchers and practitioners in the field of self-organizing systems. Among the book chapters there are excellent examples of the use of SOMs in agriculture, computer science, data visualization, health systems, economics, engineering, social sciences, text and image analysis, and time series analysis. Other chapters present the latest theoretical work on SOMs as well as Learning Vector Quantization (LVQ) methods.
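The SOM training loop the description alludes to (find the best-matching unit, then pull it and its grid neighbors toward each sample) can be sketched in a few lines of NumPy. Grid size, learning-rate and neighborhood schedules, and the toy data are arbitrary choices for illustration, not taken from the workshop papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points drawn around three cluster centers.
centers = [(0.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
data = np.vstack([rng.normal(c, 0.2, (100, 2)) for c in centers])

# A small 5x5 SOM: each grid node holds a 2-D weight vector.
grid = 5
weights = rng.uniform(data.min(), data.max(), (grid, grid, 2))
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid),
                              indexing="ij"), axis=-1)

epochs = 20
for t in range(epochs):
    lr = 0.5 * (1.0 - t / epochs)                        # decaying learning rate
    radius = max(grid / 2 * (1.0 - t / epochs), 0.5)     # shrinking neighborhood
    for x in rng.permutation(data):
        # Best-matching unit: grid node whose weight is closest to x.
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(d.argmin(), d.shape)
        # Gaussian neighborhood on the grid pulls nearby nodes toward x.
        g = np.exp(-np.linalg.norm(coords - np.array(bmu), axis=-1) ** 2
                   / (2.0 * radius ** 2))
        weights += lr * g[..., None] * (x - weights)

# After training, nodes quantize the data; neighboring nodes map nearby inputs,
# which is what makes SOMs useful for visualization.
quant_err = np.mean([np.linalg.norm(weights - x, axis=-1).min() for x in data])
```

The shrinking neighborhood is the key difference from plain vector quantization: it imposes the topology-preserving grid ordering that SOM visualizations rely on.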

System Parameter Identification - Information Criteria and Algorithms (Hardcover, New)
Badong Chen, Yu Zhu, Jinchun Hu, Jose C. Principe
R3,567 Discovery Miles 35 670 Ships in 10 - 15 working days

Recently, criterion functions based on information theoretic measures (entropy, mutual information, information divergence) have attracted attention and become an emerging area of study in the signal processing and system identification domains. This book presents a systematic framework for system identification and information processing, investigating system identification from an information theory point of view. The book is divided into six chapters, which cover the information needed to understand the theory and application of system parameter identification. The authors' research provides a base for the book, but it incorporates the results from the latest international research publications.
  • Named a 2013 Notable Computer Book for Information Systems by Computing Reviews
  • One of the first books to present system parameter identification with information theoretic criteria so readers can track the latest developments
  • Contains numerous illustrative examples to help the reader grasp basic methods
