Showing 1 - 11 of 11 matches in All Departments

Kalman Filtering Under Information Theoretic Criteria (1st ed. 2023)
Badong Chen, Lujuan Dang, Nanning Zheng, Jose C. Principe
R3,369 Discovery Miles 33 690 Ships in 10 - 15 working days

This book presents several efficient Kalman filters (linear and nonlinear) derived under information theoretic criteria. These filters achieve excellent performance in complicated non-Gaussian noise with low computational complexity, and they have great potential for practical application. The book gathers these perspectives and results in a single resource for students and practitioners in the relevant application fields. Each chapter starts with a brief review of fundamentals, presents the material with a focus on the most important properties, and comparatively evaluates the models, discussing free parameters and their effect on the results. Proofs are provided at the end of each chapter. The book is geared to senior undergraduates with a basic understanding of linear algebra, signal processing, and statistics, as well as graduate students and practitioners with experience in Kalman filtering.
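The filters described here extend the classical Kalman recursion, which is worth fixing as a baseline. Below is a minimal one-dimensional linear Kalman filter with a random-walk state model; this is the standard textbook filter, not one of the book's information theoretic variants, and all parameter values are illustrative:

```python
import numpy as np

def kalman_1d(zs, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter for a random-walk state.

    Predict:  x_pred = x,              p_pred = p + q
    Update:   k = p_pred / (p_pred + r)
              x = x_pred + k * (z - x_pred)
              p = (1 - k) * p_pred
    """
    x, p = x0, p0
    estimates = []
    for z in zs:
        p_pred = p + q                 # predict: state is a random walk
        k = p_pred / (p_pred + r)      # Kalman gain
        x = x + k * (z - x)            # correct with the new measurement
        p = (1.0 - k) * p_pred
        estimates.append(x)
    return np.array(estimates)

# track a constant signal through Gaussian measurement noise
rng = np.random.default_rng(0)
truth = 5.0
zs = truth + rng.normal(0.0, 0.7, size=200)
est = kalman_1d(zs)
```

Under heavy-tailed, non-Gaussian noise this quadratic-loss update degrades badly, which is exactly the gap the book's information theoretic criteria are designed to address.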

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Hardcover)
Jose C. Principe
R5,732 Discovery Miles 57 320 Ships in 12 - 19 working days

This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory with an emphasis on Renyi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL). In the course of these studies, we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.
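The core quantity named in this description, the information potential (the expected value of the PDF, i.e., the α-norm with α = 2), admits a simple Parzen-window estimator: average a Gaussian kernel over all pairwise sample differences. A minimal sketch; the kernel width and function names are our own illustrative choices, not from the book:

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen estimate of V(X) = E[p(X)], the quadratic information
    potential: the mean Gaussian kernel over all pairwise differences.
    Convolving two width-sigma kernels doubles the variance, hence the
    4*sigma**2 in the exponent."""
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]                       # all pairwise differences
    g = np.exp(-d ** 2 / (4 * sigma ** 2)) / np.sqrt(4 * np.pi * sigma ** 2)
    return g.mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy H2(X) = -log V(X)."""
    return -np.log(information_potential(x, sigma))

rng = np.random.default_rng(1)
tight = rng.normal(0.0, 0.5, 300)   # concentrated sample
wide = rng.normal(0.0, 3.0, 300)    # spread-out sample
```

A concentrated sample yields a larger information potential, and hence a smaller quadratic entropy, than a spread-out one, which is what makes these descriptors usable as adaptation costs.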

Brain-Computer Interfaces - An international assessment of research and development trends (Hardcover, 2008 ed.)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,911 Discovery Miles 29 110 Ships in 10 - 15 working days

"We have come to know that our ability to survive and grow as a nation to a very large degree depends upon our scientific progress. Moreover, it is not enough simply to keep abreast of the rest of the world in scientific matters. We must maintain our leadership." President Harry Truman spoke those words in 1950, in the aftermath of World War II and in the midst of the Cold War. Indeed, the scientific and engineering leadership of the United States and its allies in the twentieth century played key roles in the successful outcomes of both World War II and the Cold War, sparing the world the twin horrors of fascism and totalitarian communism, and fueling the economic prosperity that followed. Today, as the United States and its allies once again find themselves at war, President Truman's words ring as true as they did a half-century ago. The goal set out in the Truman Administration of maintaining leadership in science has remained the policy of the U.S. Government to this day: Dr. John Marburger, the Director of the Office of Science and Technology Policy (OSTP) in the Executive Office of the President, made remarks to that effect during his confirmation hearings in October 2001. The United States needs metrics for measuring its success in meeting this goal of maintaining leadership in science and technology. That is one of the reasons that the National Science Foundation (NSF) and many other agencies of the U.S.

Theory of Information and its Value (Paperback, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,409 Discovery Miles 34 090 Ships in 10 - 15 working days

This English version of Ruslan L. Stratonovich's Theory of Information (1975) builds on the original theory and provides methods, techniques, and concepts for critical applications. By unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning and artificial intelligence algorithms, and increased computational resources, comprehending information has become essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and of filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances in detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics, and computer science who are specializing in information theory, data analysis, or machine learning.

Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives (Paperback, 2010 ed.)
Jose C. Principe
R5,653 Discovery Miles 56 530 Ships in 10 - 15 working days

Brain-Computer Interfaces - An international assessment of research and development trends (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, …
R2,881 Discovery Miles 28 810 Ships in 10 - 15 working days

Independent Component Analysis and Blind Signal Separation - 6th International Conference, ICA 2006, Charleston, SC, USA, March 5-8, 2006, Proceedings (Paperback, 2006 ed.)
Justinian Rosca, Deniz Erdogmus, Jose C. Principe, Simon Haykin
R3,082 Discovery Miles 30 820 Ships in 10 - 15 working days

This book constitutes the refereed proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation, ICA 2006, held in Charleston, SC, USA, in March 2006.

The 120 revised papers presented were carefully reviewed and selected from 183 submissions. The papers are organized in topical sections on algorithms and architectures, applications, medical applications, speech and signal processing, theory, and visual and sensory processing.

Theory of Information and its Value (Hardcover, 1st ed. 2020)
Roman V. Belavkin; Ruslan L. Stratonovich; Edited by Panos M. Pardalos, Jose C. Principe
R3,227 R3,059 Discovery Miles 30 590 Save R168 (5%) Ships in 9 - 17 working days

Advances in Self-Organizing Maps - 9th International Workshop, WSOM 2012, Santiago, Chile, December 12-14, 2012, Proceedings (Paperback, 2013 ed.)
Pablo A. Estevez, Jose C. Principe, Pablo Zegers
R5,602 Discovery Miles 56 020 Ships in 10 - 15 working days

Self-organizing maps (SOMs) were developed by Teuvo Kohonen in the early eighties. Since then, more than 10,000 works have been based on SOMs. SOMs are unsupervised neural networks useful for clustering and visualization. Many SOM applications have been developed in engineering, science, and other fields. This book contains refereed papers presented at the 9th Workshop on Self-Organizing Maps (WSOM 2012), held at the Universidad de Chile, Santiago, Chile, on December 12-14, 2012. The workshop brought together researchers and practitioners in the field of self-organizing systems. Among the book's chapters are excellent examples of the use of SOMs in agriculture, computer science, data visualization, health systems, economics, engineering, social sciences, text and image analysis, and time series analysis. Other chapters present the latest theoretical work on SOMs as well as on Learning Vector Quantization (LVQ) methods.
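The clustering-and-visualization behaviour described above rests on one update rule: find the best-matching unit (BMU) for each sample and pull it, together with its grid neighbours, toward the sample. A minimal one-dimensional sketch; the learning-rate and neighbourhood schedules are illustrative assumptions, not values from the proceedings:

```python
import numpy as np

def train_som(data, n_units=10, epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Train a 1-D self-organizing map: for each sample, find the
    best-matching unit and move it and its grid neighbours toward the
    sample, with learning rate and neighbourhood width decaying over time."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(data.min(), data.max(), n_units)   # codebook weights
    grid = np.arange(n_units)
    n_steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)                    # decaying step size
            sigma = sigma0 * (1.0 - frac) + 0.5        # shrinking neighbourhood
            bmu = int(np.argmin(np.abs(w - x)))        # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2.0 * sigma ** 2))
            w += lr * h * (x - w)                      # cooperative update
            t += 1
    return np.sort(w)

# two well-separated clusters; the trained codebook should cover both
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-5, 0.3, 100), rng.normal(5, 0.3, 100)])
codebook = train_som(data)
```

After training, codebook units concentrate near the two cluster centres, which is the quantization-plus-topology property the SOM literature builds on.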

Brain-Machine Interface Engineering (Paperback)
Justin Sanchez, Jose C. Principe
R1,280 Discovery Miles 12 800 Ships in 10 - 15 working days

Neural interfaces are among the most exciting emerging technologies in bioengineering and neuroscience because they enable an alternate communication channel that directly links the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that led to the decoding of information from neural interfaces, the book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems

Adaptive Learning Methods for Nonlinear System Modeling (Paperback)
Danilo Comminiello, Jose C. Principe
R3,651 R2,207 Discovery Miles 22 070 Save R1,444 (40%) Ships in 9 - 17 working days

Adaptive Learning Methods for Nonlinear System Modeling presents some of the recent advances in adaptive algorithms and machine learning methods designed for nonlinear system modeling and identification. Real-life problems always entail a certain degree of nonlinearity, which makes linear models a suboptimal choice. This book focuses on methodologies for nonlinear modeling that use adaptive learning to process data coming from an unknown nonlinear system. By learning from available data, such methods estimate the nonlinearity introduced by the unknown system. In particular, the methods presented in this book are based on online learning approaches, which process the data example by example and can model even complex nonlinearities, e.g., time-varying and dynamic behaviors. Possible fields of application of such algorithms include distributed sensor networks, wireless communications, channel identification, predictive maintenance, wind prediction, network security, vehicular networks, active noise control, information forensics and security, tracking control in mobile robots, power systems, and nonlinear modeling in big data, among many others. This book serves as a crucial resource for researchers, PhD and postgraduate students working in machine learning, signal processing, adaptive filtering, nonlinear control, system identification, cooperative systems, and computational intelligence. It may also be of interest to practitioners in industry working with a wide variety of nonlinear systems.
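Among the online kernel methods in this family, kernel least-mean-square (KLMS) is one of the simplest: each incoming sample becomes a kernel center weighted by its prediction error. A minimal sketch, with the step size and kernel width as illustrative assumptions rather than values from the book:

```python
import numpy as np

def klms(inputs, desired, eta=0.5, sigma=1.0):
    """Kernel least-mean-square: grow a Gaussian-kernel expansion online,
    adding one center per sample, weighted by eta times its a-priori error."""
    centers, alphas, errors = [], [], []
    for u, d in zip(inputs, desired):
        if centers:
            k = np.exp(-np.sum((np.array(centers) - u) ** 2, axis=1)
                       / (2.0 * sigma ** 2))
            y = float(np.dot(alphas, k))   # prediction from current expansion
        else:
            y = 0.0
        e = d - y                          # a-priori prediction error
        centers.append(u)                  # grow the expansion by one center
        alphas.append(eta * e)
        errors.append(e)
    return centers, np.array(alphas), np.array(errors)

# identify a static nonlinearity d = sin(u) from streaming samples
rng = np.random.default_rng(0)
u = rng.uniform(-3, 3, size=(400, 1))
d = np.sin(u[:, 0])
_, _, err = klms(u, d)
```

The prediction error shrinks as the expansion grows, illustrating the example-by-example learning the book describes; practical variants bound the expansion's growth with sparsification criteria.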
