Showing 1 - 2 of 2 matches in All Departments

Minimum Divergence Methods in Statistical Machine Learning - From an Information Geometric Viewpoint (Hardcover, 1st ed. 2022)
Shinto Eguchi, Osamu Komori
R3,521 Discovery Miles 35 210 Ships in 10 - 15 working days

This book explores minimum divergence methods in statistical machine learning for estimation, regression, prediction, and related tasks, using information geometry to elucidate the intrinsic properties of the corresponding loss functions, learning algorithms, and statistical models. One of the most elementary examples is Gauss's least squares estimator in a linear regression model, where the estimator is obtained by minimizing the sum of squared deviations between a response vector and its projection onto the linear subspace spanned by the explanatory vectors. This extends to Fisher's maximum likelihood estimator (MLE) for an exponential model, where the estimator is obtained by minimizing an empirical analogue of the Kullback-Leibler (KL) divergence between the data distribution and a parametric distribution in the exponential model. Both minimization procedures thus admit a geometric interpretation in which a right triangle satisfies a Pythagorean identity in the sense of the KL divergence. This understanding reveals a dualistic interplay between statistical estimation and the statistical model, mediated by dual geodesic paths, the m-geodesic and the e-geodesic, in the framework of information geometry. The book extends this dualistic structure of the MLE and the exponential model to that of the minimum divergence estimator and the maximum entropy model, with applications to robust statistics, maximum entropy, density estimation, principal component analysis, independent component analysis, regression analysis, manifold learning, boosting algorithms, clustering, dynamic treatment regimes, and so forth. A variety of information divergence measures, typified by the KL divergence, are considered to express the departure of one probability distribution from another.
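The link described above between minimizing the empirical KL divergence, maximum likelihood, and least squares can be checked numerically. The sketch below (not taken from the book; the variable names and grid search are illustrative assumptions) minimizes the average negative log-likelihood of a Gaussian location model, which differs from the empirical KL divergence only by a constant, and compares the minimizer with the sample mean:

```python
import numpy as np

# Minimal numerical check: for the model N(mu, 1), the empirical analogue
# of the KL divergence equals the average negative log-likelihood up to a
# constant, so its minimizer is the MLE, which here is also the
# least-squares answer, the sample mean.
rng = np.random.default_rng(42)
x = rng.normal(2.0, 1.0, 500)

mus = np.linspace(0.0, 4.0, 4001)                  # candidate locations
nll = [np.mean(0.5 * (x - m) ** 2) for m in mus]   # NLL up to constants
mu_hat = mus[int(np.argmin(nll))]

print(mu_hat, x.mean())  # the two estimates agree up to grid resolution
```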
An information divergence decomposes into the cross-entropy and the (diagonal) entropy. The entropy is associated with a generative model, a family of maximum entropy distributions; the cross-entropy is associated with a statistical estimation method via minimization of its empirical analogue based on given data. Thus any statistical divergence encodes an intrinsic pairing of a generative model and an estimation method; typically, the KL divergence leads to the exponential model and maximum likelihood estimation. It is shown that any information divergence induces a Riemannian metric and a pair of affine connections in the framework of information geometry. The book focuses on a class of information divergences generated by an increasing and convex function U, called the U-divergence. Any generator function U yields a U-entropy and a U-divergence, with a dualistic structure between the minimum U-divergence method and the maximum U-entropy model. A specific choice of U leads to a robust statistical procedure via the minimum U-divergence method. If U is the exponential function, the corresponding U-entropy and U-divergence reduce to the Boltzmann-Shannon entropy and the KL divergence, and the minimum U-divergence estimator is equivalent to the MLE. For robust supervised learning to predict a class label, the U-boosting algorithm performs well under contamination by mislabeled examples when U is appropriately selected. The book presents such maximum U-entropy and minimum U-divergence methods, in particular selecting a power function as U to provide flexible performance in statistical machine learning.
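The robustness claim for power-type generators can be illustrated with one standard member of this divergence family, the density power (beta-) divergence. The sketch below is a minimal assumption-laden example, not the book's estimator: for a Gaussian location model the minimum power-divergence estimate satisfies a weighted-mean fixed-point equation, and as beta approaches 0 it reduces to the MLE. The function name and contamination setup are illustrative.

```python
import numpy as np

def min_power_divergence_mean(x, beta=0.5, iters=200):
    """Estimate a Gaussian location by minimizing the density power
    (beta-) divergence, one member of the power-divergence family.
    Observations far from the current estimate receive exponentially
    small weights, so the fixed-point update is a robust weighted mean.
    As beta -> 0 the weights become uniform and the MLE is recovered."""
    mu = np.median(x)                          # robust starting point
    for _ in range(iters):
        w = np.exp(-beta * (x - mu) ** 2 / 2.0)
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),   # clean data around 0
                    rng.normal(10.0, 0.5, 5)])  # 5% gross outliers

mle = x.mean()                                  # minimum KL divergence = MLE
robust = min_power_divergence_mean(x, beta=0.5)
# the MLE is dragged toward the outliers; the robust estimate stays near 0
```

The choice beta = 0.5 trades some efficiency at the clean model for a sharply bounded influence of outliers, which is the flexibility the power generator is meant to provide.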

Statistical Methods for Imbalanced Data in Ecological and Biological Studies (Paperback, 1st ed. 2019)
Osamu Komori, Shinto Eguchi
R1,469 Discovery Miles 14 690 Ships in 10 - 15 working days

This book presents a fresh approach: it provides a comprehensive, up-to-date review of the challenging problems caused by imbalanced data in prediction and classification, and it introduces several of the latest statistical methods for dealing with them. The book discusses the imbalance of data from two points of view. The first is quantitative imbalance, in which the sample size of one population greatly outnumbers that of another. An extreme case is presence-only data, where the presence of a species is confirmed but information on its absence is uncertain, a situation especially common in ecology when predicting habitat distributions. The second is qualitative imbalance, in which the data distribution of one population can be well specified while that of the other is highly heterogeneous. A typical case is the presence of outliers commonly observed in gene expression data; another is the heterogeneity often observed in the case group of case-control studies. Extensions of the logistic regression model, Maxent, and AdaBoost to imbalanced data are discussed, providing a new framework for improving prediction, classification, and variable selection. Weight functions introduced in these methods play an important role in alleviating the imbalance of data. The book also offers a new perspective on these problems and presents applications of the recently developed statistical methods to real data sets.
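The role of weight functions under quantitative imbalance can be sketched with a simple class-weighted logistic regression, the textbook device rather than the book's specific extension; the function names, weights, and simulated presence/absence data below are illustrative assumptions:

```python
import numpy as np

def fit_logistic(X, y, w_pos=1.0, w_neg=1.0, lr=0.5, iters=5000):
    """Class-weighted logistic regression by gradient descent.
    Up-weighting the rare class is one simple way to counteract
    quantitative imbalance (a sketch, not the book's exact estimator)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])      # add intercept column
    theta = np.zeros(Xb.shape[1])
    w = np.where(y == 1, w_pos, w_neg)             # per-observation weights
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ theta))
        theta -= lr * Xb.T @ (w * (p - y)) / len(y)
    return theta

def recall(theta, X, y):
    """Fraction of true positives recovered at the 0.5 threshold."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    pred = 1.0 / (1.0 + np.exp(-Xb @ theta)) > 0.5
    return pred[y == 1].mean()

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (950, 1)),     # majority: absences
               rng.normal(1.5, 1.0, (50, 1))])     # minority: presences
y = np.concatenate([np.zeros(950), np.ones(50)])

plain = fit_logistic(X, y)
weighted = fit_logistic(X, y, w_pos=950 / 50)      # inverse-frequency weight
# the weighted fit moves the decision boundary toward the majority class,
# raising recall on the rare presences
```

With inverse-frequency weights the fitted boundary sits near the midpoint of the two class means instead of far into the minority tail, which is the effect the weight functions in these methods are designed to achieve.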


 
