Books > Science & Mathematics > Mathematics > Probability & statistics

The Data Science Framework - A View from the EDISON Project (Hardcover, 1st ed. 2020)
Juan J. Cuadrado-Gallego, Yuri Demchenko
R4,366 · Ships in 12 - 17 working days

This edited book first consolidates the results of the EU-funded EDISON project (Education for Data Intensive Science to Open New science frontiers), which developed training material and information to assist educators, trainers, employers, and research infrastructure managers in identifying, recruiting and inspiring the data science professionals of the future. It then deepens the presentation of the information and knowledge gained to allow for easier assimilation by the reader. The contributed chapters are presented in sequence, each chapter picking up from the end point of the previous one. After the initial book and project overview, the chapters present the relevant data science competencies and body of knowledge, the model curriculum required to teach the required foundations, profiles of professionals in this domain, and use cases and applications. The text is supported with appendices on related process models. The book can be used to develop new courses in data science, evaluate existing modules and courses, draft job descriptions, and plan and design efficient data-intensive research teams across scientific disciplines.

Elements of Computational Statistics (Hardcover, 1st ed. 2002, corr. 2nd printing 2005)
James E. Gentle
R4,463 · Ships in 12 - 17 working days

Computationally intensive methods have become widely used both for statistical inference and for exploratory analyses of data. The methods of computational statistics involve resampling, partitioning, and multiple transformations of a dataset. They may also make use of randomly generated artificial data. Implementation of these methods often requires advanced techniques in numerical analysis, so there is a close connection between computational statistics and statistical computing. This book describes techniques used in computational statistics, and addresses some areas of application of computationally intensive methods, such as density estimation, identification of structure in data, and model building. Although methods of statistical computing are not emphasized in this book, numerical techniques for transformations, for function approximation, and for optimization are explained in the context of the statistical methods. The book includes exercises, some with solutions. The book can be used as a text or supplementary text for various courses in modern statistics at the advanced undergraduate or graduate level, and it can also be used as a reference for statisticians who use computationally-intensive methods of analysis. Although some familiarity with probability and statistics is assumed, the book reviews basic methods of inference, and so is largely self-contained. James Gentle is University Professor of Computational Statistics at George Mason University. He is a Fellow of the American Statistical Association and a member of the International Statistical Institute. He has held several national offices in the American Statistical Association and has served as associate editor for journals of the ASA as well as for other journals in statistics and computing. He is the author of Random Number Generation and Monte Carlo Methods and Numerical Linear Algebra for Statistical Applications.
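
The resampling methods the blurb refers to can be illustrated with a minimal nonparametric bootstrap (a generic sketch on invented data, not an example from the book):

```python
import random
import statistics

random.seed(0)

# Hypothetical sample: 30 observations from an unknown distribution.
data = [random.gauss(10, 3) for _ in range(30)]

def bootstrap_se(sample, stat=statistics.mean, reps=2000):
    """Estimate the standard error of a statistic by repeatedly
    resampling the data with replacement and recomputing it."""
    n = len(sample)
    replicates = [
        stat([random.choice(sample) for _ in range(n)])
        for _ in range(reps)
    ]
    return statistics.stdev(replicates)

# For the mean, the bootstrap SE should track s / sqrt(n).
se = bootstrap_se(data)
print(round(se, 2))
```

The same `bootstrap_se` helper works unchanged for other statistics (median, trimmed mean), which is exactly the appeal of computationally intensive methods.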

Theory of Interacting Quantum Fields (Hardcover)
Alexei L Rebenko; Translated by Peter V Malyshev
R6,204 · Ships in 12 - 17 working days

This monograph is devoted to a systematic presentation of the foundations of quantum field theory. Unlike many monographs on this topic, it accompanies the wide range of problems it covers with clear interpretations and applications. An important feature of the monograph is the author's aim to present the mathematical problems of quantum field theory in light of the methods of constructive and Euclidean field theory that appeared in the last thirty years of the 20th century, which rest on the rigorous mathematical apparatus of functional analysis, operator theory, and the theory of generalized functions. The monograph is useful for students, post-graduate students, and young scientists who wish to understand not only the formal construction of quantum field theory but also its essence and its connections with classical mechanics, relativistic classical field theory, quantum mechanics, group theory, and the path-integral formalism.

Heavy-Tailed Distributions in Disaster Analysis (Hardcover, 2010 ed.)
V Pisarenko, M Rodkin
R2,902 · Ships in 10 - 15 working days

Mathematically, natural disasters of all types are characterized by heavy-tailed distributions. Analyzing such distributions with common tools, such as averages and dispersions, can therefore lead to erroneous conclusions. The statistical methods described in this book avoid such pitfalls. Seismic disasters are studied primarily, thanks to the availability of an ample statistical database. New approaches are presented to seismic risk estimation and to forecasting the damage caused by earthquakes, ranging from typical, moderate events to very rare, extreme disasters. Analysis of these latter events is based on the limit theorems of probability and on the duality of the generalized Pareto distribution and the generalized extreme value distribution. It is shown that the parameter most widely used to estimate seismic risk - Mmax, the maximum possible earthquake magnitude - is potentially non-robust. Robust analogues of this parameter are suggested and calculated for some seismic catalogues. Trends in the costs incurred through damage from natural disasters, as related to changing social and economic conditions, are examined for different regions.

The results obtained argue for sustainable development, whereas entirely different, incorrect conclusions can be drawn if the specific properties of the heavy-tailed distribution and change in completeness of data on natural hazards are neglected.

This pioneering work is directed at risk assessment specialists in general, seismologists, administrators and all those interested in natural disasters and their impact on society.
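
The core warning in this blurb, that averages mislead under heavy tails, is easy to demonstrate. The sketch below (illustrative only, not the book's method) draws Pareto samples by inverse-transform sampling; with tail index below 1 the theoretical mean is infinite, so the running average never settles as the sample grows:

```python
import random

random.seed(1)

def pareto_sample(alpha, n):
    """Draw n values from a Pareto distribution with tail index alpha
    via inverse-transform sampling: X = U**(-1/alpha), X >= 1."""
    return [random.random() ** (-1.0 / alpha) for _ in range(n)]

# alpha = 0.8 < 1: infinite mean. Watch the sample average drift
# upward instead of converging as n increases.
for n in (100, 10_000, 1_000_000):
    xs = pareto_sample(0.8, n)
    print(n, round(sum(xs) / n, 1))
```

For a tail index above 2 the same sampler produces well-behaved averages, which is why estimating the tail index correctly matters so much in disaster statistics.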

Geometric Modelling, Numerical Simulation, and Optimization - Applied Mathematics at SINTEF (Hardcover, 2007 ed.)
Geir Hasle, Knut-Andreas Lie, Ewald Quak
R2,899 · Ships in 10 - 15 working days

This edited volume addresses the importance of mathematics for industry and society by presenting highlights from contract research at the Department of Applied Mathematics at SINTEF, the largest independent research organization in Scandinavia. Examples range from computer-aided geometric design, via general purpose computing on graphics cards, to reservoir simulation for enhanced oil recovery. Contributions are written in a tutorial style.

Applied Quantitative Finance (Hardcover, 2nd ed. 2008)
Wolfgang Karl Hardle, Nikolaus Hautsch, Ludger Overbeck
R2,631 · Ships in 10 - 15 working days

Recent years have witnessed a growing importance of quantitative methods in both financial research and industry. This development requires the use of advanced techniques on a theoretical and applied level, especially when it comes to the quantification of risk and the valuation of modern financial products. Applied Quantitative Finance (2nd edition) provides a comprehensive and state-of-the-art treatment of cutting-edge topics and methods. It provides solutions to and presents theoretical developments in many practical problems such as risk management, pricing of credit derivatives, quantification of volatility and copula modelling. The synthesis of theory and practice supported by computational tools is reflected in the selection of topics as well as in a finely tuned balance of scientific contributions on practical implementation and theoretical concepts. This linkage between theory and practice offers theoreticians insights into considerations of applicability and, vice versa, provides practitioners comfortable access to new techniques in quantitative finance. Themes that are dominant in current research and which are presented in this book include, among others, the valuation of Collateralized Debt Obligations (CDOs), the high-frequency analysis of market liquidity, the pricing of Bermuda options and realized volatility. All Quantlets for the calculation of the given examples are downloadable from the Springer web pages.

The Mathematics of Paul Erdos I (Hardcover, 2nd ed. 2013)
Ronald L. Graham, Jaroslav Nesetril, Steve Butler
R4,342 · Ships in 10 - 15 working days

This is the most comprehensive survey of the mathematical life of the legendary Paul Erdos (1913-1996), one of the most versatile and prolific mathematicians of our time. For the first time, all the main areas of Erdos' research are covered in a single project. Because of overwhelming response from the mathematical community, the project now occupies over 1000 pages, arranged into two volumes. These volumes contain both high level research articles as well as key articles that survey some of the cornerstones of Erdos' work, each written by a leading world specialist in the field. A special chapter "Early Days", rare photographs, and art related to Erdos complement this striking collection. A unique contribution is the bibliography on Erdos' publications: the most comprehensive ever published. This new edition, dedicated to the 100th anniversary of Paul Erdos' birth, contains updates on many of the articles from the two volumes of the first edition, several new articles from prominent mathematicians, a new introduction, more biographical information about Paul Erdos, and an updated list of publications. The first volume contains the unique chapter "Early Days", which features personal memories of Paul Erdos by a number of his colleagues. The other three chapters cover number theory, random methods, and geometry. All of these chapters are essentially updated, most notably the geometry chapter that covers the recent solution of the problem on the number of distinct distances in finite planar sets, which was the most popular of Erdos' favorite geometry problems.

Inference on the Low Level - An Investigation into Deduction, Nonmonotonic Reasoning, and the Philosophy of Cognition (Hardcover, 2004 ed.)
Hannes Leitgeb
R4,452 · Ships in 12 - 17 working days

In contrast to the prevailing tradition in epistemology, the focus in this book is on low-level inferences, i.e., those inferences that we are usually not consciously aware of and that we share with the nearby cat, which infers that the bird she sees picking grains from the dirt is able to fly. Presumably, such inferences are not generated by explicit logical reasoning, but logical methods can be used to describe and analyze them.

Part 1 gives a purely system-theoretic explication of belief and inference. Part 2 adds a reliabilist theory of justification for inference, with a qualitative notion of reliability being employed. Part 3 recalls and extends various systems of deductive and nonmonotonic logic and thereby explains the semantics of absolute and high reliability. In Part 4 it is proven that qualitative neural networks are able to draw justified deductive and nonmonotonic inferences on the basis of distributed representations. This is derived from a soundness/completeness theorem with regard to cognitive semantics of nonmonotonic reasoning. The appendix extends the theory both logically and ontologically, and relates it to A. Goldman's reliability account of justified belief.

Regression with Linear Predictors (Hardcover, 2010 ed.)
Per Kragh Andersen, Lene Theil Skovgaard
R1,686 · Ships in 12 - 17 working days

This is a book about regression analysis, that is, the situation in statistics where the distribution of a response (or outcome) variable is related to explanatory variables (or covariates). This is an extremely common situation in the application of statistical methods in many fields, and linear regression, logistic regression, and Cox proportional hazards regression are frequently used for quantitative, binary, and survival time outcome variables, respectively. Several books on these topics have appeared and for that reason one may well ask why we embark on writing still another book on regression. We have two main reasons for doing this: 1. First, we want to highlight similarities among linear, logistic, proportional hazards, and other regression models that include a linear predictor. These models are often treated entirely separately in texts, in spite of the fact that all operations on the models dealing with the linear predictor are precisely the same, including handling of categorical and quantitative covariates, testing for linearity and studying interactions. 2. Second, we want to emphasize that, for any type of outcome variable, multiple regression models are composed of simple building blocks that are added together in the linear predictor: that is, t-tests, one-way analyses of variance and simple linear regressions for quantitative outcomes, 2x2, 2x(k+1) tables and simple logistic regressions for binary outcomes, and 2- and (k+1)-sample logrank tests and simple Cox regressions for survival data. This has two consequences. All these simple and well-known methods can be considered as special cases of the regression models. On the other hand, the effect of a single explanatory variable in a multiple regression model can be interpreted in a way similar to that obtained in the simple analysis, however, now valid only for the other explanatory variables in the model "held fixed."

Statistical Dynamics and Reliability Theory for Mechanical Structures (Hardcover, 2003 ed.)
Nikolay Reshetov; Valery A. Svetlitsky
R4,469 · Ships in 12 - 17 working days

The theory of random processes is an integral part of the analysis and synthesis of complex engineering systems. This textbook systematically presents the fundamentals of statistical dynamics and reliability theory. The theory of Markovian processes used during the analysis of random dynamic processes in mechanical systems is described in detail. Examples are machines, instruments and structures loaded with perturbations. The reliability and lifetime of those objects depend on how properly these perturbations are taken into account. Random vibrations with finite and infinite numbers of degrees of freedom are analyzed as well as the theory and numerical methods of non-stationary processes under the conditions of statistical indeterminacy. This textbook is addressed to students and post-graduates of technical universities. It can also be useful to lecturers and mechanical engineers, including designers in different industries.

Advanced Statistical Methods for Astrophysical Probes of Cosmology (Hardcover, 2013 ed.)
Marisa Cristina March
R1,862 · Ships in 12 - 17 working days

This thesis explores advanced Bayesian statistical methods for extracting key information for cosmological model selection, parameter inference and forecasting from astrophysical observations. Bayesian model selection provides a measure of how good models in a set are relative to each other - but what if the best model is missing and not included in the set? Bayesian Doubt is an approach which addresses this problem and seeks to deliver an absolute rather than a relative measure of how good a model is. Supernovae type Ia were the first astrophysical observations to indicate the late time acceleration of the Universe - this work presents a detailed Bayesian Hierarchical Model to infer the cosmological parameters (in particular dark energy) from observations of these supernovae type Ia.

Model-Free Prediction and Regression - A Transformation-Based Approach to Inference (Hardcover, 1st ed. 2015)
Dimitris N. Politis
R3,084 · Ships in 12 - 17 working days

The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the `big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. 
In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
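
The blurb's point about prediction intervals without normality can be illustrated with a generic residual bootstrap, in the spirit of (but not identical to) the book's Model-Free Bootstrap; the data and model below are invented for the sketch:

```python
import random

random.seed(2)

# Hypothetical data: y = 2x + noise, where the noise is skewed
# (shifted exponential), so a Gaussian interval would be wrong.
xs = [i / 10 for i in range(100)]
ys = [2 * x + random.expovariate(1.0) - 1.0 for x in xs]

# Ordinary least-squares fit of y = a + b*x.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# Residual bootstrap: build a 95% prediction interval for y at a new x
# from resampled residuals instead of a normality assumption.
resid = [y - (a + b * x) for x, y in zip(xs, ys)]
x_new = 5.0
draws = sorted(a + b * x_new + random.choice(resid) for _ in range(5000))
lo, hi = draws[int(0.025 * 5000)], draws[int(0.975 * 5000)]
print(round(lo, 2), round(hi, 2))
```

Because the residuals are skewed, the resulting interval is asymmetric around the fitted value, which a textbook "fit plus or minus two sigma" interval would miss.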

Linear Stochastic Systems - A Geometric Approach to Modeling, Estimation and Identification (Hardcover, 2015 ed.)
Anders Lindquist, Giorgio Picci
R4,637 · Ships in 10 - 15 working days

This book presents a treatise on the theory and modeling of second-order stationary processes, including an exposition on selected application areas that are important in the engineering and applied sciences. The foundational issues regarding stationary processes dealt with in the beginning of the book have a long history, starting in the 1940s with the work of Kolmogorov, Wiener, Cramer and his students, in particular Wold, and have since been refined and complemented by many others. Problems concerning the filtering and modeling of stationary random signals and systems have also been addressed and studied, fostered by the advent of modern digital computers, since the fundamental work of R.E. Kalman in the early 1960s. The book offers a unified and logically consistent view of the subject based on simple ideas from Hilbert space geometry and coordinate-free thinking. In this framework, the concepts of stochastic state space and state space modeling, based on the notion of the conditional independence of past and future flows of the relevant signals, are revealed to be fundamentally unifying ideas. The book, based on over 30 years of original research, represents a valuable contribution that will inform the fields of stochastic modeling, estimation, system identification, and time series analysis for decades to come. It also provides the mathematical tools needed to grasp and analyze the structures of algorithms in stochastic systems theory.

Fault-Diagnosis Applications - Model-Based Condition Monitoring: Actuators, Drives, Machinery, Plants, Sensors, and Fault-tolerant Systems (Hardcover)
Rolf Isermann
R3,727 · Ships in 12 - 17 working days

Supervision, condition-monitoring, fault detection, fault diagnosis and fault management play an increasing role for technical processes and vehicles in order to improve reliability, availability, maintenance and lifetime. For safety-related processes fault-tolerant systems with redundancy are required in order to reach comprehensive system integrity.

This book is a sequel to Fault-Diagnosis Systems, published in 2006, in which the basic methods were described. After a short introduction to fault-detection and fault-diagnosis methods, the book shows how these methods can be applied to a selection of 20 real technical components and processes, such as:

Electrical drives (DC, AC)

Electrical actuators

Fluidic actuators (hydraulic, pneumatic)

Centrifugal and reciprocating pumps

Pipelines (leak detection)

Industrial robots

Machine tools (main and feed drive, drilling, milling, grinding)

Heat exchangers

Realized fault-tolerant systems for electrical drives, actuators and sensors are also presented.

The book describes why and how the various signal-model-based and process-model-based methods were applied, and what experimental results were achieved. In several cases a combination of different methods proved most successful.

The book is intended for graduate students of electrical, mechanical and chemical engineering and of computer science, as well as for practicing engineers.

Bayesian Inference - Data Evaluation and Decisions (Hardcover, 2nd ed. 2016)
Hanns Ludwig Harney
R4,392 · Ships in 12 - 17 working days

This new edition offers a comprehensive introduction to the analysis of data using Bayes' rule. It generalizes Gaussian error intervals to situations in which the data follow distributions other than Gaussian. This is particularly useful when the observed parameter is barely above the background or the histogram of multiparametric data contains many empty bins, so that the determination of the validity of a theory cannot be based on the chi-squared criterion. In addition to the solutions of practical problems, this approach provides an epistemic insight: the logic of quantum mechanics is obtained as the logic of unbiased inference from counting data. New sections feature factorizing parameters, commuting parameters, observables in quantum mechanics, the art of fitting with coherent and with incoherent alternatives, and fitting with the multinomial distribution. Additional problems and examples help deepen the knowledge. Requiring no knowledge of quantum mechanics, the book is written at an introductory level, with many examples and exercises, for advanced undergraduate and graduate students in the physical sciences who plan to work, or are working, in fields such as medical physics, nuclear physics, quantum mechanics, and chaos.
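
"Inference from counting data" of the kind the blurb describes reduces to Bayes' rule applied to a count. A minimal grid sketch (invented numbers, flat prior; not taken from the book):

```python
import math

# Hypothetical counting experiment: k events observed, modelled as
# Poisson with unknown rate lam. Bayes' rule on a grid, flat prior.
k = 7
lams = [0.05 * i for i in range(1, 401)]          # grid over (0, 20]
likelihood = [math.exp(-lam) * lam ** k / math.factorial(k)
              for lam in lams]
norm = sum(likelihood)
posterior = [l / norm for l in likelihood]

# With a flat prior the posterior is Gamma(k + 1, 1), so the
# posterior mean should land near k + 1 = 8.
mean = sum(l * p for l, p in zip(lams, posterior))
print(round(mean, 1))
```

The same grid recipe extends directly to non-Gaussian error intervals: take posterior quantiles instead of quoting "mean plus or minus sigma".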

Heterogeneity in Statistical Genetics - How to Assess, Address, and Account for Mixtures in Association Studies (Hardcover, 1st ed. 2020)
Derek Gordon, Stephen J. Finch, Wonkuk Kim
R3,321 · Ships in 10 - 15 working days

Heterogeneity, or mixtures, are ubiquitous in genetics. Even for data as simple as mono-genic diseases, populations are a mixture of affected and unaffected individuals. Still, most statistical genetic association analyses, designed to map genes for diseases and other genetic traits, ignore this phenomenon. In this book, we document methods that incorporate heterogeneity into the design and analysis of genetic and genomic association data. Among the key qualities of our developed statistics is that they include mixture parameters as part of the statistic, a unique component for tests of association. A critical feature of this work is the inclusion of at least one heterogeneity parameter when performing statistical power and sample size calculations for tests of genetic association. We anticipate that this book will be useful to researchers who want to estimate heterogeneity in their data, develop or apply genetic association statistics where heterogeneity exists, and accurately evaluate statistical power and sample size for genetic association through the application of robust experimental design.

Hidden Markov Models in Finance (Hardcover, 2007 ed.)
Rogemar S. Mamon, Robert J Elliott
R2,904 · Ships in 10 - 15 working days

A number of methodologies have been employed to provide decision making solutions to a whole assortment of financial problems in today's globalized markets. Hidden Markov Models in Finance by Mamon and Elliott will be the first systematic application of these methods to some special kinds of financial problems; namely, pricing options and variance swaps, valuation of life insurance policies, interest rate theory, credit risk modeling, risk management, analysis of future demand and inventory level, testing foreign exchange rate hypothesis, and early warning systems for currency crises. This book provides researchers and practitioners with analyses that allow them to sort through the random noise of financial markets (i.e., turbulence, volatility, emotion, chaotic events, etc.) and analyze the fundamental components of economic markets. Hence, Hidden Markov Models in Finance provides decision makers with a clear, accurate picture of core financial components by filtering out the random noise in financial markets.
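
The "filtering out the random noise" idea can be sketched with the standard HMM forward algorithm on a toy two-regime market model; the states, transition matrix and volatilities below are illustrative assumptions, not parameters from the book:

```python
import math

# Toy regime model: returns are zero-mean Gaussian whose volatility
# depends on a hidden "calm" / "turbulent" market state.
states = ("calm", "turbulent")
trans = {"calm": {"calm": 0.95, "turbulent": 0.05},
         "turbulent": {"calm": 0.10, "turbulent": 0.90}}
sigma = {"calm": 0.5, "turbulent": 2.0}   # per-regime volatility

def emit(state, r):
    """Gaussian density of a zero-mean return r under the regime."""
    s = sigma[state]
    return math.exp(-r * r / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def forward_filter(returns):
    """Forward algorithm: filtered P(state | returns observed so far)."""
    belief = {s: 0.5 for s in states}
    for r in returns:
        # Predict step (apply transition), then update with the emission.
        pred = {s: sum(belief[q] * trans[q][s] for q in states)
                for s in states}
        unnorm = {s: pred[s] * emit(s, r) for s in states}
        z = sum(unnorm.values())
        belief = {s: unnorm[s] / z for s in states}
    return belief

# A run of large-magnitude returns should push belief toward "turbulent".
print(forward_filter([0.1, -0.2, 3.0, -2.5, 2.8]))
```

The filtered belief is exactly the "clear picture of core components" the blurb mentions: a probability over hidden regimes, updated one observation at a time.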

Statistical Methods for Disease Clustering (Hardcover, 2010 ed.)
Toshiro Tango
R3,175 · Ships in 10 - 15 working days

This book is intended to provide a text on statistical methods for detecting clusters and/or clustering of health events that is of interest to final-year undergraduate and graduate-level statistics, biostatistics, epidemiology, and geography students, but it will also be of relevance to public health practitioners, statisticians, biostatisticians, epidemiologists, medical geographers, human geographers, environmental scientists, and ecologists. Prerequisites are introductory biostatistics and epidemiology courses. With increasing public health concerns about environmental risks, the need for sophisticated methods for analyzing spatial health events is immediate. Furthermore, the research area of statistical tests for disease clustering now attracts a wide audience due to the perceived need to implement wide-ranging monitoring systems to detect possible health-related bioterrorism activity. Against this background, and with the development of the geographical information system (GIS), the analysis of disease clustering of health events has seen considerable development over the last decade. Accordingly, several excellent books on spatial epidemiology and statistics have recently been published. However, it seems to me that there is no other book solely focusing on statistical methods for disease clustering. I hope that readers will find this book useful and interesting as an introduction to the subject.

Stochastic Analysis and Mathematical Physics II - 4th International ANESTOC Workshop in Santiago, Chile (Hardcover, 2003 ed.)
Rolando Rebolledo
R1,498 · Ships in 10 - 15 working days

The seminar on Stochastic Analysis and Mathematical Physics of the Catholic University of Chile, started in Santiago in 1984, has been followed and enlarged since 1995 by a series of international workshops aimed at promoting a wide-spectrum dialogue between experts in the fields of classical and quantum stochastic analysis, mathematical physics, and physics. This volume collects most of the contributions to the Fourth International Workshop on Stochastic Analysis and Mathematical Physics (whose Spanish abbreviation is "ANESTOC"; in English, "STAMP"), held in Santiago, Chile, from January 5 to 11, 2000. The workshop style stimulated a vivid exchange of ideas, which finally led to a number of written contributions that I am glad to introduce here. However, we are currently subjected to a sort of invasion of proceedings volumes, and we do not want to add yet another one of the kind to our shelves. On the other hand, the editors of conference proceedings have to resort to various exhausting and compulsive strategies to persuade authors to write and deliver texts on time, a task which terrifies us. As a result, this volume is aimed at smoothly starting a new kind of publication. What we would like to have is a collection of books organized like our seminar.

An Introduction to Bayesian Analysis - Theory and Methods (Hardcover)
Jayanta K. Ghosh, Mohan Delampady, Tapas Samanta
R4,118 · Ships in 12 - 17 working days

This is a graduate-level textbook on Bayesian analysis blending modern Bayesian theory, methods, and applications. Starting from basic statistics, undergraduate calculus and linear algebra, ideas of both subjective and objective Bayesian analysis are developed to a level where real-life data can be analyzed using the current techniques of statistical computing.

Advances in both low-dimensional and high-dimensional problems are covered, as well as important topics such as empirical Bayes and hierarchical Bayes methods and Markov chain Monte Carlo (MCMC) techniques.

Many topics are at the cutting edge of statistical research. Solutions to common inference problems appear throughout the text along with discussion of what prior to choose. There is a discussion of elicitation of a subjective prior as well as the motivation, applicability, and limitations of objective priors. By way of important applications the book presents microarrays, nonparametric regression via wavelets as well as DMA mixtures of normals, and spatial analysis with illustrations using simulated and real data. Theoretical topics at the cutting edge include high-dimensional model selection and Intrinsic Bayes Factors, which the authors have successfully applied to geological mapping.

The style is informal but clear. Asymptotics is used to supplement simulation or understand some aspects of the posterior.
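The Markov chain Monte Carlo (MCMC) techniques mentioned above can be illustrated with a minimal random-walk Metropolis sampler. This is an illustrative sketch, not code from the book; the toy target density and tuning parameters are assumptions chosen for the example:

```python
import math
import random

def metropolis(log_post, n_samples, step=2.0, x0=0.0, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, step^2) and accept
    with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, draws = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept/reject on the log scale to avoid underflow
        if math.log(rng.random() + 1e-300) < log_post(proposal) - log_post(x):
            x = proposal
        draws.append(x)
    return draws

# Toy target: a standard normal "posterior" (log density up to a constant)
draws = metropolis(lambda x: -0.5 * x * x, 20000)
```

After discarding the first half as burn-in, the sample mean and variance of the draws approximate those of the target distribution (0 and 1 here).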

Recent Advances in Robust Statistics: Theory and Applications (Hardcover, 1st ed. 2016): Claudio Agostinelli, Ayanendranath... Recent Advances in Robust Statistics: Theory and Applications (Hardcover, 1st ed. 2016)
Claudio Agostinelli, Ayanendranath Basu, Peter Filzmoser, Diganta Mukherjee
R4,995 Discovery Miles 49 950 Ships in 12 - 17 working days

This book offers a collection of recent contributions and emerging ideas in robust statistics, presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata from 12 to 16 January 2015. The book explores the applicability of robust methods in non-traditional areas, including new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as a bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statistical methods. The aim of the ICORS conference, which has been organized annually since 2001, is to bring together researchers interested in robust statistics, data analysis, and related areas. The conference is meant for theoretical and applied statisticians, data analysts from other fields, leading experts, junior researchers, and graduate students. The ICORS meetings offer a forum for discussing recent advances and emerging ideas in statistics with a focus on robustness, and encourage informal contacts and discussions among all the participants. They also play an important role in maintaining a cohesive group of international researchers interested in robust statistics and related topics, whose interactions transcend the meetings and endure year round.

Mathematical Models in Epidemiology (Hardcover, 1st ed. 2019): Fred Brauer, Carlos Castillo-Chavez, Zhilan Feng Mathematical Models in Epidemiology (Hardcover, 1st ed. 2019)
Fred Brauer, Carlos Castillo-Chavez, Zhilan Feng
R2,439 Discovery Miles 24 390 Ships in 10 - 15 working days

The book is a comprehensive, self-contained introduction to the mathematical modeling and analysis of disease transmission models. It includes (i) an introduction to the main concepts of compartmental models including models with heterogeneous mixing of individuals and models for vector-transmitted diseases, (ii) a detailed analysis of models for important specific diseases, including tuberculosis, HIV/AIDS, influenza, Ebola virus disease, malaria, dengue fever and the Zika virus, (iii) an introduction to more advanced mathematical topics, including age structure, spatial structure, and mobility, and (iv) some challenges and opportunities for the future. There are exercises of varying degrees of difficulty, and projects leading to new research directions. For the benefit of public health professionals whose contact with mathematics may not be recent, there is an appendix covering the necessary mathematical background. There are indications which sections require a strong mathematical background so that the book can be useful for both mathematical modelers and public health professionals.
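As a flavor of the compartmental models introduced in point (i), here is a minimal sketch of the classic SIR model integrated by forward Euler. The code and parameter values are illustrative assumptions, not taken from the book:

```python
def sir_step(S, I, R, beta, gamma, dt):
    """One forward-Euler step of the classic SIR compartmental model:
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I,
    with S, I, R as population fractions (they always sum to 1)."""
    new_infections = beta * S * I * dt
    new_recoveries = gamma * I * dt
    return (S - new_infections,
            I + new_infections - new_recoveries,
            R + new_recoveries)

# Epidemic with basic reproduction number R0 = beta / gamma = 3
S, I, R = 0.99, 0.01, 0.0
for _ in range(2000):  # integrate to t = 200
    S, I, R = sir_step(S, I, R, beta=0.3, gamma=0.1, dt=0.1)
# By t = 200 the epidemic has burned out: I is near zero and most of
# the population has passed through the infected compartment into R.
```

Because an R0 of 3 exceeds the epidemic threshold of 1, the infection spreads before dying out, leaving only a small susceptible fraction.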

Introductory Business Statistics, International Edition (with Bind in Printed Access Card) (Paperback, International Edition):... Introductory Business Statistics, International Edition (with Bind in Printed Access Card) (Paperback, International Edition)
Ronald Weiers
R1,272 R1,147 Discovery Miles 11 470 Save R125 (10%) Ships in 10 - 15 working days

Highly praised for its exceptional clarity, conversational style and useful examples, Introductory Business Statistics, 7e, International Edition was written specifically for you. This proven, popular text cuts through the jargon to help you understand fundamental statistical concepts and why they are important to you, your world, and your career. The text's outstanding illustrations, friendly language, non-technical terminology, and current, real-world examples will capture your interest and prepare you for success right from the start.

Computational Intelligence, Optimization and Inverse Problems with Applications in Engineering (Hardcover, 1st ed. 2019):... Computational Intelligence, Optimization and Inverse Problems with Applications in Engineering (Hardcover, 1st ed. 2019)
Gustavo Mendes Platt, Xin-She Yang, Antonio Jose Silva Neto
R2,825 Discovery Miles 28 250 Ships in 10 - 15 working days

This book focuses on metaheuristic methods and their applications to real-world problems in engineering. The first part describes key metaheuristic methods such as Bat Algorithms, Particle Swarm Optimization, Differential Evolution, and Particle Collision Algorithms. Improved versions of these methods and strategies for parameter tuning are also presented, both of which are essential for the practical use of these important computational tools. The second part then applies metaheuristics to problems, mainly in Civil, Mechanical, Chemical, Electrical, and Nuclear Engineering. Other methods, such as the Flower Pollination Algorithm, Symbiotic Organisms Search, the Cross-Entropy Algorithm, Artificial Bee Colonies, Population-Based Incremental Learning, Cuckoo Search, and Genetic Algorithms, are also presented. The book is rounded out by recently developed strategies, or hybrid improved versions of existing methods, such as the Lightning Optimization Algorithm, Differential Evolution with Particle Collisions, and Ant Colony Optimization with Dispersion: state-of-the-art approaches for the application of computational intelligence to engineering problems. The wide variety of methods and applications, as well as the original results on problems of practical engineering interest, are the primary distinguishing qualities of this book. Furthermore, it gathers contributions by authors from four countries, some of whom are the original proponents of the methods presented, and 18 research centers around the globe.
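As a flavor of the population-based metaheuristics surveyed here, the following is a minimal sketch of Differential Evolution (the common DE/rand/1/bin variant) minimizing a toy objective. The implementation and parameter choices are illustrative assumptions, not code from the book:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin: for each target vector, build a mutant
    a + F*(b - c) from three distinct peers, cross it binomially with
    the target, and keep whichever of the two scores lower on f."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantee one mutant coordinate
            trial = [a[d] + F * (b[d] - c[d])
                     if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            if f(trial) <= f(pop[i]):  # greedy selection
                pop[i] = trial
    return min(pop, key=f)

# Minimize the 2-D sphere function f(x, y) = x^2 + y^2
best = differential_evolution(lambda v: sum(x * x for x in v),
                              [(-5.0, 5.0), (-5.0, 5.0)])
```

On this smooth convex test function the population contracts quickly toward the optimum at the origin; the harder engineering problems treated in the book are where parameter tuning and the hybrid variants become important.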

Resampling Methods - A Practical Guide to Data Analysis (Hardcover, 3rd ed. 2006): Phillip I Good Resampling Methods - A Practical Guide to Data Analysis (Hardcover, 3rd ed. 2006)
Phillip I Good
R2,685 Discovery Miles 26 850 Ships in 10 - 15 working days

"a ]the author has packaged an excellent and modern set of topics around the development and use of quantitative models.... If you need to learn about resampling, this book would be a good place to start."

a "Technometrics (Review of the Second Edition)

This thoroughly revised and expanded third edition is a practical guide to data analysis using the bootstrap, cross-validation, and permutation tests. Only requiring minimal mathematics beyond algebra, the book provides a table-free introduction to data analysis utilizing numerous exercises, practical data sets, and freely available statistical shareware.

Topics and Features

* Practical presentation covers both the bootstrap and permutations along with the program code necessary to put them to work.

* Includes a systematic guide to selecting the correct procedure for a particular application.

* Detailed coverage of classification, estimation, experimental design, hypothesis testing, and modeling.

* Suitable for both classroom use and individual self-study.

New to the Third Edition

* Procedures are grouped by application; a prefatory chapter guides readers to the appropriate reading matter.

* Program listings and screen shots now accompany each resampling procedure: Whether one programs in C++, CART, Blossom, Box Sampler (an Excel add-in), EViews, MATLAB, R, Resampling Stats, SAS macros, S-PLUS, Stata, or StatXact, readers will find the program listings and screen shots needed to put each resampling procedure into practice.

* To simplify programming, code for readers to download and apply is posted at http://www.springeronline.com/0-8176-4386-9.

* Notation has been simplified and, where possible, eliminated.

* A glossary and answers to selected exercises are included.

With its accessible style and intuitive topic development, the book is an excellent basic resource for the power, simplicity, and versatility of resampling methods. It is an essential resource for statisticians, biostatisticians, statistical consultants, students, and research professionals in the biological, physical, and social sciences, engineering, and technology.
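The percentile bootstrap at the heart of the book can be sketched in a few lines. This is an illustrative Python sketch (the data and helper names are invented for the example), not code drawn from the book or its listed packages:

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap: resample `data` with replacement n_boot
    times, compute `stat` on each resample, and return the empirical
    (alpha/2, 1 - alpha/2) percentile interval."""
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(n_boot))
    return (stats[int(n_boot * alpha / 2)],
            stats[int(n_boot * (1 - alpha / 2)) - 1])

sample = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5, 2.3, 2.7]
low, high = bootstrap_ci(sample, lambda xs: sum(xs) / len(xs))
# The interval brackets the sample mean (2.35)
```

The same resampling loop works for any plug-in statistic, which is exactly the versatility the book's procedures exploit.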

You may like...
Statistical Methods and Calculation…
Isabel Willemse, Peter Nyelisani Paperback R690 R608 Discovery Miles 6 080
Applied Business Statistics - Methods…
Trevor Wegner Paperback R759 R668 Discovery Miles 6 680
Abstract of Statistical Returns in…
Canada. Provincial Secretary's Office Hardcover R750 Discovery Miles 7 500
BI Statistical Methods - Volume I…
Peter Walley Hardcover R2,695 Discovery Miles 26 950
Statistics for Management and Economics
Gerald Keller, Nicoleta Gaciu Paperback R1,204 R1,088 Discovery Miles 10 880
Probability and Statistics - Pearson New…
Morris DeGroot, Mark Schervish Paperback R2,270 Discovery Miles 22 700
Basic mathematics for economics students…
D. Yu Paperback R345 R319 Discovery Miles 3 190
Statistical tables/Statistiese tabelle
D J Stoker Staple bound R260 R241 Discovery Miles 2 410
Best Books gegradeerde leesreeks: Vlak 1…
Best Books Paperback R90 R78 Discovery Miles 780
Basic mathematics for economics students…
Derek Yu Paperback R345 R319 Discovery Miles 3 190

 
