
Multilevel Modeling - Methodological Advances, Issues, and Applications (Hardcover)
Steven P. Reise, Naihua Duan
R4,511 Discovery Miles 45 110 Ships in 10 - 15 working days

This book illustrates the current work of leading multilevel modeling (MLM) researchers from around the world.

The book's goal is to critically examine the real problems that occur when trying to use MLMs in applied research, such as power, experimental design, and model violations. This presentation of cutting-edge work and statistical innovations in multilevel modeling includes topics such as growth modeling, repeated measures analysis, nonlinear modeling, outlier detection, and meta-analysis.

This volume will be beneficial for researchers with advanced statistical training and extensive experience in applying multilevel models, especially in the areas of education; clinical intervention; social, developmental and health psychology, and other behavioral sciences; or as a supplement for an introductory graduate-level course.

Mechanical Reliability Improvement - Probability and Statistics for Experimental Testing (Hardcover)
Robert Little
R4,980 Discovery Miles 49 800 Ships in 10 - 15 working days

Contains a compact disc with nearly 200 microcomputer programs illustrating a wide range of reliability and statistical analyses. Mechanical Reliability Improvement develops probability and statistical concepts using pseudorandom numbers. It presents enumeration-, simulation-, and randomization-based statistical analyses for comparing the test performance of alternative designs, as well as simulation- and randomization-based tests for examining the credibility of statistical presumptions. The book also discusses centroid and moment-of-inertia analogies for the mean and variance, and the organizational structure of completely randomized, randomized complete block, and split-plot experiment test programs.

Bayesian Economics Through Numerical Methods - A Guide to Econometrics and Decision-Making with Prior Information (Hardcover)
Jeffrey H Dorfman
R1,467 Discovery Miles 14 670 Ships in 18 - 22 working days

Providing researchers in economics, finance, and statistics with an up-to-date introduction to applying Bayesian techniques to empirical studies, this book covers the full range of the new numerical techniques which have been developed over the last thirty years. Notably, these are: Monte Carlo sampling, antithetic replication, importance sampling, and Gibbs sampling. The author covers both advances in theory and modern approaches to numerical and applied problems, and includes applications drawn from a variety of different fields within economics, while also providing a quick overview of the underlying statistical ideas of Bayesian thought. The result is a book which presents a roadmap of applied economic questions that can now be addressed empirically with Bayesian methods. Consequently, many researchers will find this a readable survey of this growing topic.
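The numerical techniques named in the blurb are easy to illustrate in miniature. The sketch below (my own illustration, not taken from the book) uses self-normalized importance sampling, one of the methods listed, to estimate a posterior mean; the target and proposal densities are toy choices.

```python
import math
import random

random.seed(0)

# Toy target: an unnormalized N(2, 1) density (standing in for a posterior).
def target_pdf(x):
    return math.exp(-0.5 * (x - 2.0) ** 2)

# Proposal: N(0, 3) -- easy to sample from and wider than the target.
def proposal_pdf(x):
    return math.exp(-0.5 * (x / 3.0) ** 2) / 3.0

def importance_mean(n=100_000):
    """Estimate the target's mean by weighting proposal draws with
    w = target / proposal and normalizing by the total weight."""
    num = den = 0.0
    for _ in range(n):
        x = random.gauss(0.0, 3.0)
        w = target_pdf(x) / proposal_pdf(x)  # importance weight
        num += w * x
        den += w
    return num / den

est = importance_mean()
print(est)  # should be close to the target mean, 2.0
```

Because the estimator is self-normalized, both densities may be left unnormalized (the 1/sqrt(2*pi) constants cancel), which is exactly what makes the method attractive for Bayesian posteriors known only up to a constant.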

SOLUTIONS MANUAL to Accompany Research Design and Statistical Analysis 2/e (Paperback, 2nd edition)
Jerome L Myers, Arnold D. Well
R776 Discovery Miles 7 760 Ships in 10 - 15 working days

First published in 2002. Routledge is an imprint of Taylor & Francis, an informa company.

Stochastic Differential Equations - With Applications to Physics and Engineering (Hardcover, 1991 ed.)
K. Sobczyk
R2,877 Discovery Miles 28 770 Ships in 18 - 22 working days

'Et moi, ..., si j'avait su comment en revenir, je n'y serais point alle.' (Jules Verne)

'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' (Eric T. Bell)

'The series is divergent; therefore we may be able to do something with it.' (O. Heaviside)

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote on the right above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series. This series, Mathematics and Its Applications, started in 1977. Now that over one hundred volumes have appeared it seems opportune to reexamine its scope. At the time I wrote: 'Growing specialization and diversification have brought a host of monographs and textbooks on increasingly specialized topics. However, the "tree" of knowledge of mathematics and related fields does not grow only by putting forth new branches. It also happens, quite often in fact, that branches which were thought to be completely ...'

The Ordered Weighted Averaging Operators - Theory and Applications (Hardcover, 1997 ed.)
Ronald R. Yager, J. Kacprzyk
R4,201 Discovery Miles 42 010 Ships in 18 - 22 working days

Aggregation plays a central role in many of the technological tasks we are faced with. The importance of this process will become even greater as we move more and more toward becoming an information-centered society, as is happening with the rapid growth of the Internet and the World Wide Web. Here we shall be faced with many issues related to the fusion of information. One very pressing issue is the development of mechanisms to help search for information, a problem that clearly has a strong aggregation-related component. More generally, in order to model the sophisticated ways in which human beings process information, as well as to go beyond human capabilities, we need to provide a basket of aggregation tools. The centrality of aggregation in human thought can be very clearly seen by looking at neural networks, a technology motivated by modeling the human brain: the basic operations involved in these networks are learning and aggregation. The Ordered Weighted Averaging (OWA) operators provide a parameterized family of aggregation operators which includes many of the well-known operators such as the maximum, the minimum, and the simple average.
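The closing sentence of the blurb is concrete enough to demonstrate in a few lines. This sketch (my own illustration with my own choice of weight vectors, not code from the book) shows how an OWA operator reduces to the maximum, the minimum, or the simple average depending on its weights.

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order,
    then form a weighted sum with a fixed weight vector summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

vals = [0.3, 0.9, 0.5, 0.1]
print(owa(vals, [1, 0, 0, 0]))  # all weight on the largest  -> maximum
print(owa(vals, [0, 0, 0, 1]))  # all weight on the smallest -> minimum
print(owa(vals, [0.25] * 4))    # equal weights              -> simple average
```

Intermediate weight vectors interpolate between these extremes, which is what makes the family useful for modeling "soft" aggregation attitudes between pure AND (min) and pure OR (max).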

Probability Theory (Hardcover)
Heinz Bauer; Translated by Robert B. Burckel
R4,546 Discovery Miles 45 460 Ships in 18 - 22 working days

The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics; in addition, they can serve as guides for lectures and seminars at the graduate level. The series de Gruyter Studies in Mathematics was founded ca. 35 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim of establishing a series of monographs and textbooks of high standard, written by scholars with an international reputation presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed over the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge, carefully written monographs and textbooks by experts are needed more than ever, not least to pave the way for the next generation of mathematicians. In this sense the editorial board and the publisher of the Studies are committed to continuing the Studies as a service to the mathematical community. Please submit any book proposals to Niels Jacob. Titles in planning include: Flavia Smarazzo and Alberto Tesei, Measure Theory: Radon Measures, Young Measures, and Applications to Parabolic Problems (2019); Elena Cordero and Luigi Rodino, Time-Frequency Analysis of Operators (2019); Mark M. Meerschaert, Alla Sikorskii, and Mohsen Zayernouri, Stochastic and Computational Models for Fractional Calculus, second edition (2020); Mariusz Lemanczyk, Ergodic Theory: Spectral Theory, Joinings, and Their Applications (2020); Marco Abate, Holomorphic Dynamics on Hyperbolic Complex Manifolds (2021); Miroslava Antic, Joeri Van der Veken, and Luc Vrancken, Differential Geometry of Submanifolds: Submanifolds of Almost Complex Spaces and Almost Product Spaces (2021); Kai Liu, Ilpo Laine, and Lianzhong Yang, Complex Differential-Difference Equations (2021); Rajendra Vasant Gurjar, Kayo Masuda, and Masayoshi Miyanishi, Affine Space Fibrations (2022).

Cybersecurity Analytics (Paperback)
Rakesh M. Verma, David J Marchette
R1,512 Discovery Miles 15 120 Ships in 9 - 17 working days

Cybersecurity Analytics is for the cybersecurity student and professional who wants to learn data science techniques critical for tackling cybersecurity challenges, and for the data science student and professional who wants to learn about cybersecurity adaptations. Trying to build a malware detector, a phishing email detector, or just interested in finding patterns in your datasets? This book lets you do it on your own. Numerous examples and dataset links are included so that the reader can "learn by doing." Anyone with a basic college-level calculus course and some probability knowledge can easily understand most of the material. The book includes chapters covering unsupervised learning, semi-supervised learning, supervised learning, text mining, natural language processing, and more. It also includes background on security, statistics, and linear algebra. The website for the book contains a listing of datasets, updates, and other resources for serious practitioners.

Visualizing Statistical Models And Concepts (Hardcover)
R.W. Farebrother, Michael Schyns
R4,080 Discovery Miles 40 800 Ships in 10 - 15 working days

"Examines classic algorithms, geometric diagrams, and mechanical principles for enhanced visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming."

Components of Variance (Hardcover)
D.R. Cox, P.J. Solomon
R3,645 Discovery Miles 36 450 Ships in 10 - 15 working days

Identifying the sources and measuring the impact of haphazard variations are important in any number of research applications, from clinical trials and genetics to industrial design and psychometric testing. Only in very simple situations can such variations be represented effectively by independent, identically distributed random variables or by random sampling from a hypothetical infinite population.

Components of Variance illuminates the complexities of the subject, setting forth its principles with focus on both the development of models for detailed analyses and the statistical techniques themselves. The authors first consider balanced and unbalanced situations, then move to the treatment of non-normal data, beginning with the Poisson and binomial models and followed by extensions to survival data and more general situations. In the final chapter, they discuss ways of extending and assessing various models, including the study of exceedances, the use of nonlinear representations, the study of transformations of the response variable, and the detailed examination of the distributional form of the underlying random variables.

Careful signposting and numerous examples from genetic data analysis, clinical trial design, longitudinal data analysis, industrial design, and meta-analysis make this book accessible - and valuable - not only to statisticians but to all applied research scientists who use statistical methods.
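For the simplest balanced case of the models described above, the one-way random-effects model, the classical ANOVA estimators of the two variance components fit in a few lines. The sketch below is my own illustration of that textbook estimator, not code from this book.

```python
import random

def variance_components(groups):
    """ANOVA (method-of-moments) estimators for the balanced one-way
    random-effects model y_ij = mu + a_i + e_ij.
    Returns (sigma2_between, sigma2_within)."""
    k = len(groups)           # number of groups
    n = len(groups[0])        # observations per group (balanced design)
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    # Between-group and within-group mean squares.
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((y - m) ** 2
              for g, m in zip(groups, means) for y in g) / (k * (n - 1))
    # E[MSB] = n * sigma2_between + sigma2_within; E[MSW] = sigma2_within.
    return max((msb - msw) / n, 0.0), msw

# Simulate data with known components: sigma_a = 2 (so variance 4), sigma_e = 1.
rng = random.Random(42)
groups = []
for _ in range(500):
    a = rng.gauss(0, 2)
    groups.append([a + rng.gauss(0, 1) for _ in range(5)])

sb, sw = variance_components(groups)
print(sb, sw)  # close to 4 and 1
```

The truncation at zero in the return line is the familiar wrinkle the book's "unbalanced and non-normal" chapters go well beyond: the naive moment estimator can be negative in small samples.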

Configural Frequency Analysis - Methods, Models, and Applications (Hardcover)
Alexander Von Eye
R4,530 Discovery Miles 45 300 Ships in 10 - 15 working days

"Configural Frequency Analysis" (CFA) provides an up-to-the-minute, comprehensive introduction to CFA techniques, models, and applications. Written in a formal yet accessible style, the book uses actual empirical data examples to illustrate key concepts. Step-by-step program sequences show readers how to employ CFA methods using commercial software packages, such as SAS, SPSS, SYSTAT, S-Plus, or programs written specifically to perform CFA.
CFA is an important method for analyzing categorical and longitudinal data. It allows one to answer the question of whether individual cells or groups of cells of cross-classifications differ significantly from expectations. The expectations are calculated using methods employed in log-linear modeling or a priori information. It is the only statistical method that allows one to make statements about empty areas in the data space.
Applied and/or person-oriented researchers, statisticians, and advanced students interested in CFA and categorical and longitudinal data will find this book to be a valuable resource. Developed since 1969, the method is now used by a large number of researchers around the world in a variety of disciplines, including psychology, education, medicine, and sociology. "Configural Frequency Analysis" will serve as an excellent text for courses on configural frequency analysis, categorical variable analysis, or analysis of contingency tables. Prerequisites include an understanding of descriptive statistics, hypothesis testing, statistical model fitting, and some understanding of categorical data analysis and matrix algebra.
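The core CFA computation described above, comparing observed cell counts of a cross-classification with expectations, can be sketched briefly. The example below is my own illustration (real CFA software offers several base models and test statistics); it derives expected counts from a row/column independence model and reports a standardized residual per cell.

```python
import math

def cfa_cells(table):
    """First-order CFA sketch for a two-way table: expected cell counts
    under row/column independence, plus a standardized (Pearson) residual
    z = (obs - exp) / sqrt(exp) per cell.  Cells with large positive z are
    candidate 'types', large negative z candidate 'antitypes'."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    out = []
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = rows[i] * cols[j] / n
            z = (obs - exp) / math.sqrt(exp)
            out.append(((i, j), obs, round(exp, 2), round(z, 2)))
    return out

# A 2x2 cross-classification with a clear diagonal pattern.
for cell in cfa_cells([[30, 10], [10, 30]]):
    print(cell)
```

Here every expected count is 20, so the diagonal cells stand out as types and the off-diagonal cells as antitypes; a full CFA would additionally adjust the per-cell tests for multiple comparisons.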

Configural Frequency Analysis - Methods, Models, and Applications (Paperback)
Alexander Von Eye
R2,007 Discovery Miles 20 070 Ships in 10 - 15 working days

"Configural Frequency Analysis" (CFA) provides an up-to-the-minute, comprehensive introduction to CFA techniques, models, and applications. Written in a formal yet accessible style, the book uses actual empirical data examples to illustrate key concepts. Step-by-step program sequences show readers how to employ CFA methods using commercial software packages, such as SAS, SPSS, SYSTAT, S-Plus, or programs written specifically to perform CFA.
CFA is an important method for analyzing categorical and longitudinal data. It allows one to answer the question of whether individual cells or groups of cells of cross-classifications differ significantly from expectations. The expectations are calculated using methods employed in log-linear modeling or a priori information. It is the only statistical method that allows one to make statements about empty areas in the data space.
Applied and/or person-oriented researchers, statisticians, and advanced students interested in CFA and categorical and longitudinal data will find this book to be a valuable resource. Developed since 1969, the method is now used by a large number of researchers around the world in a variety of disciplines, including psychology, education, medicine, and sociology. "Configural Frequency Analysis" will serve as an excellent text for courses on configural frequency analysis, categorical variable analysis, or analysis of contingency tables. Prerequisites include an understanding of descriptive statistics, hypothesis testing, statistical model fitting, and some understanding of categorical data analysis and matrix algebra.

Selected Works of R.M. Dudley (Hardcover)
Evarist Giné, Vladimir Koltchinskii, R. Norvaisa
R4,366 Discovery Miles 43 660 Ships in 18 - 22 working days

For almost fifty years, Richard M. Dudley has been extremely influential in the development of several areas of Probability. His work on Gaussian processes led to the understanding of the basic fact that their sample boundedness and continuity should be characterized in terms of proper measures of complexity of their parameter spaces equipped with the intrinsic covariance metric. His sufficient condition for sample continuity in terms of metric entropy is widely used and was proved by X. Fernique to be necessary for stationary Gaussian processes, whereas its more subtle versions (majorizing measures) were proved by M. Talagrand to be necessary in general.

Together with V. N. Vapnik and A. Y. Cervonenkis, R. M. Dudley is a founder of the modern theory of empirical processes in general spaces. His work on uniform central limit theorems (under bracketing entropy conditions and for Vapnik-Cervonenkis classes), greatly extends classical results that go back to A. N. Kolmogorov and M. D. Donsker, and became the starting point of a new line of research, continued in the work of Dudley and others, that developed empirical processes into one of the major tools in mathematical statistics and statistical learning theory.

As a consequence of Dudley's early work on weak convergence of probability measures on non-separable metric spaces, the Skorohod topology on the space of regulated right-continuous functions can be replaced, in the study of weak convergence of the empirical distribution function, by the supremum norm. In a further recent step, Dudley replaces this norm by the stronger p-variation norms, which then allows replacing compact differentiability of many statistical functionals by Fréchet differentiability in the delta method.
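The supremum-norm view of the empirical distribution function mentioned here is what underlies the familiar Kolmogorov-Smirnov statistic. A minimal sketch (my own illustration, unrelated to the volume's contents) computes sup_x |F_n(x) - F(x)| for a sample against a reference CDF.

```python
def ks_statistic(sample, cdf):
    """Sup-norm distance between the empirical CDF of `sample` and a
    reference CDF.  The supremum is attained at a data point, either just
    before or just after the jump, so it suffices to check both sides."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

# A small sample checked against the Uniform(0, 1) CDF.
print(ks_statistic([0.1, 0.4, 0.6, 0.9], lambda x: x))
```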

Richard M. Dudley has also made important contributions to mathematical statistics, the theory of weak convergence, relativistic Markov processes, differentiability of nonlinear operators and several other areas of mathematics.

Professor Dudley has been the adviser to thirty PhDs and is a Professor of Mathematics at the Massachusetts Institute of Technology.

Random Evolutions and Their Applications (Hardcover, 1997 ed.)
Anatoly Swishchuk
R1,521 Discovery Miles 15 210 Ships in 18 - 22 working days

The main purpose of this handbook is to summarize and put in order the ideas, methods, results and literature on the theory of random evolutions and their applications to evolutionary stochastic systems in random media, and also to present some new trends in the theory of random evolutions and their applications. In physical language, a random evolution (RE) is a model for a dynamical system whose state of evolution is subject to random variations. Such systems arise in all branches of science: for example, random Hamiltonian and Schrodinger equations with random potential in quantum mechanics, Maxwell's equation with a random refractive index in electrodynamics, and transport equations associated with the trajectory of a particle whose speed and direction change at random. These are examples of a single abstract situation in which an evolving system changes its "mode of evolution" or "law of motion" because of random changes of the "environment" or "medium." So, in mathematical language, a RE is a solution of a stochastic operator integral equation in a Banach space, whose operator coefficients depend on random parameters. Of course, in such generality, the equation includes any homogeneous linear evolving system. Particular examples of such equations were studied in physical applications many years ago; a general mathematical theory of such equations, the theory of random evolutions, has been developed since 1969.
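A concrete instance of the "particle whose speed and direction change at random" mentioned above is the telegraph process: velocity flips sign at the jump times of a Poisson process. The sketch below is an illustrative simulation of my own, not code from the handbook.

```python
import random

def telegraph_position(t_end, rate=1.0, speed=1.0, seed=0):
    """Position at time t_end of a particle whose velocity flips between
    +speed and -speed at the jumps of a rate-`rate` Poisson process -- a
    classic random evolution, tied to the telegraph equation."""
    rng = random.Random(seed)
    t, x, v = 0.0, 0.0, speed
    while True:
        wait = rng.expovariate(rate)      # exponential time to next flip
        if t + wait >= t_end:
            return x + v * (t_end - t)    # drift to t_end, no more flips
        x += v * wait
        t += wait
        v = -v                            # the "mode of evolution" changes

print(telegraph_position(5.0))
```

The particle's position is always bounded by speed * t_end, and averaging many such paths recovers the damped-wave (telegraph) behavior that distinguishes this model from ordinary diffusion.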

Analytical Methods in Statistics - AMISTAT, Prague, November 2015 (Hardcover, 1st ed. 2017)
Jaromír Antoch, Jana Jureckova, Matus Maciak, Michal Pesta
R3,252 Discovery Miles 32 520 Ships in 10 - 15 working days

This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, eventually under shape or other constraints or long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.

Analyzing Data Through Probabilistic Modeling in Statistics (Hardcover)
Dariusz Jacek Jakobczak
R6,170 Discovery Miles 61 700 Ships in 18 - 22 working days

Probabilistic modeling is a subject arising in many branches of mathematics, economics, and computer science, connecting pure mathematics with the applied sciences. Data analysis and statistics likewise sit on the border between pure mathematics and the applied sciences, and the meeting of probabilistic modeling with statistics has attracted much research recently. With the spread of these technologies in life and work, planning, timetabling, scheduling, decision making, optimization, simulation, data analysis, risk analysis, and process modeling have become essential in the workplace. However, many difficulties and challenges still arise in these sectors during planning and decision making, and there remains a need for more research on combining probabilistic modeling with other approaches. Analyzing Data Through Probabilistic Modeling in Statistics is an essential reference source that builds on the available literature in the field of probabilistic modeling, statistics, operational research, planning and scheduling, data extrapolation in decision making, probabilistic interpolation and extrapolation in simulation, stochastic processes, and decision analysis. It provides resources for the economics and management sciences as well as for mathematics and computer science. This book is ideal for technology developers, decision makers, mathematicians, statisticians, practitioners, stakeholders, researchers, academicians, and students looking to further their exposure to pertinent topics in operations research and probabilistic modeling.

Robust and Multivariate Statistical Methods - Festschrift in Honor of David E. Tyler (Hardcover, 1st ed. 2023)
Mengxi Yi, Klaus Nordhausen
R4,969 Discovery Miles 49 690 Ships in 10 - 15 working days

This book presents recent developments in multivariate and robust statistical methods. Featuring contributions by leading experts in the field it covers various topics, including multivariate and high-dimensional methods, time series, graphical models, robust estimation, supervised learning and normal extremes. It will appeal to statistics and data science researchers, PhD students and practitioners who are interested in modern multivariate and robust statistics. The book is dedicated to David E. Tyler on the occasion of his pending retirement and also includes a review contribution on the popular Tyler’s shape matrix.

Trends of Artificial Intelligence and Big Data for E-Health (Hardcover, 1st ed. 2022)
Houneida Sakly, Kristen Yeom, Safwan Halabi, Mourad Said, Jayne Seekins, …
R3,778 Discovery Miles 37 780 Ships in 18 - 22 working days

This book aims to present the impact of Artificial Intelligence (AI) and Big Data in healthcare for medical decision making and data analysis in myriad fields including Radiology, Radiomics, Radiogenomics, Oncology, Pharmacology, COVID-19 prognosis, Cardiac imaging, Neuroradiology, Psychiatry and others. This includes topics such as the Artificial Intelligence of Things (AIoT), Explainable Artificial Intelligence (XAI), distributed learning, the Blockchain of Internet of Things (BIoT), cybersecurity, and the Internet of (Medical) Things (IoT). Healthcare providers will learn how to leverage Big Data analytics and AI as a methodology for accurate analysis based on their clinical data repositories and clinical decision support. The capacity to recognize patterns and transform large amounts of data into usable information for precision medicine assists healthcare professionals in achieving these objectives. Intelligent health systems have the potential to monitor patients at risk with underlying conditions and track their progress during therapy. Some of the greatest challenges in using these technologies are based on legal and ethical concerns of using medical data and adequately representing and servicing disparate patient populations. One major potential benefit of this technology is to make health systems more sustainable and standardized. Privacy and data security, establishing protocols, appropriate governance, and improving technologies will be among the crucial priorities for digital transformation in healthcare.

Analysis of Failure and Survival Data (Paperback)
Peter J. Smith; Series edited by Chris Chatfield, Jim Zidek, Jim Lindsey
R3,098 Discovery Miles 30 980 Ships in 10 - 15 working days

Analysis of Failure and Survival Data is an essential textbook for graduate-level students of survival analysis and reliability and a valuable reference for practitioners. It focuses on the many techniques that appear in popular software packages, including plotting product-limit survival curves, hazard plots, and probability plots in the context of censored data. The author integrates S-Plus and Minitab output throughout the text, along with a variety of real data sets so readers can see how the theory and methods are applied. He also incorporates exercises in each chapter that provide valuable problem-solving experience.

The book also brings to light the most recent linear regression techniques, most importantly a definitive account of the Buckley-James method for censored linear regression, found to be the best-performing method when a Cox proportional hazards model is not appropriate.

Applying the theories of survival analysis and reliability requires more background and experience than students typically receive at the undergraduate level. Mastering the contents of this book will help prepare students to begin performing research in survival analysis and reliability and provide seasoned practitioners with a deeper understanding of the field.
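The product-limit (Kaplan-Meier) survival curves mentioned in the description can be computed by hand in a few lines. The sketch below is my own minimal implementation for illustration; the book itself works with S-Plus and Minitab output.

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t): at each
    observed event time, multiply the running survival by (1 - d/n),
    where d = number of events and n = number still at risk.
    `events[i]` is 1 for an observed failure, 0 for a censored time."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    at_risk = len(data)
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk
            curve.append((t, s))                   # curve steps down here
        at_risk -= d + c                           # censored leave the risk set
    return curve

# Five subjects; the second time-2 observation is censored.
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 1]))
```

Note how the censored observation at time 2 does not step the curve down itself but does shrink the risk set for the later event times, which is the essence of handling censored data correctly.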

Binomial Distribution Handbook for Scientists and Engineers (Hardcover, 2001 ed.)
E. Von Collani, Klaus Drager
R2,709 Discovery Miles 27 090 Ships in 18 - 22 working days

This book deals with estimating and testing the probability of an event. It aims at providing practitioners with refined and easy to use techniques as well as initiating a new field of research in theoretical statistics. Practical, comprehensive tables for data analysis of the experimental state of investigations are included, as well as an accompanying CD-ROM with extensive tables for measurement intervals and prediction regions for testing. Statisticians and practitioners will find this book an essential reference.
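As a flavor of "refined and easy to use techniques" for estimating the probability of an event, the sketch below computes the Wilson score interval for a binomial proportion. This is a standard refinement of the naive Wald interval, shown purely for illustration; it is not necessarily the procedure the handbook tabulates.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion.
    Unlike the Wald interval p-hat +/- z*sqrt(p(1-p)/n), it stays inside
    [0, 1] and behaves well for small n or extreme p-hat."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# 8 successes in 10 trials, 95% confidence (z = 1.96).
lo, hi = wilson_interval(8, 10)
print(lo, hi)  # roughly (0.49, 0.94)
```

Note how the interval is pulled toward 1/2 relative to the point estimate 0.8, which is exactly the correction that keeps its coverage close to nominal at small sample sizes.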

The Structural Theory of Probability - New Ideas from Computer Science on the Ancient Problem of Probability Interpretation (Hardcover, 2003 ed.)
Paolo Rocchi
R2,741 Discovery Miles 27 410 Ships in 18 - 22 working days

The Structural Theory of Probability addresses the interpretation of probability, often debated in the scientific community. This problem has been examined for centuries; perhaps no other mathematical calculation suffuses mankind's efforts at survival as amply as probability. In the dawn of the 20th century David Hilbert included the foundations of the probability calculus within the most vital mathematical problems; Dr. Rocchi's topical and ever-timely volume proposes a novel, exhaustive solution to this vibrant issue.

Paolo Rocchi, a versatile IBM scientist, outlines a new philosophical and mathematical approach inspired by well-tested software techniques. Through the prism of computer technology he provides an innovative view on the theory of probability. Dr. Rocchi discusses in detail the mathematical tools used to clarify the meaning of probability, integrating with care numerous examples and case studies. The comprehensiveness and originality of its mathematical development make this volume an inspiring read for researchers and students alike.

From a review by the Mathematical Association of America Online: "[The author's] basic thesis is this: Probability theory from Pascal to Kolmogorov and onwards has focused on events as sets of outcomes or results, and probability as a measure attached to these sets. But this ignores the structure of the processes which lead to the outcomes, and the author explores how taking into account the details of the processes would lead to a more fundamental understanding of the nature of probability. This is an interesting idea, and the author makes it clear that at present this is a work in progress and not yet a finished product, for he says that he has tried to give "an impulse in the right direction" with his theory. ... One hopes that in due course the author will develop his theories further and present overwhelmingly persuasive examples of the advantages of his approach." - Ramachandran Bharath

Theories of Meaningfulness (Hardcover)
Louis Narens
R4,248 Discovery Miles 42 480 Ships in 10 - 15 working days

Written by one of the masters of the foundation of measurement, Louis Narens' new book thoroughly examines the basis for the measurement-theoretic concept of meaningfulness and presents a new theory about the role of numbers and invariance in science. The book associates with each portion of mathematical science a subject matter that the portion of science is intended to investigate or describe. It considers those quantitative or empirical assertions and relationships that belong to the subject matter to be meaningful (for that portion of science) and those that do not belong to be meaningless.
The first two chapters of Theories of Meaningfulness introduce meaningfulness concepts, their place in the history of science, and some of their traditional applications. The idea that meaningfulness will have different, but interrelated, uses is then introduced. To provide formal descriptions of these, the author employs a powerful framework that incorporates pure mathematics, provides for qualitative objects and relations, and addresses the relationships between qualitative objects and pure mathematics. The framework is then applied to produce axiomatic theories of meaningfulness, including generalizations of, and a new foundation for, the famous Erlanger Program of mathematics. The meaningfulness concept is further specialized with the introduction of intrinsicness, which deals with meaningful concepts and relations that are lawful, and qualitativeness, which is concerned with qualitative concepts. The concept of empiricalness is then introduced and distinguished from meaningfulness and qualitativeness.
The failure to distinguish empiricalness from meaningfulness and qualitativeness has produced much confusion in the foundations of science literature and has generated many pseudo-controversies. This book suggests that many of these disappear when empiricalness is intersected with the other concepts to produce "meaningful and empirical relations," "empirical laws," and "qualitative and empirical concepts."
A primary goal of this book is to show that the new theories of meaningfulness and intrinsicness developed here are not only descriptive but also potent. Asserting that they do more than codify already existing concepts, the book:
*works out logical relationships between meaningfulness concepts that were previously unrecognized;
*clarifies certain well-known and important debates by providing rich languages with new concepts and technical results (theorems) that yield insights into the debated issues and positions taken on them; and
*provides new techniques and results in substantive scientific areas of inquiry.
This book is about the role of mathematics in science. It will be useful to those concerned with the foundations of science in their respective fields. Various substantive examples from the behavioral sciences are presented.
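The invariance idea at the heart of meaningfulness can be sketched in a few lines of Python. This example is mine, not from the book: the temperature values, the particular rescaling, and the variable names are illustrative assumptions. A statement about interval-scale quantities is meaningful only if its truth value survives every admissible transformation x -> a*x + b with a > 0; here we spot-check one such transformation.

```python
# One admissible transformation of an interval scale (a > 0).
def rescale(x, a=2.0, b=5.0):
    return a * x + b

t1, t2, t3 = 10.0, 20.0, 40.0

# "t3 is twice t2": the truth value flips under rescaling -> meaningless
# on an interval scale (ratios of values are not invariant).
ratio_before = (t3 == 2 * t2)                    # 40 == 40
ratio_after = (rescale(t3) == 2 * rescale(t2))   # 85 != 90

# "t3 - t2 is twice t2 - t1": invariant under rescaling -> meaningful
# (ratios of differences survive any a*x + b with a > 0).
diff_before = (t3 - t2) == 2 * (t2 - t1)
diff_after = (rescale(t3) - rescale(t2)) == 2 * (rescale(t2) - rescale(t1))

print(ratio_before, ratio_after, diff_before, diff_after)
```

Assertions like these are exactly the kind of quantitative claim the book's meaningfulness concept is designed to classify.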

Estimation and Inferential Statistics (Hardcover, 1st ed. 2015): Pradip Kumar Sahu, Santi Ranjan Pal, Ajit Kumar Das Estimation and Inferential Statistics (Hardcover, 1st ed. 2015)
Pradip Kumar Sahu, Santi Ranjan Pal, Ajit Kumar Das
R2,320 Discovery Miles 23 200 Ships in 10 - 15 working days

This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimating population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers find the most suitable application. Statistical tools are presented using real-life examples, removing the "fear factor" usually associated with this complex subject. The book will help readers discover diverse perspectives of statistical theory, followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.
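As a minimal illustration of the estimation problems the book treats, the sketch below computes a point estimate of a population mean and a large-sample confidence interval. The sample data, the normal approximation, and all names are my own assumptions for the sake of a self-contained example; the book's worked examples differ.

```python
import math
from statistics import mean, stdev, NormalDist

sample = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7, 12.4, 12.1]

n = len(sample)
xbar = mean(sample)                # point estimate of the population mean
se = stdev(sample) / math.sqrt(n)  # estimated standard error of the mean

# 95% confidence interval via the large-sample normal approximation.
z = NormalDist().inv_cdf(0.975)
ci = (xbar - z * se, xbar + z * se)

print(f"estimate={xbar:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f})")
```

For small samples one would normally replace the normal quantile with a Student-t quantile; the structure of the interval is the same.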

Robust Statistics - Theory and Methods (Hardcover): RA Maronna Robust Statistics - Theory and Methods (Hardcover)
RA Maronna
R2,585 Discovery Miles 25 850 Ships in 10 - 15 working days

Classical statistical techniques fail to cope well with deviations from a standard distribution. Robust statistical methods take into account these deviations while estimating the parameters of parametric models, thus increasing the accuracy of the inference. Research into robust methods is flourishing, with new methods being developed and different applications considered.
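The basic robustness idea can be seen in a tiny Python sketch (mine, not the book's, which works in S-Plus; the data are invented): a single gross outlier drags the sample mean far from the bulk of the data, while robust location estimates such as the median or a trimmed mean are barely affected.

```python
import statistics

# Well-behaved measurements plus one gross outlier (e.g. a recording error).
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 100.0]

sample_mean = statistics.mean(data)    # pulled toward the outlier
sample_median = statistics.median(data)  # essentially unaffected

# A simple trimmed mean: drop the smallest and largest observation.
trimmed_mean = statistics.mean(sorted(data)[1:-1])

print(f"mean={sample_mean:.2f} median={sample_median:.2f} "
      f"trimmed={trimmed_mean:.2f}")
```

The mean here lands above 21 while both robust estimates stay at 10.0, which is the phenomenon robust estimators of parametric models generalize to regression and multivariate settings.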

"Robust Statistics" sets out to explain the use of robust methods and their theoretical justification. It provides an up-to-date overview of the theory and practical application of the robust statistical methods in regression, multivariate analysis, generalized linear models and time series. This unique book: Enables the reader to select and use the most appropriate robust method for their particular statistical model. Features computational algorithms for the core methods. Covers regression methods for data mining applications. Includes examples with real data and applications using the S-Plus robust statistics library. Describes the theoretical and operational aspects of robust methods separately, so the reader can choose to focus on one or the other. Supported by a supplementary website featuring time-limited S-Plus download, along with datasets and S-Plus code to allow the reader to reproduce the examples given in the book.

"Robust Statistics" aims to stimulate the use of robust methods as a powerful tool to increase the reliability and accuracy of statistical modelling and data analysis. It is ideal for researchers, practitioners and graduate students of statistics, electrical, chemical and biochemical engineering, and computer vision. There is also much to benefit researchers from other sciences, suchas biotechnology, who need to use robust statistical methods in their work.

Asymptotic Behaviour of Linearly Transformed Sums of Random Variables (Hardcover, 1997 ed.): V.V. Buldygin, Serguei Solntsev Asymptotic Behaviour of Linearly Transformed Sums of Random Variables (Hardcover, 1997 ed.)
V.V. Buldygin, Serguei Solntsev
R2,933 Discovery Miles 29 330 Ships in 18 - 22 working days

This book deals with the almost sure asymptotic behaviour of linearly transformed sequences of independent random variables, vectors and elements of topological vector spaces. The main subjects treated here, concerning series of independent random elements in topological vector spaces (in particular, sequence spaces) and generalized summability methods, are: strong limit theorems for operator-normed (matrix-normed) sums of independent finite-dimensional random vectors and their applications; almost sure asymptotic behaviour of realizations of one-dimensional and multi-dimensional Gaussian Markov sequences; various conditions providing almost sure continuity of sample paths of Gaussian Markov processes; and almost sure asymptotic behaviour of solutions of one-dimensional and multi-dimensional stochastic recurrence equations of special interest. Many topics, especially those related to strong limit theorems for operator-normed sums of independent random vectors, appear in the monographic literature for the first time. Audience: the book is aimed at experts in probability theory, the theory of random processes and mathematical statistics who are interested in almost sure asymptotic behaviour in summability schemes, such as operator-normed sums and weighted sums. Numerous sections will be of use to those who work on Gaussian processes, stochastic recurrence equations, and probability theory in topological vector spaces. As the exposition of the material is consistent and self-contained, it can also be recommended as a textbook for university courses.
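The simplest instance of the almost sure asymptotics studied in the book is the strong law of large numbers, where the "linear transformation" is just division by n. The short simulation below (entirely my own construction; the book's results concern far more general operator normings) shows one sample path of normalized sums settling near the mean 1/2 of a Uniform(0, 1) variable.

```python
import random

random.seed(0)  # fix one sample path for reproducibility

# Running normalized sums S_n / n of i.i.d. Uniform(0, 1) variables.
# By the strong law of large numbers, almost every path converges to 1/2.
n_max = 100_000
checkpoints = (10, 1_000, 100_000)
s = 0.0
averages = []
for n in range(1, n_max + 1):
    s += random.random()
    if n in checkpoints:
        averages.append(s / n)

for n, avg in zip(checkpoints, averages):
    print(f"n={n:>6}: S_n/n = {avg:.4f}")
```

Replacing the scalar normalization 1/n with a sequence of matrices acting on random vectors gives the operator-normed sums that are the book's actual subject.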
