
Strategies for Quasi-Monte Carlo (Hardcover, 1999 ed.)
Bennett L. Fox
R4,585. Ships in 10-15 working days.

Strategies for Quasi-Monte Carlo builds a framework to design and analyze strategies for randomized quasi-Monte Carlo (RQMC). One key to efficient simulation using RQMC is to structure problems to reveal a small set of important variables, their number being the effective dimension, while the other variables collectively are relatively insignificant. Another is smoothing. The book provides many illustrations of both keys, in particular for problems involving Poisson processes or Gaussian processes. RQMC beats grids by a huge margin. With low effective dimension, RQMC is an order of magnitude more efficient than standard Monte Carlo. With, in addition, certain smoothness - perhaps induced - RQMC is an order of magnitude more efficient than deterministic QMC. Unlike the latter, RQMC permits error estimation via the central limit theorem. For random-dimensional problems, such as occur with discrete-event simulation, RQMC is judiciously combined with standard Monte Carlo to keep memory requirements bounded. This monograph has been designed to appeal to a diverse audience, including those with applications in queueing, operations research, computational finance, mathematical programming, partial differential equations (both deterministic and stochastic), and particle transport, as well as to probabilists and statisticians wanting to know how to apply a powerful tool effectively, and to those interested in numerical integration or optimization in their own right. It recognizes that the heart of practical application is algorithms, so pseudocodes appear throughout the book. While not primarily a textbook, it is suitable as a supplementary text for certain graduate courses. As a reference, it belongs on the shelf of everyone with a serious interest in improving simulation efficiency.
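
The randomization idea described here fits in a few lines. The following Python snippet is an illustrative sketch, not taken from the book: it applies independent random shifts (a Cranley-Patterson rotation) to a base-2 van der Corput point set, so the replicate mean is unbiased and its error can be judged via the central limit theorem, exactly the property that distinguishes RQMC from deterministic QMC. The integrand and all parameter values are chosen for illustration only.

```python
import math
import random

def van_der_corput(n, base=2):
    """First n points of the base-2 van der Corput low-discrepancy sequence."""
    pts = []
    for i in range(n):
        q, denom, x = i, 1, 0.0
        while q:
            q, r = divmod(q, base)
            denom *= base
            x += r / denom
        pts.append(x)
    return pts

def rqmc_estimate(f, n=256, replicates=16, seed=0):
    """Randomized QMC: average f over independently shifted copies (mod 1)
    of one low-discrepancy point set; each replicate is unbiased, so the
    spread of replicate means gives a CLT-based standard error."""
    rng = random.Random(seed)
    base_pts = van_der_corput(n)
    means = []
    for _ in range(replicates):
        shift = rng.random()
        means.append(sum(f((x + shift) % 1.0) for x in base_pts) / n)
    m = sum(means) / replicates
    var = sum((v - m) ** 2 for v in means) / (replicates - 1)
    return m, math.sqrt(var / replicates)

# estimate the integral of e^x over [0,1]; the exact value is e - 1
est, se = rqmc_estimate(math.exp)
```

On a smooth one-dimensional integrand like this, the replicate standard error sits well below what plain Monte Carlo achieves with the same total sample budget.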

Probability on Compact Lie Groups (Hardcover, 2014 ed.)
David Applebaum; Foreword by Herbert Heyer
R3,068 R2,520 (save R548, 18%). Ships in 12-19 working days.

Probability theory on compact Lie groups deals with the interaction between chance and symmetry, a beautiful area of mathematics of great interest in its own right which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field, and is justified by the wealth of interesting results at this level and the importance of these groups for applications.

The book is primarily aimed at researchers working in probability, stochastic analysis and harmonic analysis on groups. It will also be of interest to mathematicians working in Lie theory, and to physicists, statisticians and engineers who are working on related applications. A background in first-year graduate-level measure-theoretic probability and functional analysis is essential; a background in Lie groups and representation theory is certainly helpful, but the first two chapters also offer orientation in these subjects.

Uncertain Input Data Problems and the Worst Scenario Method, Volume 46 (Hardcover, New)
Ivan Hlavacek, Jan Chleboun, Ivo Babuska; Volume editing by Jan Achenbach
R4,124. Ships in 12-19 working days.

This book deals with the impact of uncertainty in input data on the outputs of mathematical models. Uncertain inputs such as scalars, tensors, functions, or domain boundaries are considered. In practical terms, material parameters or constitutive laws, for instance, are uncertain, and quantities such as local temperature, local mechanical stress, or local displacement are monitored. The goal of the worst scenario method is to extremize the quantity over the set of uncertain input data.
A general mathematical scheme of the worst scenario method, including approximation by finite element methods, is presented, and then applied to various state problems modeled by differential equations or variational inequalities: nonlinear heat flow, Timoshenko beam vibration and buckling, plate buckling, contact problems in elasticity and thermoelasticity with and without friction, and various models of plastic deformation, to list some of the topics. Dozens of examples, figures, and tables are included.
Although the book concentrates on the mathematical aspects of the subject, a substantial part is written in an accessible style and is devoted to various facets of uncertainty in modeling and to the state of the art techniques proposed to deal with uncertain input data.
A chapter on sensitivity analysis and on functional and convex analysis is included for the reader's convenience.
- Rigorous theory is established for the treatment of uncertainty in modeling
- Uncertainty is considered in complex models based on partial differential equations or variational inequalities
- Applications to nonlinear and linear problems with uncertain data are presented in detail: quasilinear steady heat flow, buckling of beams and plates, vibration of beams, frictional contact of bodies, several models of plastic deformation, and more
- Although emphasis is put on theoretical analysis and approximation techniques, numerical examples are also present
- Main ideas and approaches used today to handle uncertainties in modeling are described in an accessible form
- Fairly self-contained book
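
As a toy illustration of the worst scenario idea (a sketch, not an example from the book): take the steady 1-D heat equation with a constant but uncertain conductivity k, monitor the midpoint temperature, and extremize it over the admissible interval by a simple grid search standing in for the finite-element machinery. All numbers are hypothetical.

```python
def midpoint_temperature(k, f=1.0):
    """Closed-form solution of -(k u')' = f on (0,1) with u(0)=u(1)=0 is
    u(x) = f*x*(1-x)/(2k); evaluated at the midpoint x = 1/2 it is f/(8k)."""
    return f / (8.0 * k)

def worst_scenario(quantity, k_lo, k_hi, n=101):
    """Worst scenario method in miniature: maximize the monitored quantity
    over the admissible set [k_lo, k_hi] of uncertain conductivities."""
    ks = [k_lo + (k_hi - k_lo) * i / (n - 1) for i in range(n)]
    return max(ks, key=quantity)

k_worst = worst_scenario(midpoint_temperature, k_lo=1.0, k_hi=2.0)
# the midpoint temperature decreases in k, so the worst case is the lower bound
```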

Statistics in Genetics and in the Environmental Sciences (Hardcover, Partly Reprinted)
Luisa T Fernholz, Stephan Morgenthaler, Werner Stahel
R2,983. Ships in 10-15 working days.

Statistics is strongly tied to applications in different scientific disciplines, and the most challenging statistical problems arise from problems in the sciences. In fact, the most innovative statistical research flows from the needs of applications in diverse settings. This volume is a testimony to the crucial role that statistics plays in scientific disciplines such as genetics and environmental sciences, among others. The articles in this volume range from human and agricultural genetic DNA research to carcinogens and chemical concentrations in the environment and to space debris and atmospheric chemistry. Also included are some articles on statistical methods which are sufficiently general and flexible to be applied to many practical situations. The papers were refereed by a panel of experts and the editors of the volume. The contributions are based on the talks presented at the Workshop on Statistics and the Sciences, held at the Centro Stefano Franscini in Ascona, Switzerland, during the week of May 23 to 28, 1999. The meeting was jointly organized by the Swiss Federal Institutes of Technology in Lausanne and Zurich, with the financial support of the Minerva Research Foundation. As the presentations at the workshop helped the participants recognize the potential role that statistics can play in the sciences, we hope that this volume will help the reader to focus on the central role of statistics in the specific areas presented here and to extrapolate the results to further applications.

The Measurement of Association - A Permutation Statistical Approach (Hardcover, 1st ed. 2018)
Kenneth J. Berry, Janis E. Johnston, Paul W. Mielke Jr.
R4,487. Ships in 10-15 working days.

This research monograph utilizes exact and Monte Carlo permutation statistical methods to generate probability values and measures of effect size for a variety of measures of association. Association is broadly defined to include measures of correlation for two interval-level variables, measures of association for two nominal-level variables or two ordinal-level variables, and measures of agreement for two nominal-level or two ordinal-level variables. Additionally, measures of association for mixtures of the three levels of measurement are considered: nominal-ordinal, nominal-interval, and ordinal-interval measures. Numerous comparisons of permutation and classical statistical methods are presented. Unlike classical statistical methods, permutation statistical methods do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This book takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field. This topic is relatively new in that it took modern computing power to make permutation methods available to those working in mainstream research. Written for a statistically informed audience, it is particularly useful for teachers of statistics, practicing statisticians, applied statisticians, and quantitative graduate students in fields such as psychology, medical research, epidemiology, public health, and biology. It can also serve as a textbook in graduate courses in subjects like statistics, psychology, and biology.
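
A minimal sketch of the Monte Carlo permutation approach for one measure of association, correlation between two interval-level variables, in Python (illustrative code with made-up data, not taken from the monograph):

```python
import random

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def permutation_pvalue(x, y, n_perm=2000, seed=1):
    """Monte Carlo permutation test: the null distribution comes from
    shuffling y, so no normality assumption is needed -- the reference
    set depends only on the data at hand."""
    rng = random.Random(seed)
    observed = abs(pearson_r(x, y))
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if abs(pearson_r(x, y)) >= observed:
            hits += 1
    # add-one correction keeps the estimate strictly positive
    return (hits + 1) / (n_perm + 1)

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 4.2, 4.8, 6.1, 7.0, 7.9, 9.2]   # strongly increasing with x
p = permutation_pvalue(x, y)
```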

Statistical Methods for the Analysis of Repeated Measurements (Hardcover, 1st Corrected ed. 2002. Corr. 2nd printing 2003)
Charles S. Davis
R2,581 R1,761 (save R820, 32%). Ships in 12-19 working days.

This book provides a comprehensive summary of a wide variety of statistical methods for the analysis of repeated measurements. It is designed to be both a useful reference for practitioners and a textbook for a graduate-level course focused on methods for the analysis of repeated measurements. This book will be of interest to:
- Statisticians in academia, industry, and research organizations
- Scientists who design and analyze studies in which repeated measurements are obtained from each experimental unit
- Graduate students in statistics and biostatistics
The prerequisites are knowledge of mathematical statistics at the level of Hogg and Craig (1995) and a course in linear regression and ANOVA at the level of Neter et al. (1985). The important features of this book include comprehensive coverage of classical and recent methods for continuous and categorical outcome variables; numerous homework problems at the end of each chapter; and the extensive use of real data sets in examples and homework problems. The 80 data sets used in the examples and homework problems can be downloaded from www.springer-ny.com at the list of author websites. Since many of the data sets can be used to demonstrate multiple methods of analysis, instructors can easily develop additional homework problems and exam questions based on the data sets provided. In addition, overhead transparencies produced using TeX and solutions to homework problems are available to course instructors. The overheads also include programming statements and computer output for the examples, prepared primarily using the SAS System.
Charles S. Davis is Senior Director of Biostatistics at Elan Pharmaceuticals, San Diego, California. He previously was professor in the Department of Biostatistics at the University of Iowa. He is author or co-author of more than 75 peer-reviewed papers in statistical and medical journals and one book (Categorical Data Analysis Using the SAS System, with Maura Stokes and Gary Koch).
His research and teaching interests include categorical data analysis, methods for the analysis of repeated measurements, and clinical trials. Dr. Davis has consulted with numerous companies and has taught short courses on categorical data analysis, methods for the analysis of repeated measurements, and clinical trials methodology for industrial, government, and academic organizations. He received an "Excellence in Continuing Education" award from the American Statistical Association in 2001 and has served as associate editor of the journals Controlled Clinical Trials and The American Statistician and as chair of the Biometrics Section of the ASA.

An Introduction to Copulas (Hardcover, 2nd ed. 2006. Corr. 2nd printing 2007)
Roger B. Nelsen
R4,931. Ships in 12-19 working days.

Copulas are functions that join multivariate distribution functions to their one-dimensional margins. The study of copulas and their role in statistics is a new but vigorously growing field. In this book the student or practitioner of statistics and probability will find discussions of the fundamental properties of copulas and some of their primary applications. The applications include the study of dependence and measures of association, and the construction of families of bivariate distributions. With nearly a hundred examples and over 150 exercises, this book is suitable as a text or for self-study. The only prerequisite is an upper level undergraduate course in probability and mathematical statistics, although some familiarity with nonparametric statistics would be useful. Knowledge of measure-theoretic probability is not required. Roger B. Nelsen is Professor of Mathematics at Lewis & Clark College in Portland, Oregon. He is also the author of "Proofs Without Words: Exercises in Visual Thinking," published by the Mathematical Association of America.
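
The "joining" in the definition above is Sklar's theorem: pick a copula, plug one-dimensional margins into it, and a joint distribution results. A small Python sketch using the Clayton family (parameter values chosen purely for illustration):

```python
import math

def clayton(u, v, theta=2.0):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta),
    defined here for u, v in (0, 1]."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def exp_cdf(x, rate=1.0):
    """CDF of the exponential distribution with the given rate."""
    return 1.0 - math.exp(-rate * x)

def joint_cdf(x, y, theta=2.0):
    """Sklar's theorem in action: a bivariate CDF obtained by joining two
    exponential margins with the Clayton copula."""
    return clayton(exp_cdf(x), exp_cdf(y), theta)

# uniform-margins property of any copula: C(u, 1) = u
# so joint_cdf(x, y) recovers the x-margin as y grows large
```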

Feynman-Kac Formulae - Genealogical and Interacting Particle Systems with Applications (Hardcover)
Pierre Del Moral
R5,212. Ships in 10-15 working days.

This text takes readers in a clear and progressive format from simple to recent and advanced topics in pure and applied probability, such as contraction and annealed properties of non-linear semigroups, functional entropy inequalities, empirical process convergence, increasing propagations of chaos, central limit and Berry-Esseen type theorems, as well as large deviation principles for strong topologies on path-distribution spaces. Topics also include a body of powerful branching and interacting particle methods.

Introduction to Nonparametric Statistics for the Biological Sciences Using R (Hardcover, 1st ed. 2016)
Thomas W. MacFarland, Jan M. Yates
R3,326. Ships in 12-19 working days.

This book contains a rich set of tools for nonparametric analyses, and the purpose of this text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences:
- To introduce when nonparametric approaches to data analysis are appropriate
- To introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test
- To introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set
The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses and tests, using R to broadly compare differences between data sets and statistical approaches.
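
The book's examples are worked in R; as a language-neutral sketch of one of the leading nonparametric tests it covers, here is the Mann-Whitney U statistic with a brute-force exact permutation p-value in Python (illustrative code with toy data; the full enumeration is feasible only for tiny samples):

```python
from itertools import combinations

def rank_sum_u(sample_a, sample_b):
    """Mann-Whitney U: the number of (a, b) pairs with a > b, counting
    ties as one half. Rank-based, so no normality assumption is needed."""
    u = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

def exact_pvalue(sample_a, sample_b):
    """Exact two-sided p-value by enumerating every relabelling of the
    pooled observations into groups of the original sizes."""
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    observed = rank_sum_u(sample_a, sample_b)
    count = total = 0
    for idx in combinations(range(len(pooled)), n_a):
        a = [pooled[i] for i in idx]
        b = [pooled[i] for i in range(len(pooled)) if i not in idx]
        mean_u = n_a * len(b) / 2.0
        total += 1
        # as or more extreme than observed, measured around the null mean
        if abs(rank_sum_u(a, b) - mean_u) >= abs(observed - mean_u) - 1e-12:
            count += 1
    return count / total
```

For two clearly separated groups of four, only the two most extreme relabellings qualify, giving p = 2/70.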

Modelling, Pricing, and Hedging Counterparty Credit Exposure - A Technical Guide (Hardcover)
Giovanni Cesari, John Aquilina, Niels Charpillon, Zlatko Filipovic, Gordon Lee, …
R4,246. Ships in 12-19 working days.

The credit crisis that started in 2007, with the collapse of well-established financial institutions and the bankruptcy of many public corporations, has clearly shown the importance for any company entering the derivative business of modelling, pricing, and hedging its counterparty credit exposure.

Building an accurate representation of firm-wide credit exposure, for both risk and trading activities, is a significant challenge from the technical as well as the practical point of view. This volume can be considered a roadmap to finding practical solutions to the problem of computing counterparty credit exposure for large books of both vanilla and exotic derivatives usually traded by large investment banks. It is divided into four parts: (I) Methodology, (II) Architecture and Implementation, (III) Products, and (IV) Hedging and Managing Counterparty Risk.

Starting from a generic modelling and simulation framework based on American Monte Carlo techniques, it presents a software architecture which, with its modular design, allows the computation of credit exposure in a portfolio-aggregated and scenario-consistent way. An essential part of the design is the definition of a programming language which allows trade representation based on dynamic modelling features. Several chapters are then devoted to the analysis of credit exposure of various products across all asset classes, namely foreign exchange, interest rate, credit derivatives, and equity. Finally it considers how to mitigate and hedge counterparty exposure. The crucial question of dynamic hedging is addressed by constructing a hybrid product, the Contingent-Credit Default Swap. This volume addresses these and other problems, as well as recent developments related to counterparty credit exposure, from a quantitative perspective. Its unique characteristic is the combination of a rigorous but simple mathematical approach with a practical view of the financial problem at hand.

Random Trees - An Interplay between Combinatorics and Probability (Hardcover, 2009 ed.)
Michael Drmota
R3,999. Ships in 12-19 working days.

Trees are a fundamental object in graph theory and combinatorics as well as a basic object for data structures and algorithms in computer science. During the last years, research related to (random) trees has been constantly increasing, and several asymptotic and probabilistic techniques have been developed in order to describe characteristics of interest of large trees in different settings. The purpose of this book is to provide a thorough introduction into various aspects of trees in random settings and a systematic treatment of the involved mathematical techniques. It should serve as a reference book as well as a basis for future research. One major conceptual aspect is to connect combinatorial and probabilistic methods that range from counting techniques (generating functions, bijections) over asymptotic methods (singularity analysis, saddle point techniques) to various sophisticated techniques in asymptotic probability (convergence of stochastic processes, martingales). However, reading the book requires just basic knowledge in combinatorics, complex analysis, functional analysis and probability theory at master degree level. It is also part of the concept of the book to provide full proofs of the major results even if they are technically involved and lengthy.
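
One of the counting bijections alluded to above, the Prufer correspondence, fits in a few lines of Python and confirms Cayley's formula n^(n-2) by brute force (an illustrative sketch, not code from the book):

```python
from itertools import product

def prufer_decode(seq):
    """Decode a Prufer sequence over {0,...,n-1} (length n-2) into the
    edge set of the unique labeled tree it encodes."""
    n = len(seq) + 2
    degree = [1] * n
    for v in seq:
        degree[v] += 1
    edges = []
    for v in seq:
        # attach v to the smallest-labeled remaining leaf
        leaf = min(i for i in range(n) if degree[i] == 1)
        edges.append(tuple(sorted((leaf, v))))
        degree[leaf] -= 1
        degree[v] -= 1
    u, w = [i for i in range(n) if degree[i] == 1]
    edges.append(tuple(sorted((u, w))))
    return frozenset(edges)

# the bijection sends each of the 4^2 = 16 sequences to a distinct
# labeled tree on 4 vertices, which is Cayley's formula for n = 4
trees = {prufer_decode(s) for s in product(range(4), repeat=2)}
```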

From Classical to Modern Probability - CIMPA Summer School 2001 (Hardcover, 2003 ed.)
Pierre Picco, Jaime San Martin
R1,656. Ships in 10-15 working days.

This volume is based on lecture notes for the courses delivered at the CIMPA Summer School: From Classical to Modern Probability, held at Temuco, Chile, between January 8th and 26th, 2001. This meeting brought together probabilists and graduate students interested in fields like particle systems, percolation, Brownian motion, random structures, potential theory and stochastic processes. We would like to express our gratitude to all the participants of the school as well as the people who contributed to its organization. In particular, to Servet Martinez and Pablo Ferrari for their scientific advice, and Cesar Burgueño for all his support and friendship. We want to thank all the professors for their stimulating courses and lectures. Special thanks to those who took on the extra work of preparing each chapter of this book. We are also indebted to our sponsors and supporting institutions, whose interest and help was essential to organize this meeting: CIMPA, CNRS, CONICYT, ECOS, FONDAP Program in Applied Mathematics, French Cooperation, Fundacion Andes, Presidential Fellowship, Universidad de Chile and Universidad de La Frontera. We are grateful to Miss Gladys Cavallone for her excellent work during the preparation of the meeting as well as for the considerable task of unifying the typography of the different chapters of this book.

Exploring Research Frontiers in Contemporary Statistics and Econometrics - A Festschrift for Leopold Simar (Hardcover, 2012)
Ingrid van Keilegom, Paul W. Wilson
R1,549. Ships in 10-15 working days.

This book collects contributions written by well-known statisticians and econometricians to acknowledge Leopold Simar's far-reaching scientific impact on statistics and econometrics throughout his career. The papers contained herein were presented at a conference in Louvain-la-Neuve in May 2009 in honor of his retirement. The contributions cover a broad variety of issues surrounding frontier estimation, to which Leopold Simar has contributed much over the past two decades, as well as related issues such as semiparametric regression and models for censored data.

Design of Experiments and Their Implementations (Hardcover)
Justin Riggs
R1,608. Ships in 12-19 working days.

Proceedings of the International Conference on Stochastic Analysis and Applications - Hammamet, 2001 (Hardcover, 2004 ed.)
Sergio Albeverio, Anne Boutet De Monvel, Habib Ouerdiane
R3,571. Ships in 10-15 working days.

Stochastic analysis is a field of mathematical research having numerous interactions with other domains of mathematics such as partial differential equations, Riemannian path spaces, dynamical systems, and optimization. It also has many links with applications in engineering, finance, quantum physics, and other fields. This book covers recent and diverse aspects of stochastic and infinite-dimensional analysis. The included papers are written from a variety of standpoints (white noise analysis, Malliavin calculus, quantum stochastic calculus) by the contributors, and provide a broad coverage of the subject.

This volume will be useful to graduate students and research mathematicians wishing to get acquainted with recent developments in the field of stochastic analysis.

Experiments - Planning, Analysis, and Optimization, Third Edition (Hardcover, 3rd Edition)
C F J Wu
R3,220. Ships in 12-19 working days.

Praise for the First Edition: "If you ... want an up-to-date, definitive reference written by authors who have contributed much to this field, then this book is an essential addition to your library." --Journal of the American Statistical Association
A COMPREHENSIVE REVIEW OF MODERN EXPERIMENTAL DESIGN
Experiments: Planning, Analysis, and Optimization, Third Edition provides a complete discussion of modern experimental design for product and process improvement: the design and analysis of experiments and their applications for system optimization, robustness, and treatment comparison. While maintaining the same easy-to-follow style as the previous editions, this book continues to present an integrated system of experimental design and analysis that can be applied across various fields of research, including engineering, medicine, and the physical sciences. New chapters provide modern updates on practical optimal design and on computer experiments, explaining computer simulations as an alternative to physical experiments. Each chapter begins with a real-world example of an experiment, followed by the methods required to design that type of experiment. The chapters conclude with an application of the methods to the experiment, bridging the gap between theory and practice. The authors modernize accepted methodologies while refining many cutting-edge topics including robust parameter design, analysis of non-normal data, analysis of experiments with complex aliasing, multilevel designs, minimum aberration designs, and orthogonal arrays. The third edition includes:
- Information on the design and analysis of computer experiments
- A discussion of practical optimal design of experiments
- An introduction to conditional main effect (CME) analysis and definitive screening designs (DSDs)
- New exercise problems
This book includes valuable exercises and problems, allowing readers to gauge their progress and retention of the book's subject matter as they complete each chapter.
Drawing on examples from their combined years of working with industrial clients, the authors present many cutting-edge topics in a single, easily accessible source. Extensive case studies, including goals, data, and experimental designs, are also included, and the book's data sets can be found on a related FTP site, along with additional supplemental material. Chapter summaries provide a succinct outline of discussed methods, and extensive appendices direct readers to resources for further study. Experiments: Planning, Analysis, and Optimization, Third Edition is an excellent book for design of experiments courses at the upper-undergraduate and graduate levels. It is also a valuable resource for practicing engineers and statisticians.
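
As a minimal taste of the factorial machinery such a book develops (a sketch with hypothetical noiseless data, not an example from the text): in a two-level full factorial design, each main effect is just the contrast between the average response at the +1 runs and at the -1 runs of that factor.

```python
from itertools import product

def full_factorial(k):
    """All 2^k runs of a two-level full factorial design, coded -1/+1."""
    return list(product((-1, 1), repeat=k))

def main_effects(design, y):
    """Main effect of each factor: mean response at its +1 level minus
    mean response at its -1 level."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [yi for run, yi in zip(design, y) if run[j] == 1]
        lo = [yi for run, yi in zip(design, y) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

design = full_factorial(3)
# hypothetical noiseless responses: y = 10 + 3*A - 2*B + 0*C
y = [10 + 3 * a - 2 * b + 0 * c for a, b, c in design]
effects = main_effects(design, y)   # recovers twice each coefficient
```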

Current Trends in Dynamical Systems in Biology and Natural Sciences (Hardcover, 1st ed. 2020)
Maira Aguiar, Carlos Braumann, Bob W. Kooi, Andrea Pugliese, Nico Stollenwerk, …
R3,384. Ships in 10-15 working days.

This book disseminates the latest results and envisages new challenges in the application of mathematics to various practical situations in biology, epidemiology, and ecology. It comprises a collection of the main results presented at the Ninth Edition of the International Workshop "Dynamical Systems Applied to Biology and Natural Sciences - DSABNS", held from 7 to 9 February 2018 at the Department of Mathematics, University of Turin, Italy. While the principal focus is ecology and epidemiology, the coverage extends even to waste recycling and a genetic application. The topics covered in the 12 peer-reviewed contributions involve such diverse mathematical tools as ordinary and partial differential equations, delay equations, stochastic equations, control, and sensitivity analysis.
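
Many epidemiological contributions of this kind rest on compartment ODE models; a self-contained sketch (not from the volume, with hypothetical parameter values) of the classic SIR system under explicit Euler stepping:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One explicit-Euler step of the SIR epidemic ODEs:
    ds/dt = -beta*s*i,  di/dt = beta*s*i - gamma*i,  dr/dt = gamma*i."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    return s + dt * ds, i + dt * di, r + dt * gamma * i

# hypothetical outbreak: 1% initially infected, R0 = beta/gamma = 5
s, i, r = 0.99, 0.01, 0.0
for _ in range(1000):
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1, dt=0.1)
# the right-hand sides sum to zero, so s + i + r stays exactly 1
```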

Natural Time Analysis: The New View of Time - Precursory Seismic Electric Signals, Earthquakes and other Complex Time Series (Hardcover, 2011 ed.)
Panayiotis Varotsos, Nicholas V. Sarlis, Efthimios S. Skordas
R5,187. Ships in 10-15 working days.

This book deals with the theory and the applications of a new time domain, termed the natural time domain, that was put forward by the authors almost a decade ago (P.A. Varotsos, N.V. Sarlis and E.S. Skordas, Practica of Athens Academy 76, 294-321, 2001; Physical Review E 66, 011902, 2002). In particular, it has been found that novel dynamical features hidden behind time series in complex systems can emerge upon analyzing them in this new time domain, which conforms to the desire to reduce uncertainty and extract signal information as much as possible. The analysis in natural time enables the study of the dynamical evolution of a complex system and identifies when the system enters a critical stage. Hence, natural time plays a key role in predicting impending catastrophic events in general. Relevant examples of data analysis in this new time domain have been published during the last decade in a large variety of fields, e.g., Earth Sciences, Biology and Physics. The book explains in detail a series of such examples, including the identification of sudden cardiac death risk in Cardiology, the recognition of electric signals that precede earthquakes, the determination of the time of an impending major mainshock in Seismology, and the analysis of the avalanches of the penetration of magnetic flux into thin films of type II superconductors in Condensed Matter Physics. In general, this book is concerned with the time-series analysis of signals emitted from complex systems by means of the new time domain and provides advanced students and research workers in diverse fields with a sound grounding in the fundamentals of current research work on detecting (long-range) correlations in complex time series. Furthermore, the modern techniques of Statistical Physics in time series analysis, for example Hurst analysis, the detrended fluctuation analysis, the wavelet transform etc., are presented along with their advantages when the natural time domain is employed.
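
The central construction can be stated compactly: read the k-th of N events at natural time chi_k = k/N, weight it by its normalized energy, and compute the variance kappa1 = &lt;chi^2&gt; - &lt;chi&gt;^2. A Python sketch of that computation (illustrative only; the equal-energy input in the usage line is hypothetical):

```python
def kappa1(energies):
    """Variance of natural time chi_k = k/N weighted by the normalized
    event energies p_k = Q_k / sum(Q): kappa1 = <chi^2> - <chi>^2."""
    n = len(energies)
    total = float(sum(energies))
    p = [q / total for q in energies]
    chi = [(k + 1) / n for k in range(n)]
    m1 = sum(pk * ck for pk, ck in zip(p, chi))
    m2 = sum(pk * ck ** 2 for pk, ck in zip(p, chi))
    return m2 - m1 * m1

# with equal event energies, kappa1 approaches the variance of a
# uniform variable on (0, 1], i.e. 1/12, as the event count grows
k1 = kappa1([1.0] * 1000)
```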

Analysis of Multivariate Survival Data (Hardcover, 1st ed. 2000. Corr. 2nd printing 2001): Philip Hougaard Analysis of Multivariate Survival Data (Hardcover, 1st ed. 2000. Corr. 2nd printing 2001)
Philip Hougaard
R4,676 Discovery Miles 46 760 Ships in 10 - 15 working days

Survival data, or more generally time-to-event data, occur in many areas, including medicine, biology, engineering, economics, and demography, but standard methods have required that all time variables be univariate and independent. This book extends the field by allowing for multivariate times. Applications where such data appear include survival of twins, survival of married couples and families, time to failure of the right and left kidney in diabetic patients, life history data with time to outbreak of disease, complications and death, recurrent episodes of disease, and cross-over studies with time responses. As the field is rather new, the concepts and the possible types of data are described in detail, and basic aspects of how dependence can appear in such data are discussed. Four different approaches to the analysis of such data are presented. Multi-state models, where a life history is described as the subject moving from state to state, are the most classical approach. Markov models make up an important special case, but it is also described how easily more general models are set up and analyzed. Frailty models, which are random effects models for survival data, form a second approach, extending from the simplest shared frailty models, which are considered in detail, to models with more complicated dependence structures over individuals or over time. Marginal modelling has become a popular approach to evaluating the effect of explanatory factors in the presence of dependence, but without specifying a statistical model for the dependence. Finally, the completely non-parametric approach to bivariate censored survival data is described. This book is aimed at investigators who need to analyze multivariate survival data, but owing to its focus on concepts and modelling aspects, it is also useful for persons interested in such data who do not have a statistical education.
It can be used as a textbook for a graduate course in multivariate survival data. It is written from an applied point of view and covers all essential aspects of applying multivariate survival models. More theoretical material, such as asymptotic theory, is also described, but only to the extent useful in applications and for understanding the models. For reading the book, it is useful, but not necessary, to have an understanding of univariate survival data. Philip Hougaard is a statistician at the pharmaceutical company Novo Nordisk. He has a Ph.D. in nonlinear regression models and is a Doctor of Science based on a thesis on frailty models. He is an associate editor of Biometrics and Lifetime Data Analysis. He has published over 80 papers in the statistical and medical literature.

Public Policy and Statistics - Case Studies from RAND (Hardcover, 2000 ed.): Sally C. Morton Public Policy and Statistics - Case Studies from RAND (Hardcover, 2000 ed.)
Sally C. Morton; Foreword by E. Bradley; Edited by John E. Rolph
R1,717 Discovery Miles 17 170 Ships in 12 - 19 working days

A critical yet constructive description of the rich analytical techniques and substantive applications that typify how statistical thinking has been applied at the RAND Corporation over the past two decades. Case studies of public policy problems are useful for teaching because they are familiar: almost everyone knows something about health insurance, global warming, and capital punishment, to name but a few of the applications covered in this casebook. Each case study has a common format that describes the policy questions, the statistical questions, and the successful and unsuccessful analytic strategies. Readers should be familiar with basic statistical concepts, including sampling and regression. While designed for statistics courses in areas ranging from economics to health policy to the law, at both the advanced undergraduate and graduate levels, empirical researchers and policy-makers will also find this casebook informative.

Selected Works of Terry Speed (Hardcover, 2012): Sandrine Dudoit Selected Works of Terry Speed (Hardcover, 2012)
Sandrine Dudoit
R4,552 Discovery Miles 45 520 Ships in 10 - 15 working days

The purpose of this volume is to provide an overview of Terry Speed's contributions to statistics and beyond. Each of the fifteen chapters concerns a particular area of research and consists of a commentary by a subject-matter expert and a selection of representative papers. The chapters, organized more or less chronologically in terms of Terry's career, encompass a wide variety of mathematical and statistical domains, along with their application to biology and medicine. Accordingly, earlier chapters tend to be more theoretical, covering some algebra and probability theory, while later chapters concern more recent work in genetics and genomics. The chapters also span continents and generations, as they present research done over four decades while crisscrossing the globe. The commentaries provide insight into Terry's contributions to a particular area of research by summarizing his work and describing its historical and scientific context, motivation, and impact. In addition to shedding light on Terry's scientific achievements, the commentaries reveal endearing aspects of his personality, such as his intellectual curiosity, energy, humor, and generosity.

Geostatistics and Petroleum Geology (Hardcover, 2nd ed. 1999): M.E. Hohn Geostatistics and Petroleum Geology (Hardcover, 2nd ed. 1999)
M.E. Hohn
R3,020 Discovery Miles 30 200 Ships in 10 - 15 working days

This book introduces the concepts and methods of spatial statistics to geologists and engineers working with oil and gas data, and covers all of the most commonly encountered geostatistical methods for estimation and simulation. Topics include calculation and modeling of semivariograms, linear methods of kriging, cokriging, nonlinear methods such as indicator kriging and disjunctive kriging, and conditional simulation, including sequential indicator simulation, sequential Gaussian simulation, and simulated annealing. Semivariogram models range from very simple to complex. All of the fundamental semivariogram models are illustrated, along with anisotropic models, hole effects, geometric and zonal models, and the mechanics of fitting models. For each geostatistical method treated in detail, the author introduces necessary theory and background, describes how the method works, the steps a user must go through, and problems a user might encounter. The emphasis throughout is on what the practitioner needs to know, and the results that can be expected. The book is replete with examples in two and three dimensions, using real-world data such as porosity and permeability, gas production, structural elevation of a reservoir, and seismic information. Geostatistics and Petroleum Geology will be an invaluable advanced-level text for students on petroleum engineering and geosciences courses, as well as an important reference for petroleum geologists and petroleum engineers in oil companies worldwide.

Biopharmaceutical Applied Statistics Symposium - Volume 2 Biostatistical Analysis of Clinical Trials (Hardcover, 1st ed. 2018):... Biopharmaceutical Applied Statistics Symposium - Volume 2 Biostatistical Analysis of Clinical Trials (Hardcover, 1st ed. 2018)
Karl E. Peace, Ding-Geng Chen, Sandeep Menon
R3,140 Discovery Miles 31 400 Ships in 10 - 15 working days

This BASS book series publishes selected high-quality papers reflecting recent advances in the design and biostatistical analysis of biopharmaceutical experiments, particularly biopharmaceutical clinical trials. The papers were selected from invited presentations at the Biopharmaceutical Applied Statistics Symposium (BASS), which was founded by the first Editor in 1994 and has since become the premier international conference in biopharmaceutical statistics. The primary aims of BASS are: 1) to raise funding to support graduate students in biostatistics programs, and 2) to provide an opportunity for professionals engaged in pharmaceutical drug research and development to share insights into solving the problems they encounter. The BASS book series is initially divided into three volumes addressing: 1) Design of Clinical Trials; 2) Biostatistical Analysis of Clinical Trials; and 3) Pharmaceutical Applications. This book is the second of the 3-volume series. The topics covered include: Statistical Approaches to the Meta-analysis of Randomized Clinical Trials, Collaborative Targeted Maximum Likelihood Estimation to Assess Causal Effects in Observational Studies, Generalized Tests in Clinical Trials, Discrete Time-to-event and Score-based Methods with Application to Composite Endpoint for Assessing Evidence of Disease Activity-Free, Imputing Missing Data Using a Surrogate Biomarker: Analyzing the Incidence of Endometrial Hyperplasia, Selected Statistical Issues in Patient-reported Outcomes, Network Meta-analysis, Detecting Safety Signals Among Adverse Events in Clinical Trials, Applied Meta-analysis Using R, Treatment of Missing Data in Comparative Effectiveness Research, Causal Estimands: A Common Language for Missing Data, Bayesian Subgroup Analysis with Examples, Statistical Methods in Diagnostic Devices, A Question-Based Approach to the Analysis of Safety Data, Analysis of Two-stage Adaptive Seamless Trial Design, and Multiplicity Problems in Clinical Trials - A Regulatory Perspective.

Stable and Efficient Cubature-based Filtering in Dynamical Systems (Hardcover, 1st ed. 2017): Dominik Ballreich Stable and Efficient Cubature-based Filtering in Dynamical Systems (Hardcover, 1st ed. 2017)
Dominik Ballreich
R2,464 R1,932 Discovery Miles 19 320 Save R532 (22%) Ships in 12 - 19 working days

The book addresses the problem of calculating d-dimensional integrals (conditional expectations) in filtering problems. It develops new methods of deterministic numerical integration that can be used to speed up and stabilize filter algorithms. With the help of these methods, better estimates and predictions of latent variables are made possible in the fields of economics, engineering and physics. The resulting procedures are tested in four detailed simulation studies.

Design of Experiments for Pharmaceutical Product Development - Volume I : Basics and Fundamental Principles (Hardcover, 1st ed.... Design of Experiments for Pharmaceutical Product Development - Volume I : Basics and Fundamental Principles (Hardcover, 1st ed. 2021)
Sarwar Beg
R5,086 Discovery Miles 50 860 Ships in 10 - 15 working days

This volume provides complete and updated information on the applications of Design of Experiments (DoE) and related multivariate techniques at various stages of pharmaceutical product development. It discusses the applications of experimental designs to oral, topical, transdermal and injectable preparations, and beyond these to nanopharmaceutical product development, with dedicated case studies on various pharmaceutical experiments presented through illustrations, artwork, tables and figures. This book is a valuable guide for all academic and industrial researchers, pharmaceutical and biomedical scientists, undergraduate and postgraduate research scholars, pharmacists, biostatisticians, biotechnologists, formulation and process engineers, and regulatory affairs and quality assurance personnel.
