Books > Science & Mathematics > Mathematics > Probability & statistics

Gaussian Random Functions (Hardcover, 1995 ed.)
M.A. Lifshits
R7,284 Discovery Miles 72 840 Ships in 10 - 15 working days

It is well known that the normal distribution is the most pleasant, one can even say, an exemplary object in probability theory. It combines almost all conceivable nice properties that a distribution may ever have: symmetry, stability, indecomposability, a regular tail behavior, etc. Gaussian measures (the distributions of Gaussian random functions), as infinite-dimensional analogues of the classical normal distribution, serve as such exemplary objects in the theory of Gaussian random functions. When one switches to the infinite dimension, some "one-dimensional" properties are extended almost literally, while others require deeper justification or even must be reconsidered. What is more, the infinite-dimensional situation reveals important links and structures which either looked trivial or did not play an independent role in the classical case. The complex of concepts and problems emerging here has become the subject of the theory of Gaussian random functions and their distributions, one of the most advanced fields of probability science. Although the basic elements of this field were formed in the sixties and seventies, until recently a substantial part of the corresponding material existed only in the form of scattered articles in various journals, or served merely as background for treating special issues in monographs.

The Construction of Optimal Stated Choice Experiments - Theory and Methods (Hardcover)
D J Street
R3,204 Discovery Miles 32 040 Ships in 18 - 22 working days

The most comprehensive and applied discussion of stated choice experiment constructions available

The Construction of Optimal Stated Choice Experiments provides an accessible introduction to the construction methods needed to create the best possible designs for use in modeling decision-making. Many aspects of the design of a generic stated choice experiment are independent of its area of application, and until now there has been no single book describing these constructions. This book begins with a brief description of the various areas where stated choice experiments are applicable, including marketing and health economics, transportation, environmental resource economics, and public welfare analysis. The authors focus on recent research results on the construction of optimal and near-optimal choice experiments and conclude with guidelines and insight on how to properly implement these results. Features of the book include:

Construction of generic stated choice experiments for the estimation of main effects only, as well as experiments for the estimation of main effects plus two-factor interactions

Constructions for choice sets of any size and for attributes with any number of levels

A discussion of designs that contain a none option or a common base option

Practical techniques for the implementation of the constructions

Class-tested material that presents theoretical discussion of optimal design

Complete and extensive references to the mathematical and statistical literature for the constructions

Exercise sets in most chapters, which reinforce the understanding of the presented material

The Construction of Optimal Stated Choice Experiments serves as an invaluable reference guide for applied statisticians and practitioners in the areas of marketing, health economics, transport, and environmental evaluation. It is also ideal as a supplemental text for courses in the design of experiments, decision support systems, and choice models. A companion web site is available for readers to access web-based software that can be used to implement the constructions described in the book.

Nonparametric Smoothing and Lack-of-Fit Tests (Hardcover, 1997 ed.)
Jeffrey Hart
R2,815 Discovery Miles 28 150 Ships in 18 - 22 working days

An exploration of the use of smoothing methods in testing the fit of parametric regression models. The book reviews many of the existing methods for testing lack-of-fit and also proposes a number of new methods, addressing both applied and theoretical aspects of the model checking problems. As such, the book is of interest to practitioners of statistics and researchers investigating either lack-of-fit tests or nonparametric smoothing ideas. The first four chapters introduce the problem of estimating regression functions by nonparametric smoothers, primarily those of kernel and Fourier series type, and could be used as the foundation for a graduate level course on nonparametric function estimation. The prerequisites for a full appreciation of the book are a modest knowledge of calculus and some familiarity with the basics of mathematical statistics.
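
As a rough illustration of the kind of smoother treated in the opening chapters (this sketch is my own and is not taken from the book; the data and bandwidth are made up), a Nadaraya-Watson kernel regression estimate can be computed in a few lines; comparing such a smooth with a candidate parametric fit is the basic idea behind lack-of-fit testing.

```python
# Minimal sketch (not from the book): Nadaraya-Watson kernel regression with a
# Gaussian kernel, the simplest nonparametric smoother of the type discussed.
import numpy as np

def kernel_smooth(x_train, y_train, x_eval, bandwidth):
    """Return the Nadaraya-Watson estimate m(x) = sum_i w_i(x) y_i."""
    # Pairwise scaled distances between evaluation and training points.
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)                # Gaussian kernel weights
    weights /= weights.sum(axis=1, keepdims=True)  # normalise rows to sum to 1
    return weights @ y_train

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=100)  # toy regression data

grid = np.linspace(0, 1, 200)
fit = kernel_smooth(x, y, grid, bandwidth=0.05)
print(fit[:5])
```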

Estimation in Surveys with Nonresponse (Hardcover)
C.E. Sarndal
R2,566 Discovery Miles 25 660 Ships in 10 - 15 working days

Around the world a multitude of surveys are conducted every day, on a variety of subjects, and consequently surveys have become an accepted part of modern life. However, in recent years survey estimates have been increasingly affected by rising trends in nonresponse, with loss of accuracy as an undesirable result. Whilst it is possible to reduce nonresponse to some degree, it cannot be completely eliminated. Estimation techniques that account systematically for nonresponse and at the same time succeed in delivering acceptable accuracy are much needed.

"Estimation in Surveys with Nonresponse" provides an overview of these techniques, presenting the view of nonresponse as a normal (albeit undesirable) feature of a sample survey, one whose potentially harmful effects are to be minimised. Builds in the nonresponse feature of survey data collection as an integral part of the theory, both for point estimation and for variance estimation. Promotes weighting through calibration as a new and powerful technique for surveys with nonresponse. Highlights the analysis of nonresponse bias in estimates and methods to minimize this bias. Includes computational tools to help identify the best variables for calibration. Discusses the use of imputation as a complement to weighting by calibration. Contains guidelines for dealing with frame imperfections and coverage errors. Features worked examples throughout the text, using real data.

The accessible style of "Estimation in Surveys with Nonresponse" will make this an invaluable tool for survey methodologists in national statistics agencies and private survey agencies. Researchers, teachers, and students of statistics, social sciences and economics will benefit from the clear presentation and numerous examples.

Local Regression and Likelihood (Hardcover, 1999 ed.)
Clive Loader
R4,372 Discovery Miles 43 720 Ships in 10 - 15 working days

Separation of signal from noise is the most fundamental problem in data analysis, and arises in many fields, for example, signal processing, econometrics, actuarial science, and geostatistics. This book introduces the local regression method in univariate and multivariate settings, and extensions to local likelihood and density estimation. Basic theoretical results and diagnostic tools such as cross validation are introduced along the way. Examples illustrate the implementation of the methods using the LOCFIT software.
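
To make the idea concrete, the following sketch shows a univariate local linear fit with a Gaussian kernel and a leave-one-out cross-validation bandwidth choice. It is an illustration under my own assumptions, not code from the book or from LOCFIT, and the data are simulated.

```python
# Minimal sketch: local linear regression plus leave-one-out cross-validation
# to pick the bandwidth. All names and data are illustrative.
import numpy as np

def local_linear(x_train, y_train, x0, h):
    """Fit y ~ b0 + b1*(x - x0) by kernel-weighted least squares; return b0."""
    w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x_train), x_train - x0])
    A = X.T @ (w[:, None] * X)
    b = X.T @ (w * y_train)
    return np.linalg.solve(A, b)[0]        # intercept = estimate at x0

def loo_cv(x, y, h):
    """Leave-one-out cross-validation score for bandwidth h."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        pred = local_linear(x[mask], y[mask], x[i], h)
        errs.append((y[i] - pred) ** 2)
    return np.mean(errs)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 80))
y = np.cos(3 * x) + rng.normal(scale=0.2, size=80)

scores = {h: loo_cv(x, y, h) for h in (0.05, 0.1, 0.2)}
best_h = min(scores, key=scores.get)
print("CV-selected bandwidth:", best_h)
```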

Theory and Applications of Long-Range Dependence (Hardcover, 2003 ed.)
Paul Doukhan, George Oppenheim, Murad S Taqqu
R3,212 Discovery Miles 32 120 Ships in 18 - 22 working days

The area of data analysis has been greatly affected by our computer age. For example, the issue of collecting and storing huge data sets has become quite simplified and has greatly affected such areas as finance and telecommunications. Even non-specialists try to analyze data sets and ask basic questions about their structure. One such question is whether one observes some type of invariance with respect to scale, a question that is closely related to the existence of long-range dependence in the data. This important topic of long-range dependence is the focus of this unique work, written by a number of specialists on the subject.

The topics selected give a good overview from the probabilistic and statistical perspectives. Covered in the first part of the book are fractional Brownian motion, models, inequalities and limit theorems, periodic long-range dependence, parametric, semiparametric, and non-parametric estimation, long-memory stochastic volatility models, robust estimation, and prediction for long-range dependent sequences. For graduate students and researchers who want to use the methodology and need to know the "tricks of the trade," there is a special section called "Mathematical Techniques." The reader is referred to the literature for more detailed proofs where these already exist.

The last part of the book is devoted to applications in the areas of simulation, estimation and wavelet techniques, traffic in computer networks, econometrics and finance, multifractal models, and hydrology. Diagrams and illustrations enhance the presentation. Each article begins with introductory background material and is accessible to mathematicians, a variety of practitioners, and graduate students. The work serves as a state-of-the-art reference or graduate seminar text.

Lifetime Data: Models in Reliability and Survival Analysis (Hardcover, 1996 ed.)
Nicholas P. Jewell, Alan C. Kimber, Mei-Ling Ting Lee, G. Alex Whitmore
R4,239 Discovery Miles 42 390 Ships in 18 - 22 working days

Statistical models and methods for lifetime and other time-to-event data are widely used in many fields, including medicine, the environmental sciences, actuarial science, engineering, economics, management, and the social sciences. For example, closely related statistical methods have been applied to the study of the incubation period of diseases such as AIDS, the remission time of cancers, life tables, the time-to-failure of engineering systems, employment duration, and the length of marriages. This volume contains a selection of papers based on the 1994 International Research Conference on Lifetime Data Models in Reliability and Survival Analysis, held at Harvard University. The conference brought together a varied group of researchers and practitioners to advance and promote statistical science in the many fields that deal with lifetime and other time-to-event data. The volume illustrates the depth and diversity of the field. A few of the authors have published their conference presentations in the new journal Lifetime Data Analysis (Kluwer Academic Publishers).

Theory of Martingales (Hardcover, 1989 ed.)
Robert S. Liptser, A.N. Shiryayev
R4,258 Discovery Miles 42 580 Ships in 10 - 15 working days

'Et moi, ..., si j'avait su comment en revenir, je n'y serais point alle.' ('And I, ..., had I known how to come back, I would never have gone.') Jules Verne

'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell

'The series is divergent; therefore we may be able to do something with it.' O. Heaviside

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote on the right above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'etre of this series. This series, Mathematics and Its Applications, started in 1977. Now that over one hundred volumes have appeared it seems opportune to reexamine its scope. At the time I wrote, "Growing specialization and diversification have brought a host of monographs and textbooks on increasingly specialized topics. However, the 'tree' of knowledge of mathematics and related fields does not grow only by putting forth new branches."

Longitudinal Categorical Data Analysis (Hardcover, 2014 ed.)
Brajendra C. Sutradhar
R4,142 Discovery Miles 41 420 Ships in 10 - 15 working days

This is the first book on longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as well as the exact likelihood approaches. The book is technically rigorous, and it also presents illustrations of the statistical analysis of various real life data involving univariate multinomial responses both in cross-sectional and longitudinal setups. This book is written mainly for graduate students and researchers in statistics and social sciences, among other applied statistics research areas. However, Chapters 1 to 3 may also be used for a senior undergraduate course in statistics.

Fractional Programming - Theory, Methods and Applications (Hardcover, 1997 ed.)
I.M.Stancu Minasian
R2,720 Discovery Miles 27 200 Ships in 18 - 22 working days

Mathematical programming has known a spectacular diversification in the last few decades. This process has happened both at the level of mathematical research and at the level of the applications generated by the solution methods that were created. To write a monograph dedicated to a certain domain of mathematical programming is, under such circumstances, especially difficult. In the present monograph we opt for the domain of fractional programming. Interest in this subject was generated by the fact that various optimization problems from engineering and economics consider the minimization of a ratio between physical and/or economic functions, for example cost/time, cost/volume, cost/profit, or other quantities that measure the efficiency of a system. For example, the productivity of industrial systems, defined as the ratio between the realized services in a system within a given period of time and the utilized resources, is used as one of the best indicators of the quality of their operation. Such problems, where the objective function appears as a ratio of functions, constitute fractional programming problems. Due to its importance in modeling various decision processes in management science, operational research, and economics, and also due to its frequent appearance in other problems that are not necessarily economical, such as information theory, numerical analysis, stochastic programming, decomposition algorithms for large linear systems, etc., fractional programming has received particular attention in the last three decades.
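
As a hedged illustration of what such a ratio objective looks like in practice (a sketch of my own, not material from the monograph; the functions f and g and the interval are hypothetical), Dinkelbach's parametric algorithm reduces min f(x)/g(x) to a sequence of ordinary minimizations.

```python
# Minimal sketch: Dinkelbach's parametric algorithm for a fractional program
# min f(x)/g(x), with g(x) > 0, on an interval. The toy functions below stand
# in for a cost/output ratio.
from scipy.optimize import minimize_scalar

def f(x):          # "cost", e.g. operating cost of a system at setting x
    return (x - 1.0) ** 2 + 2.0

def g(x):          # "output", assumed strictly positive on the feasible set
    return x + 1.0

lam = 0.0                              # current guess of the optimal ratio
for _ in range(50):
    # Solve the parametric subproblem min_x f(x) - lam * g(x) on [0, 5].
    res = minimize_scalar(lambda x: f(x) - lam * g(x), bounds=(0.0, 5.0), method="bounded")
    x_star = res.x
    new_lam = f(x_star) / g(x_star)    # update the ratio at the subproblem minimizer
    if abs(new_lam - lam) < 1e-10:     # converged: min of f - lam*g is ~0
        break
    lam = new_lam

print(f"approx. minimizer x = {x_star:.4f}, minimal ratio f/g = {lam:.4f}")
```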

Biopharmaceutical Applied Statistics Symposium - Volume 2 Biostatistical Analysis of Clinical Trials (Hardcover, 1st ed. 2018)
Karl E. Peace, Ding-Geng Chen, Sandeep Menon
R2,898 Discovery Miles 28 980 Ships in 18 - 22 working days

This BASS book Series publishes selected high-quality papers reflecting recent advances in the design and biostatistical analysis of biopharmaceutical experiments - particularly biopharmaceutical clinical trials. The papers were selected from invited presentations at the Biopharmaceutical Applied Statistics Symposium (BASS), which was founded by the first Editor in 1994 and has since become the premier international conference in biopharmaceutical statistics. The primary aims of the BASS are: 1) to raise funding to support graduate students in biostatistics programs, and 2) to provide an opportunity for professionals engaged in pharmaceutical drug research and development to share insights into solving the problems they encounter. The BASS book series is initially divided into three volumes addressing: 1) Design of Clinical Trials; 2) Biostatistical Analysis of Clinical Trials; and 3) Pharmaceutical Applications. This book is the second of the 3-volume book series. The topics covered include: Statistical Approaches to the Meta-analysis of Randomized Clinical Trials, Collaborative Targeted Maximum Likelihood Estimation to Assess Causal Effects in Observational Studies, Generalized Tests in Clinical Trials, Discrete Time-to-event and Score-based Methods with Application to Composite Endpoint for Assessing Evidence of Disease Activity-Free , Imputing Missing Data Using a Surrogate Biomarker: Analyzing the Incidence of Endometrial Hyperplasia, Selected Statistical Issues in Patient-reported Outcomes, Network Meta-analysis, Detecting Safety Signals Among Adverse Events in Clinical Trials, Applied Meta-analysis Using R, Treatment of Missing Data in Comparative Effectiveness Research, Causal Estimands: A Common Language for Missing Data, Bayesian Subgroup Analysis with Examples, Statistical Methods in Diagnostic Devices, A Question-Based Approach to the Analysis of Safety Data, Analysis of Two-stage Adaptive Seamless Trial Design, and Multiplicity Problems in Clinical Trials - A Regulatory Perspective.

Stochastic Analysis for Poisson Point Processes - Malliavin Calculus, Wiener-Ito Chaos Expansions and Stochastic Geometry (Hardcover, 1st ed. 2016)
Giovanni Peccati, Matthias Reitzner
R4,116 Discovery Miles 41 160 Ships in 10 - 15 working days

Stochastic geometry is the branch of mathematics that studies geometric structures associated with random configurations, such as random graphs, tilings and mosaics. Due to its close ties with stereology and spatial statistics, the results in this area are relevant for a large number of important applications, e.g. to the mathematical modeling and statistical analysis of telecommunication networks, geostatistics and image analysis. In recent years - due mainly to the impetus of the authors and their collaborators - a powerful connection has been established between stochastic geometry and the Malliavin calculus of variations, which is a collection of probabilistic techniques based on the properties of infinite-dimensional differential operators. This has led in particular to the discovery of a large number of new quantitative limit theorems for high-dimensional geometric objects. This unique book presents an organic collection of authoritative surveys written by the principal actors in this rapidly evolving field, offering a rigorous yet lively presentation of its many facets.

Warranty Data Collection and Analysis (Hardcover, 2011 ed.)
Wallace R. Blischke, M. Rezaul Karim, D.N.Prabhakar Murthy
R5,687 Discovery Miles 56 870 Ships in 18 - 22 working days

Warranty Data Collection and Analysis deals with warranty data collection and analysis and the problems associated with these activities. The book is both a research monograph and a handbook for practitioners. As a research monograph, it unifies the literature on warranty data collection and analysis, and presents the important results in an integrated manner. In the process, it highlights topics that require further research. As a handbook, it provides the essential methodology needed by practitioners involved with warranty data collection and analysis, along with extensive references to further results. Models and techniques needed for proper and effective analysis of data are included, together with guidelines for their use in warranty management, product improvement, and new product development. Warranty Data Collection and Analysis will be of interest to researchers (engineers and statisticians) and practitioners (engineers, applied statisticians, and managers) involved with product warranty and reliability. It is also suitable for use as a reference text for graduate-level reliability programs in engineering, applied statistics, operations research, and management.

Surface- and Groundwater Quality Changes in Periods of Water Scarcity (Hardcover, 2013 ed.)
Milos Gregor
R2,669 Discovery Miles 26 690 Ships in 18 - 22 working days

This thesis deals with the evaluation of surface and groundwater quality changes in periods of water scarcity in river catchment areas. The work can be divided into six parts. Existing methods of drought assessment are discussed in the first part, followed by a brief description of the software package HydroOffice, designed by the author. The software is dedicated to the analysis of hydrological data (separation of baseflow, estimation of hydrological drought parameters, recession curve analysis, time series analysis). The software is currently used by scientists from more than 30 countries around the world. The third section is devoted to a comprehensive regional assessment of hydrological drought on Slovak rivers, followed by evaluation of the occurrence, course and character of drought in precipitation, discharges, base flow, groundwater head and spring yields in the pilot area of the Nitra River basin. The fifth part is focused on the assessment of changes in surface and groundwater quality during the drought periods within the pilot area. Finally, the results are summarized and interpreted, and rounded off with an outlook to future research.

Introduction to Stochastic Networks (Hardcover, 1999 ed.)
Richard Serfozo
R2,688 Discovery Miles 26 880 Ships in 18 - 22 working days

In a stochastic network, such as those in computer/telecommunications and manufacturing, discrete units move among a network of stations where they are processed or served. Randomness may occur in the servicing and routing of units, and there may be queueing for services. This book describes several basic stochastic network processes, beginning with Jackson networks and ending with spatial queueing systems in which units, such as cellular phones, move in a space or region where they are served. The focus is on network processes that have tractable (closed-form) expressions for the equilibrium probability distribution of the numbers of units at the stations. These distributions yield network performance parameters such as expectations of throughputs, delays, costs, and travel times. The book is intended for graduate students and researchers in engineering, science and mathematics interested in the basics of stochastic networks that have been developed over the last twenty years. Assuming a graduate course in stochastic processes without measure theory, the emphasis is on multi-dimensional Markov processes. There is also some self-contained material on point processes involving real analysis. The book also contains rather complete introductions to reversible Markov processes, Palm probabilities for stationary systems, Little laws for queueing systems and space-time Poisson processes. This material is used in describing reversible networks, waiting times at stations, travel times and space-time flows in networks. Richard Serfozo received the Ph.D. degree in Industrial Engineering and Management Sciences at Northwestern University in 1969 and is currently Professor of Industrial and Systems Engineering at Georgia Institute of Technology. Prior to that he held positions in the Boeing Company, Syracuse University, and Bell Laboratories. He has held
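
As a small, hedged illustration of the kind of tractable equilibrium expressions described above (my own sketch, not taken from the book; the three-station network, rates, and routing probabilities are made up), the product-form result for an open Jackson network lets each station be analysed as an M/M/1 queue once the traffic equations are solved.

```python
# Minimal sketch: equilibrium metrics of an open Jackson network.
import numpy as np

lam_ext = np.array([2.0, 1.0, 0.0])     # external arrival rate at each station
P = np.array([[0.0, 0.5, 0.3],          # routing probabilities between stations
              [0.0, 0.0, 0.6],          # (rows need not sum to 1; the rest exits)
              [0.2, 0.0, 0.0]])
mu = np.array([6.0, 5.0, 4.0])          # service rate at each station

# Traffic equations: lam = lam_ext + P' lam  =>  (I - P') lam = lam_ext.
lam = np.linalg.solve(np.eye(3) - P.T, lam_ext)

rho = lam / mu                          # utilisations; stability requires rho < 1
L = rho / (1.0 - rho)                   # mean number of units at each station
W = L / lam                             # mean sojourn time per visit (Little's law)

print("throughputs:", lam.round(3))
print("mean queue lengths:", L.round(3))
print("mean waits:", W.round(3))
```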

Analysis of Multivariate Social Science Data (Hardcover, 2nd edition)
David J. Bartholomew, Fiona Steele, Irini Moustaki
R5,929 Discovery Miles 59 290 Ships in 10 - 15 working days

Drawing on the authors' varied experiences working and teaching in the field, Analysis of Multivariate Social Science Data, Second Edition enables a basic understanding of how to use key multivariate methods in the social sciences. With updates in every chapter, this edition expands its topics to include regression analysis, confirmatory factor analysis, structural equation models, and multilevel models. After emphasizing the summarization of data in the first several chapters, the authors focus on regression analysis. This chapter provides a link between the two halves of the book, signaling the move from descriptive to inferential methods and from interdependence to dependence. The remainder of the text deals with model-based methods that primarily make inferences about processes that generate data. Relying heavily on numerical examples, the authors provide insight into the purpose and working of the methods as well as the interpretation of data. Many of the same examples are used throughout to illustrate connections between the methods. In most chapters, the authors present suggestions for further work that go beyond conventional exercises, encouraging readers to explore new ground in social science research. Requiring minimal mathematical and statistical knowledge, this book shows how various multivariate methods reveal different aspects of data and thus help answer substantive research questions.

Health Care Systems Engineering for Scientists and Practitioners - HCSE, Lyon, France, May 2015 (Hardcover, 1st ed. 2016)
Andrea Matta, Evren Sahin, Jingshan Li, Alain Guinet, Nico J. Vandaele
R4,614 Discovery Miles 46 140 Ships in 10 - 15 working days

In this volume, scientists and practitioners write about new methods and technologies for improving the operation of health care organizations. Statistical analyses play an important role in these methods, with implications for simulation and modeling applied to the future of health care. Papers are based on work presented at the Second International Conference on Health Care Systems Engineering (HCSE2015) in Lyon, France. The conference was a rare opportunity for scientists and practitioners to share work directly with each other. Each resulting paper received a double-blind review. Paper topics include: hospital drug logistics, emergency care, simulation in patient care, and models for home care services.

Actuarial Science - Advances in the Statistical Sciences Festschrift in Honor of Professor V.M. Joshi's 70th Birthday Volume VI (Hardcover, 1987 ed.)
I. B. MacNeill, G. Umphrey
R2,795 Discovery Miles 27 950 Ships in 18 - 22 working days

On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TeX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.

Optimization of Stochastic Models - The Interface Between Simulation and Optimization (Hardcover, 1996 ed.)
Georg Ch Pflug
R5,354 Discovery Miles 53 540 Ships in 18 - 22 working days

Stochastic models are everywhere. In manufacturing, queuing models are used for modeling production processes, and realistic inventory models are stochastic in nature. Stochastic models are considered in transportation and communication. Marketing models use stochastic descriptions of demand and buyers' behavior. In finance, market prices and exchange rates are assumed to be certain stochastic processes, and insurance claims appear at random times with random amounts. To each decision problem, a cost function is associated. Costs may be direct or indirect, like loss of time, quality deterioration, loss in production or dissatisfaction of customers. In decision making under uncertainty, the goal is to minimize the expected costs. However, in practically all realistic models, the calculation of the expected costs is impossible due to the model complexity. Simulation is the only practicable way of getting insight into such models. Thus, the problem of optimal decisions can be seen as getting simulation and optimization effectively combined. The field is quite new and yet the number of publications is enormous. This book does not even try to touch all work done in this area. Instead, many concepts are presented and treated with mathematical rigor, and necessary conditions for the correctness of various approaches are stated. Optimization of Stochastic Models: The Interface Between Simulation and Optimization is suitable as a text for a graduate level course on Stochastic Models or as a secondary text for a graduate level course in Operations Research.
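
A minimal sketch of the simulation/optimization interface described above (my own illustration, not the author's; the newsvendor cost model, demand distribution, and step-size choices are all assumptions) is Kiefer-Wolfowitz stochastic approximation, which optimizes an expected cost that can only be estimated by simulation.

```python
# Minimal sketch: Kiefer-Wolfowitz stochastic approximation applied to a toy
# newsvendor model whose expected cost is estimated by Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(3)
HOLD, SHORT = 1.0, 4.0                  # unit overage and underage costs

def simulated_cost(order_qty, n_reps=2000):
    """Monte Carlo estimate of the expected one-period cost."""
    demand = rng.normal(100.0, 20.0, n_reps)
    cost = HOLD * np.maximum(order_qty - demand, 0) + SHORT * np.maximum(demand - order_qty, 0)
    return cost.mean()

x = 80.0                                # initial order quantity
for n in range(1, 201):
    a_n = 20.0 / n                      # decreasing step sizes
    c_n = 5.0 / n ** 0.25               # decreasing finite-difference widths
    grad = (simulated_cost(x + c_n) - simulated_cost(x - c_n)) / (2.0 * c_n)
    x -= a_n * grad                     # stochastic approximation update

# For comparison, the analytic optimum is the 0.8-quantile of demand, about 116.8.
print(f"simulation-optimized order quantity: {x:.1f}")
```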

Selected Works of David Brillinger (Hardcover, 2012)
Peter Guttorp, David Brillinger
R4,196 Discovery Miles 41 960 Ships in 18 - 22 working days

This volume contains 30 of David Brillinger's most influential papers. He is an eminent statistical scientist, having published broadly in time series and point process analysis, seismology, neurophysiology, and population biology. Each of these areas is well represented in the book. The volume has been divided into four parts, each with comments by one of Dr. Brillinger's former PhD students. His more theoretical papers have comments by Victor Panaretos from Switzerland. The area of time series has commentary by Pedro Morettin from Brazil. The biologically oriented papers are commented on by Tore Schweder from Norway and Haiganoush Preisler from the USA, while the point process papers have comments by Peter Guttorp from the USA. In addition, the volume contains a Statistical Science interview with Dr. Brillinger, and his bibliography.

Missing Data and Small-Area Estimation - Modern Analytical Equipment for the Survey Statistician (Hardcover)
Nicholas T. Longford
R2,702 Discovery Miles 27 020 Ships in 18 - 22 working days

This book develops methods for two key problems in the analysis of large-scale surveys: dealing with incomplete data and making inferences about sparsely represented subdomains. The presentation is committed to two particular methods, multiple imputation for missing data and multivariate composition for small-area estimation. The methods are presented as developments of established approaches by attending to their deficiencies. Thus the change to more efficient methods can be gradual, sensitive to the management priorities in large research organisations and multidisciplinary teams and to other reasons for inertia. The typical setting of each problem is addressed first, and then the constituency of the applications is widened to reinforce the view that the general method is essential for modern survey analysis. The general tone of the book is not "from theory to practice," but "from current practice to better practice." The third part of the book, a single chapter, presents a method for efficient estimation under model uncertainty. It is inspired by the solution for small-area estimation and is an example of "from good practice to better theory."
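
For the first of these two methods, the following hedged sketch (my own, not the book's notation or code) shows Rubin's rules for combining the results from m multiply imputed data sets; the input numbers are invented.

```python
# Minimal sketch: Rubin's combining rules for multiple imputation. Given an
# estimate and its variance from each of m completed data sets, pool them into
# a single estimate and total variance.
import numpy as np

def pool_mi(estimates, variances):
    """Combine m completed-data results into a point estimate and total variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = estimates.mean()                      # pooled point estimate
    w_bar = variances.mean()                      # within-imputation variance
    b = estimates.var(ddof=1)                     # between-imputation variance
    total_var = w_bar + (1.0 + 1.0 / m) * b       # Rubin's total variance
    return q_bar, total_var

# e.g. five imputed data sets, each yielding a mean and its estimated variance
est, var = pool_mi([51.2, 49.8, 50.5, 52.0, 50.9], [4.1, 3.8, 4.3, 4.0, 3.9])
print(f"pooled estimate {est:.2f}, standard error {var ** 0.5:.2f}")
```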

A strength of the presentation is chapters of case studies, one for each problem. Whenever possible, turning to examples and illustrations is preferred to the theoretical argument. The book is suitable for graduate students and researchers who are acquainted with the fundamentals of sampling theory and have a good grounding in statistical computing, or in conjunction with an intensive period of learning and establishing one's own modern computing and graphical environment that would serve the reader for most of the analytical work in the future.

While some analysts might regard data imperfections and deficiencies, such as nonresponse and limited sample size, as someone else's failure that bars effective and valid analysis, this book presents them as respectable analytical and inferential challenges, opportunities to harness computing power in the service of high-quality, socially relevant statistics.

Overriding in this approach is the general principle "to do the best, for the consumer of statistical information, that can be done with what is available." The reputation that government statistics is a rigid procedure-based and operation-centred activity, distant from the mainstream of statistical theory and practice, is refuted most resolutely.

After leaving De Montfort University in 2004 where he was a Senior Research Fellow in Statistics, Nick Longford founded the statistical research and consulting company SNTL in Leicester, England. He was awarded the first Campion Fellowship (2000-02) for methodological research in United Kingdom government statistics. He has served as Associate Editor of the Journal of the Royal Statistical Society, Series A, and the Journal of Educational and Behavioral Statistics and as an Editor of the Journal of Multivariate Analysis. He is a member of the Editorial Board of the British Journal of Mathematical and Statistical Psychology. He is the author of two other monographs, Random Coefficient Models (Oxford University Press, 1993) and Models for Uncertainty in Educational Testing (Springer-Verlag, 1995).

From the reviews:

"Ultimately, this book serves as an excellent reference source to guide and improve statistical practice in survey settings exhibiting theseproblems." Psychometrika

"I am convinced this book will be useful to practitioners...[and a] valuable resource for future research in this field." Jan Kordos in Statistics in Transition, Vol. 7, No. 5, June 2006

"To sum up, I think this is an excellent book and it thoroughly covers methods to deal with incomplete data problems and small-area estimation. It is a useful and suitable book for survey statisticians, as well as for researchers and graduate students interested on sampling designs." Ramon Cleries Soler in Statistics and Operations Research Transactions, Vol. 30, No. 1, January-June 2006

Analysing Seasonal Health Data (Hardcover, 2010 Ed.)
Adrian G. Barnett, Annette J. Dobson
R2,743 Discovery Miles 27 430 Ships in 18 - 22 working days

Seasonal patterns have been found in a remarkable range of health conditions, including birth defects, respiratory infections and cardiovascular disease. Accurately estimating the size and timing of seasonal peaks in disease incidence is an aid to understanding the causes and possibly to developing interventions. With global warming increasing the intensity of seasonal weather patterns around the world, a review of the methods for estimating seasonal effects on health is timely.

This is the first book on statistical methods for seasonal data written for a health audience. It describes methods for a range of outcomes (including continuous, count and binomial data) and demonstrates appropriate techniques for summarising and modelling these data. It has a practical focus and uses interesting examples to motivate and illustrate the methods. The statistical procedures and example data sets are available in an R package called 'season'.
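
As a hedged illustration of how the size and timing of a seasonal peak can be estimated (a sketch of my own in Python rather than the book's R package 'season'; the monthly series below is simulated), a cosinor-style harmonic regression does the job in a few lines.

```python
# Minimal sketch: harmonic (cosinor) regression estimating the amplitude and
# phase of a yearly seasonal peak from monthly data.
import numpy as np

rng = np.random.default_rng(4)
months = np.arange(120)                              # ten years of monthly observations
omega = 2 * np.pi / 12.0
true_peak = 7.0                                      # peak in month 7 (0-based)
y = 100 + 15 * np.cos(omega * (months - true_peak)) + rng.normal(0, 5, months.size)

# Fit y = b0 + b1*cos(omega*t) + b2*sin(omega*t) by ordinary least squares.
X = np.column_stack([np.ones_like(months, dtype=float),
                     np.cos(omega * months),
                     np.sin(omega * months)])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

amplitude = np.hypot(b1, b2)                         # size of the seasonal effect
peak_month = (np.arctan2(b2, b1) / omega) % 12       # timing of the seasonal peak
print(f"estimated amplitude {amplitude:.1f}, peak at month {peak_month:.1f}")
```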

Biostatistics - Advances in Statistical Sciences Festschrift in Honor of Professor V.M. Joshi's 70th Birthday Volume V (Hardcover, 1987 ed.)
I. B. MacNeill, G. Umphrey
R2,815 Discovery Miles 28 150 Ships in 18 - 22 working days

On May 27-31, 1985, a series of symposia was held at The University of Western Ontario, London, Canada, to celebrate the 70th birthday of Professor V. M. Joshi. These symposia were chosen to reflect Professor Joshi's research interests as well as areas of expertise in statistical science among faculty in the Departments of Statistical and Actuarial Sciences, Economics, Epidemiology and Biostatistics, and Philosophy. From these symposia, the six volumes which comprise the "Joshi Festschrift" have arisen. The 117 articles in this work reflect the broad interests and high quality of research of those who attended our conference. We would like to thank all of the contributors for their superb cooperation in helping us to complete this project. Our deepest gratitude must go to the three people who have spent so much of their time in the past year typing these volumes: Jackie Bell, Lise Constant, and Sandy Tarnowski. This work has been printed from "camera ready" copy produced by our Vax 785 computer and QMS Lasergraphix printers, using the text processing software TeX. At the initiation of this project, we were neophytes in the use of this system. Thank you, Jackie, Lise, and Sandy, for having the persistence and dedication needed to complete this undertaking.

Data-Driven Remaining Useful Life Prognosis Techniques - Stochastic Models, Methods and Applications (Hardcover, 1st ed. 2017)
Xiao-Sheng Si, Zheng-Xin Zhang, Changhua Hu
R5,174 Discovery Miles 51 740 Ships in 10 - 15 working days

This book introduces data-driven remaining useful life prognosis techniques, and shows how to utilize the condition monitoring data to predict the remaining useful life of stochastic degrading systems and to schedule maintenance and logistics plans. It is also the first book that describes the basic data-driven remaining useful life prognosis theory systematically and in detail. The emphasis of the book is on the stochastic models, methods and applications employed in remaining useful life prognosis. It includes a wealth of degradation monitoring experiment data, practical prognosis methods for remaining useful life in various cases, and a series of applications incorporated into prognostic information in decision-making, such as maintenance-related decisions and ordering spare parts. It also highlights the latest advances in data-driven remaining useful life prognosis techniques, especially in the contexts of adaptive prognosis for linear stochastic degrading systems, nonlinear degradation modeling based prognosis, residual storage life prognosis, and prognostic information-based decision-making.
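
A minimal, hedged sketch of the kind of data-driven prognosis described above (my own illustration, not the authors' published method; the degradation path, threshold, and parameter values are simulated assumptions) fits a linear Wiener-process degradation model and predicts the remaining useful life as a first-passage time.

```python
# Minimal sketch: linear Wiener-process degradation model fitted to
# condition-monitoring data, with the remaining useful life (RUL) predicted as
# a first-passage time to a failure threshold.
import numpy as np

rng = np.random.default_rng(5)
dt = 1.0                                     # time between monitoring points
true_drift, true_sigma, threshold = 0.8, 0.5, 100.0

# Simulated degradation path observed up to the current time.
increments = true_drift * dt + true_sigma * np.sqrt(dt) * rng.normal(size=60)
path = np.cumsum(increments)

# Fit drift and diffusion from the observed increments.
drift_hat = np.mean(np.diff(path, prepend=0.0)) / dt
sigma2_hat = np.var(np.diff(path, prepend=0.0), ddof=1) / dt

# First passage of a Wiener process to the threshold is inverse-Gaussian
# distributed; its mean gives a point prediction of the RUL.
remaining = threshold - path[-1]
rul_mean = remaining / drift_hat
rul_var = remaining * sigma2_hat / drift_hat ** 3     # inverse-Gaussian variance
print(f"predicted mean RUL: {rul_mean:.1f} time units (sd {rul_var ** 0.5:.1f})")
```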

Cost-Benefit Analysis and the Theory of Fuzzy Decisions - Fuzzy Value Theory (Hardcover, 2004 ed.)
Kofi Kissi Dompere
R4,208 Discovery Miles 42 080 Ships in 18 - 22 working days

Criticism is the habitus of the contemplative intellect, whereby we try to recognize with probability the genuine quality of a literary work by using appropriate aids and rules. In so doing, certain general and particular points must be considered. The art of interpretation or hermeneutics is the habitus of the contemplative intellect of probing into the sense of a somewhat special text by using logical rules and suitable means. Note: Hermeneutics differs from criticism as the part does from the whole. Antonius Gvilielmus Amo Afer (1727)

There is no such thing as absolute truth. At best it is a subjective criterion, but one based upon valuation. Unfortunately, too many people place their fate in the hands of the subjective without properly evaluating it. Arnold A. Kaufmann and Madan M. Gupta

The development of cost-benefit analysis and the theory of fuzzy decisions was divided into two inter-dependent structures of identification and measurement theory on one hand and fuzzy value theory on the other. Each of them has sub-theories that constitute a complete logical system.
