Showing 1 - 14 of 14 matches in All Departments

Missing and Modified Data in Nonparametric Estimation - With R Examples (Hardcover)
Sam Efromovich
R2,893 Discovery Miles 28 930 Ships in 12 - 17 working days

This book presents a systematic and unified approach for modern nonparametric treatment of missing and modified data via examples of density and hazard rate estimation, nonparametric regression, filtering signals, and time series analysis. All basic types of missing at random and not at random, biasing, truncation, censoring, and measurement errors are discussed, and their treatment is explained.

Ten chapters cover basic cases of direct data, biased data, nondestructive and destructive missing, survival data modified by truncation and censoring, missing survival data, stationary and nonstationary time series and processes, and ill-posed modifications. The coverage is suitable for self-study or a one-semester course for graduate students with a prerequisite of a standard course in introductory probability. Exercises of various levels of difficulty will be helpful for both instructors and self-study.

The book focuses on the practically important case of small samples. It explains when consistent estimation is possible, in which cases missing data can safely be ignored, and in which cases they must be taken into account. When missingness or data modification makes consistent estimation impossible, the author explains what action is needed to restore the lost information. The book contains more than a hundred figures with simulated data that illustrate virtually every setting, claim, and development. The companion R package allows the reader to verify, reproduce, and modify every simulation and every estimator used, making the material fully transparent and allowing it to be studied interactively.

Sam Efromovich is the Endowed Professor of Mathematical Sciences and the Head of the Actuarial Program at the University of Texas at Dallas. He is well known for his work on the theory and application of nonparametric curve estimation and is the author of Nonparametric Curve Estimation: Methods, Theory, and Applications. He is a Fellow of the Institute of Mathematical Statistics and of the American Statistical Association.
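As a rough illustration of the kind of setting the book treats (its companion package is in R; this sketch uses Python, and all names and parameters are invented for the example): with observations missing completely at random, a plain kernel density estimate computed on the complete cases alone remains a consistent estimate of the underlying density.

```python
import numpy as np

def kde(grid, data, bandwidth):
    """Gaussian kernel density estimate evaluated at the grid points."""
    u = (grid[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
full = rng.normal(0.0, 1.0, size=500)     # the underlying (partly unobserved) sample
observed = full[rng.random(500) > 0.3]    # ~30% of values missing completely at random

grid = np.linspace(-3.0, 3.0, 61)
est = kde(grid, observed, bandwidth=0.4)  # complete-case estimate: consistent under MCAR
```

Under more complicated missingness (not at random, biased sampling, censoring), the complete-case estimate is generally inconsistent, which is exactly the territory the book's later chapters cover.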

Medical Product Safety Evaluation - Biological Models and Statistical Methods (Paperback)
Jie Chen, Joseph Heyse, Tze Leung Lai
R1,490 Discovery Miles 14 900 Ships in 12 - 17 working days

Medical Product Safety Evaluation: Biological Models and Statistical Methods presents cutting-edge biological models and statistical methods that are tailored to specific objectives and data types for safety analysis and benefit-risk assessment. Frequently encountered issues and challenges in the design and analysis of safety studies are discussed with illustrative applications and examples. The book is designed not only for biopharmaceutical professionals, such as statisticians, safety specialists, pharmacovigilance experts, and pharmacoepidemiologists, who can use it for self-study or in short courses and training programs, but also for graduate students in statistics and biomedical data science in a one-semester course. Each chapter provides supplements and problems for further reading and exercises.

Quantitative Trading - Algorithms, Analytics, Data, Models, Optimization (Paperback)
Xin Guo, Tze Leung Lai, Howard Shek, Samuel Po Shing Wong
bundle available
R1,854 Discovery Miles 18 540 Ships in 12 - 17 working days

The first part of this book discusses the institutions and mechanisms of algorithmic trading, market microstructure, high-frequency data and stylized facts, time and event aggregation, order book dynamics, trading strategies and algorithms, transaction costs, market impact and execution strategies, and risk analysis and management. The second part covers market impact models, network models, multi-asset trading, machine learning techniques, and nonlinear filtering. The third part discusses electronic market making, liquidity, systemic risk, and recent developments and debates on the subject.
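One execution idea from this territory (slicing a parent order against an expected intraday volume profile, as in a VWAP schedule) can be sketched in a few lines of Python; the volume numbers below are invented for illustration, not taken from the book.

```python
import numpy as np

# Hypothetical intraday volume profile (shares per half-hour bucket);
# the U-shape is a well-known stylized fact of equity markets.
volume = np.array([900, 500, 350, 300, 280, 300, 290, 310, 300, 320, 360, 450, 700])

order_size = 10_000  # total shares the parent order must buy

# A VWAP schedule slices the order in proportion to expected volume,
# which tends to reduce market impact relative to trading evenly (TWAP).
vwap_slices = np.round(order_size * volume / volume.sum()).astype(int)
twap_slices = np.full(len(volume), order_size // len(volume))
```

A production schedule would also account for realized versus expected volume, limit-order placement, and impact costs, which is where the market impact and execution models of the second part come in.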

Proceedings of the Pacific Rim Statistical Conference for Production Engineering - Big Data, Production Engineering and Statistics (Paperback, Softcover reprint of the original 1st ed. 2018)
Dongseok Choi, Daeheung Jang, Tze Leung Lai, Youngjo Lee, Ying Lu, …
bundle available
R2,957 Discovery Miles 29 570 Ships in 10 - 15 working days

This book presents the proceedings of the 2nd Pacific Rim Statistical Conference for Production Engineering: Production Engineering, Big Data and Statistics, held at Seoul National University in Seoul, Korea, in December 2016. The papers discuss a wide range of statistical challenges, methods, and applications for big data in production engineering, and introduce recent advances in relevant statistical methods.

Proceedings of the Pacific Rim Statistical Conference for Production Engineering - Big Data, Production Engineering and Statistics (Hardcover, 1st ed. 2018)
Dongseok Choi, Daeheung Jang, Tze Leung Lai, Youngjo Lee, Ying Lu, …
bundle available
R2,957 Discovery Miles 29 570 Ships in 10 - 15 working days

This book presents the proceedings of the 2nd Pacific Rim Statistical Conference for Production Engineering: Production Engineering, Big Data and Statistics, held at Seoul National University in Seoul, Korea, in December 2016. The papers discuss a wide range of statistical challenges, methods, and applications for big data in production engineering, and introduce recent advances in relevant statistical methods.

Selected Papers (Paperback, 1st ed. 1985, Reprinted Softcover 2017)
Herbert Robbins; Edited by Tze Leung Lai, David Siegmund
R2,024 Discovery Miles 20 240 Ships in 10 - 15 working days

Herbert Robbins is widely recognized as one of the most creative and original mathematical statisticians of our time. The purpose of this book is to reprint, on the occasion of his seventieth birthday, some of his most outstanding research. In making selections for reprinting we have tried to keep in mind three potential audiences: (1) the historian who would like to know Robbins' seminal role in stimulating a substantial proportion of current research in mathematical statistics; (2) the novice who would like a readable, conceptually oriented introduction to these subjects; and (3) the expert who would like to have useful reference material in a single collection. In many cases the needs of the first two groups can be met simultaneously.

A distinguishing feature of Robbins' research is its daring originality, which literally creates new specialties for subsequent generations of statisticians to explore. Often these seminal papers are also models of exposition, serving to introduce the reader, in the simplest possible context, to ideas that are important for contemporary research in the field. An example is the paper of Robbins and Monro which initiated the subject of stochastic approximation. We have also attempted to provide some useful guidance to the literature in various subjects by supplying additional references, particularly to books and survey articles, with some remarks about important developments in these areas.
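The Robbins-Monro idea mentioned above fits in a few lines: to find the root x* of an unknown regression function M(x) that can only be observed with noise, iterate x_{n+1} = x_n - a_n Y_n with step sizes a_n = c/n. A minimal Python illustration, where the response function and constants are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_response(x):
    # Hypothetical unknown regression function M(x) = 2x - 1, observed with noise;
    # its root M(x*) = 0 lies at x* = 0.5.
    return 2.0 * x - 1.0 + rng.normal(scale=0.1)

x = 0.0
for n in range(1, 5001):
    a_n = 1.0 / n                 # step sizes with sum a_n = inf, sum a_n**2 < inf
    x -= a_n * noisy_response(x)  # Robbins-Monro recursion x_{n+1} = x_n - a_n * Y_n
```

The divergent-but-square-summable step-size condition is what lets the iterates average out the noise while still reaching the root.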

Sequential Experimentation in Clinical Trials - Design and Analysis (Paperback, 2013 ed.)
Jay Bartroff, Tze Leung Lai, Mei-Chiung Shih
R3,924 Discovery Miles 39 240 Ships in 10 - 15 working days

Sequential Experimentation in Clinical Trials: Design and Analysis is developed from decades of work in research groups, statistical pedagogy, and workshop participation. Different parts of the book can be used for short courses on clinical trials, translational medical research, and sequential experimentation. The authors have successfully used the book to teach innovative clinical trial designs and statistical methods to Statistics Ph.D. students at Stanford University, and online supplements provide chapter-specific exercises and additional information.

The book covers the broad subject of sequential experimentation, including group sequential and adaptive designs of Phase II and III clinical trials, which have attracted much attention in the past three decades. The wide range of design and analysis problems in sequential experimentation requires a correspondingly wide range of statistical methods and models, from nonlinear regression analysis, experimental design, dynamic programming, survival analysis, and resampling to likelihood and Bayesian inference. The background material for these building blocks is summarized in Chapters 2 and 3 and in certain sections of Chapters 6 and 7. Besides group sequential tests and adaptive designs, the book introduces sequential change-point detection methods in Chapter 5, in connection with pharmacovigilance and public health surveillance. Together with dynamic programming and approximate dynamic programming in Chapter 3, the book therefore covers all the basic topics for a graduate course in sequential analysis and design.
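Sequential change-point detection of the kind connected here to pharmacovigilance and public health surveillance can be illustrated with the classical one-sided CUSUM rule, which alarms when S_n = max(0, S_{n-1} + x_n - k) exceeds a threshold h. A minimal Python sketch; the data and parameters are chosen for illustration, not taken from the book:

```python
import numpy as np

def cusum_alarm(xs, k=0.5, h=5.0):
    """Return the first time n at which the one-sided CUSUM exceeds h, else None."""
    s = 0.0
    for n, x in enumerate(xs, start=1):
        s = max(0.0, s + x - k)  # S_n = max(0, S_{n-1} + x_n - k)
        if s > h:
            return n
    return None

rng = np.random.default_rng(2)
pre = rng.normal(0.0, 1.0, size=200)   # in-control observations, mean 0
post = rng.normal(1.5, 1.0, size=50)   # mean shifts upward at time 201
alarm = cusum_alarm(np.concatenate([pre, post]))
```

The reference value k and threshold h trade off false-alarm rate against detection delay, the central design tension in surveillance applications.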

Sequential Experimentation in Clinical Trials - Design and Analysis (Hardcover, 2013 ed.)
Jay Bartroff, Tze Leung Lai, Mei-Chiung Shih
R3,941 Discovery Miles 39 410 Ships in 10 - 15 working days

Sequential Experimentation in Clinical Trials: Design and Analysis is developed from decades of work in research groups, statistical pedagogy, and workshop participation. Different parts of the book can be used for short courses on clinical trials, translational medical research, and sequential experimentation. The authors have successfully used the book to teach innovative clinical trial designs and statistical methods to Statistics Ph.D. students at Stanford University, and online supplements provide chapter-specific exercises and additional information.

The book covers the broad subject of sequential experimentation, including group sequential and adaptive designs of Phase II and III clinical trials, which have attracted much attention in the past three decades. The wide range of design and analysis problems in sequential experimentation requires a correspondingly wide range of statistical methods and models, from nonlinear regression analysis, experimental design, dynamic programming, survival analysis, and resampling to likelihood and Bayesian inference. The background material for these building blocks is summarized in Chapters 2 and 3 and in certain sections of Chapters 6 and 7. Besides group sequential tests and adaptive designs, the book introduces sequential change-point detection methods in Chapter 5, in connection with pharmacovigilance and public health surveillance. Together with dynamic programming and approximate dynamic programming in Chapter 3, the book therefore covers all the basic topics for a graduate course in sequential analysis and design.

Self-Normalized Processes - Limit Theory and Statistical Applications (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Victor H. Pena, Tze Leung Lai, Qi-Man Shao
R3,721 Discovery Miles 37 210 Ships in 10 - 15 working days

Self-normalized processes are of common occurrence in probabilistic and statistical studies. A prototypical example is Student's t-statistic introduced in 1908 by Gosset, whose portrait is on the front cover. Due to the highly non-linear nature of these processes, the theory experienced a long period of slow development. In recent years there have been a number of important advances in the theory and applications of self-normalized processes. Some of these developments are closely linked to the study of central limit theorems, which imply that self-normalized processes are approximate pivots for statistical inference. The present volume covers recent developments in the area, including self-normalized large and moderate deviations, and laws of the iterated logarithms for self-normalized martingales. This is the first book that systematically treats the theory and applications of self-normalization.
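The t-statistic example can be made concrete: Student's t is an explicit function of the self-normalized ratio S_n / V_n, where S_n = Σ x_i and V_n² = Σ x_i². A short Python check of this standard identity (the sample itself is simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(30)
n = len(x)

# Classical Student's t-statistic for testing mean zero
t_classic = np.sqrt(n) * x.mean() / x.std(ddof=1)

# The same statistic written via the self-normalized ratio r = S_n / V_n
r = x.sum() / np.sqrt((x ** 2).sum())
t_selfnorm = r * np.sqrt((n - 1) / (n - r ** 2))
```

Because t is a monotone function of S_n / V_n, limit theory for the self-normalized sum translates directly into limit theory for the t-statistic, which is why it serves as the prototypical example.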

Statistical Models and Methods for Financial Markets (Paperback, Softcover reprint of hardcover 1st ed. 2008)
Tze Leung Lai, Haipeng Xing
bundle available
R2,487 Discovery Miles 24 870 Ships in 10 - 15 working days

The idea of writing this book arose in 2000, when the first author was assigned to teach the required course STATS 240 (Statistical Methods in Finance) in the new M.S. program in financial mathematics at Stanford, an interdisciplinary program that aims to provide a master's-level education in applied mathematics, statistics, computing, finance, and economics. Students in the program had different backgrounds in statistics. Some had only taken a basic course in statistical inference, while others had taken a broad spectrum of M.S.- and Ph.D.-level statistics courses. On the other hand, all of them had already taken required core courses in investment theory and derivative pricing, and STATS 240 was supposed to link the theory and pricing formulas to real-world data and pricing or investment strategies. Besides students in the program, the course also attracted many students from other departments in the university, further increasing the heterogeneity of students, as many of them had a strong background in mathematical and statistical modeling from the mathematical, physical, and engineering sciences but no previous experience in finance. To address this diversity in background but common strong interest in the subject and in a potential career as a "quant" in the financial industry, the course material was carefully chosen not only to present basic statistical methods of importance to quantitative finance but also to summarize domain knowledge in finance and show how it can be combined with statistical modeling in financial analysis and decision making. The course material evolved over the years, especially after the second author helped as the head TA during the years 2004 and 2005.

Statistical Models and Methods for Financial Markets (Hardcover, 2008 ed.)
Tze Leung Lai, Haipeng Xing
bundle available
R3,444 Discovery Miles 34 440 Ships in 10 - 15 working days

The idea of writing this book arose in 2000, when the first author was assigned to teach the required course STATS 240 (Statistical Methods in Finance) in the new M.S. program in financial mathematics at Stanford, an interdisciplinary program that aims to provide a master's-level education in applied mathematics, statistics, computing, finance, and economics. Students in the program had different backgrounds in statistics. Some had only taken a basic course in statistical inference, while others had taken a broad spectrum of M.S.- and Ph.D.-level statistics courses. On the other hand, all of them had already taken required core courses in investment theory and derivative pricing, and STATS 240 was supposed to link the theory and pricing formulas to real-world data and pricing or investment strategies. Besides students in the program, the course also attracted many students from other departments in the university, further increasing the heterogeneity of students, as many of them had a strong background in mathematical and statistical modeling from the mathematical, physical, and engineering sciences but no previous experience in finance. To address this diversity in background but common strong interest in the subject and in a potential career as a "quant" in the financial industry, the course material was carefully chosen not only to present basic statistical methods of importance to quantitative finance but also to summarize domain knowledge in finance and show how it can be combined with statistical modeling in financial analysis and decision making. The course material evolved over the years, especially after the second author helped as the head TA during the years 2004 and 2005.

Quantitative Trading - Algorithms, Analytics, Data, Models, Optimization (Hardcover)
Xin Guo, Tze Leung Lai, Howard Shek, Samuel Po Shing Wong
bundle available
R3,029 Discovery Miles 30 290 Ships in 12 - 17 working days

The first part of this book discusses the institutions and mechanisms of algorithmic trading, market microstructure, high-frequency data and stylized facts, time and event aggregation, order book dynamics, trading strategies and algorithms, transaction costs, market impact and execution strategies, and risk analysis and management. The second part covers market impact models, network models, multi-asset trading, machine learning techniques, and nonlinear filtering. The third part discusses electronic market making, liquidity, systemic risk, and recent developments and debates on the subject.

Medical Product Safety Evaluation - Biological Models and Statistical Methods (Hardcover)
Jie Chen, Joseph Heyse, Tze Leung Lai
R3,405 Discovery Miles 34 050 Ships in 12 - 17 working days

Medical Product Safety Evaluation: Biological Models and Statistical Methods presents cutting-edge biological models and statistical methods that are tailored to specific objectives and data types for safety analysis and benefit-risk assessment. Frequently encountered issues and challenges in the design and analysis of safety studies are discussed with illustrative applications and examples. The book is designed not only for biopharmaceutical professionals, such as statisticians, safety specialists, pharmacovigilance experts, and pharmacoepidemiologists, who can use it for self-study or in short courses and training programs, but also for graduate students in statistics and biomedical data science in a one-semester course. Each chapter provides supplements and problems for further reading and exercises.

Self-Normalized Processes - Limit Theory and Statistical Applications (Hardcover, 2009 ed.)
Victor H. Pena, Tze Leung Lai, Qi-Man Shao
R3,753 Discovery Miles 37 530 Ships in 10 - 15 working days

Self-normalized processes are of common occurrence in probabilistic and statistical studies. A prototypical example is Student's t-statistic introduced in 1908 by Gosset, whose portrait is on the front cover. Due to the highly non-linear nature of these processes, the theory experienced a long period of slow development. In recent years there have been a number of important advances in the theory and applications of self-normalized processes. Some of these developments are closely linked to the study of central limit theorems, which imply that self-normalized processes are approximate pivots for statistical inference.

The present volume covers recent developments in the area, including self-normalized large and moderate deviations, and laws of the iterated logarithms for self-normalized martingales. This is the first book that systematically treats the theory and applications of self-normalization.
