Single Subject Designs in Biomedicine draws upon the rich history of single case research within educational and behavioral research settings and extends its application to the field of biomedicine. Biomedical illustrations are used to demonstrate the processes of designing, implementing, and evaluating a single subject design. Strengths and limitations of various methodologies are presented, along with the specific clinical areas in which these designs would be appropriate. Statistical and visual techniques for data analysis are also discussed. The breadth and depth of information provided is suitable for medical students in research-oriented courses, primary care practitioners and medical specialists seeking to apply methods of evidence-based practice to improve patient care, and medical researchers who are expanding their methodological expertise to include single subject designs. Increasing awareness of the utility of the single subject design could enhance treatment approaches and evaluation in both biomedical research and medical care settings.
Microarrays for the simultaneous measurement of the abundance of RNA species are used in fundamental biology as well as in medical research. Statistically, a microarray may be considered as an observation of very high dimensionality, equal to the number of expression levels measured on it. In "Statistical Methods for Microarray Data Analysis: Methods and Protocols," expert researchers in the field detail many methods and techniques used to study microarrays, guiding the reader from microarray technology to the statistical problems of this specific kind of multivariate data analysis. Written in the highly successful "Methods in Molecular Biology" series format, the chapters include the kind of detailed description and implementation advice that is crucial for getting optimal results in the laboratory. Thorough and intuitive, "Statistical Methods for Microarray Data Analysis: Methods and Protocols" aids scientists in continuing to study microarrays and the most current statistical methods.
The theory of Markov Decision Processes - also known under several other names, including sequential stochastic optimization, discrete-time stochastic control, and stochastic dynamic programming - studies sequential optimization of discrete-time stochastic systems. Fundamentally, this is a methodology for analyzing a discrete-time stochastic system whose transition mechanism can be controlled over time. Each control policy defines a stochastic process and the values of the objective functions associated with this process. The objective is to select a "good" control policy. In real life, decisions that humans and computers make on all levels usually have two types of impacts: (i) they cost or save time, money, or other resources, or they bring revenues, and (ii) they have an impact on the future by influencing the dynamics. In many situations, decisions with the largest immediate profit may not be good in view of future events. Markov Decision Processes (MDPs) model this paradigm and provide results on the structure and existence of good policies and on methods for their calculation. MDPs are attractive to many researchers because they are important from both the practical and the intellectual points of view. MDPs provide tools for the solution of important real-life problems; in particular, many business and engineering applications use MDP models. Analysis of the various problems arising in MDPs leads to a large variety of interesting mathematical and computational problems. Accordingly, the Handbook of Markov Decision Processes is split into three parts: Part I deals with models with finite state and action spaces, Part II deals with infinite state problems, and Part III examines specific applications. Individual chapters are written by leading experts on the subject.
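The trade-off described in this blurb (largest immediate profit versus future impact) is exactly what value iteration, one of the standard computational methods for MDPs, resolves. A minimal sketch in Python; the two-state model, its rewards, and its transition probabilities are invented for illustration and are not taken from the handbook:

```python
# P[s][a] = list of (probability, next_state, reward) triples
P = {
    0: {"stay": [(1.0, 0, 1.0)],                      # safe: small reward now
        "move": [(0.8, 1, 0.0), (0.2, 0, 0.0)]},      # invest: no reward now
    1: {"stay": [(1.0, 1, 2.0)]},                     # productive state
}
gamma = 0.9  # discount factor

# Iterate the Bellman optimality operator to convergence
V = {s: 0.0 for s in P}
for _ in range(500):
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                for a in P[s])
         for s in P}

# Greedy policy with respect to the converged values
policy = {s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2])
                                         for p, s2, r in P[s][a]))
          for s in P}
print(policy)
```

With a discount factor of 0.9, giving up the guaranteed immediate reward to reach the productive state is optimal; with a much smaller discount factor the greedy "stay" policy would win instead, which is precisely the immediate-profit-versus-future-events point above.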
The main theme of this monograph is "comparative statistical inference." While the topics covered have been carefully selected (they are, for example, restricted to problems of statistical estimation), my aim is to provide ideas and examples which will assist a statistician, or a statistical practitioner, in comparing the performance one can expect from using either Bayesian or classical (aka frequentist) solutions in estimation problems. Before investing the hours it will take to read this monograph, one might well want to know what sets it apart from other treatises on comparative inference. The two books that are closest to the present work are the well-known tomes by Barnett (1999) and Cox (2006). These books do indeed consider the conceptual and methodological differences between Bayesian and frequentist methods. What is largely absent from them, however, are answers to the question: "which approach should one use in a given problem?" It is this latter issue that this monograph is intended to investigate. There are many books on Bayesian inference, including, for example, the widely used texts by Carlin and Louis (2008) and Gelman, Carlin, Stern and Rubin (2004). These books differ from the present work in that they begin with the premise that a Bayesian treatment is called for and then provide guidance on how a Bayesian analysis should be executed. Similarly, there are many books written from a classical perspective.
A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, and addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction to the essential theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applicable, the authors then build upon this established foundation in order to equip readers with the knowledge needed to understand the modern-day extensions of U-statistics that are explored in subsequent chapters. Additional topical coverage includes: longitudinal data modeling with missing data; parametric and distribution-free mixed-effect and structural equation models; a new multi-response based regression framework for non-parametric statistics such as the product moment correlation, Kendall's tau, and Mann-Whitney-Wilcoxon rank tests; and a new class of U-statistic-based estimating equations (UBEE) for dependent responses. Motivating examples, in-depth illustrations of statistical and model-building concepts, and an extensive discussion of longitudinal study designs strengthen the real-world utility and comprehension of this book. An accompanying Web site features SAS(R) and S-Plus(R) program codes, software applications, and additional study data.
Modern Applied U-Statistics accommodates second- and third-year students of biostatistics at the graduate level and also serves as an excellent self-study for practitioners in the fields of bioinformatics and psychosocial research.
The concept of ridges has appeared numerous times in the image processing literature. Sometimes the term is used in an intuitive sense. Other times a concrete definition is provided. In almost all cases the concept is used for very specific applications. When analyzing images or data sets, it is very natural for a scientist to measure critical behavior by considering maxima or minima of the data. These critical points are relatively easy to compute. Numerical packages always provide support for root finding or optimization, whether it be through bisection, Newton's method, the conjugate gradient method, or other standard methods. It has not been natural for scientists to consider critical behavior in a higher-order sense. The concept of ridge as a manifold of critical points is a natural extension of the concept of local maximum as an isolated critical point. However, almost no attention has been given to formalizing the concept. There is a need for a formal development. There is a need for understanding the computational issues that arise in the implementations. The purpose of this book is to address both needs by providing a formal mathematical foundation and a computational framework for ridges. The intended audience for this book includes anyone interested in exploring the usefulness of ridges in data analysis.
This book is concerned with important problems of robust (stable) statistical pattern recognition when hypothetical model assumptions about experimental data are violated (disturbed). Pattern recognition theory is the field of applied mathematics in which principles and methods are constructed for the classification and identification of objects, phenomena, processes, situations, and signals, i.e., of objects that can be specified by a finite set of features, or properties characterizing the objects (Mathematical Encyclopedia (1984)). Two stages in the development of the mathematical theory of pattern recognition may be observed. At the first stage, until the middle of the 1970s, pattern recognition theory was replenished mainly from adjacent mathematical disciplines: mathematical statistics, functional analysis, discrete mathematics, and information theory. This development stage is characterized by the successful solution of pattern recognition problems of different physical nature, but of the simplest form in the sense of the mathematical models used. One of the main approaches to solving pattern recognition problems is the statistical approach, which uses stochastic models of feature variables. Under the statistical approach, the first stage of pattern recognition theory development is characterized by the assumption that the probability data model is known exactly or is estimated from a representative sample of large size with negligible estimation errors (Das Gupta, 1973, 1977; Rey, 1978; Vasiljev, 1983).
This book comprises nine selected works on numerical and computational methods for solving multiobjective optimization, game theory, and machine learning problems. It provides extended versions of selected papers from various fields of science such as computer science, mathematics and engineering that were presented at EVOLVE 2013 held in July 2013 at Leiden University in the Netherlands. The internationally peer-reviewed papers include original work on important topics in both theory and applications, such as the role of diversity in optimization, statistical approaches to combinatorial optimization, computational game theory, and cell mapping techniques for numerical landscape exploration. Applications focus on aspects including robustness, handling multiple objectives, and complex search spaces in engineering design and computational biology.
This volume has been created in honor of the seventieth birthday of Ted Harris, which was celebrated on January 11th, 1989. The papers represent the wide range of subfields of probability theory in which Ted has made profound and fundamental contributions. This breadth in Ted's research complicates the task of putting together in his honor a book with a unified theme. One common thread noted was the spatial, or geometric, aspect of the phenomena Ted investigated. This volume has been organized around that theme, with papers covering four major subject areas of Ted's research: branching processes, percolation, interacting particle systems, and stochastic flows. These four topics do not exhaust his research interests; his major work on Markov chains is commemorated in the standard terminology "Harris chain" and "Harris recurrent". The editors would like to take this opportunity to thank the speakers at the symposium and the contributors to this volume. Their enthusiastic support is a tribute to Ted Harris. We would like to express our appreciation to Annette Mosley for her efforts in typing the manuscripts and to Arthur Ogawa for typesetting the volume. Finally, we gratefully acknowledge the National Science Foundation and the University of Southern California for their financial support.
This book describes informetric results from the point of view of Lotkaian size-frequency functions, i.e. functions that are decreasing power laws. Explanations and examples of this model are given, showing that it is the most important regularity amongst other possible models. This theory is then developed in the framework of IPPs (Information Production Processes), thereby also indicating its relation with e.g. the law of Zipf. Applications are given in the following fields: three-dimensional informetrics (positive reinforcement and Type/Token-Taken informetrics), concentration theory (including the description of Lorenz curves and concentration measures in Lotkaian informetrics), fractal complexity theory (Lotkaian informetrics as self-similar fractals), Lotkaian informetrics in which items can have multiple sources (where fractional size-frequency functions are constructed), the theory of first-citation distributions, and the N-fold Cartesian product of IPPs (describing frequency functions for N-grams and N-word phrases). In the Appendix, methods are given to determine the parameters in the law of Lotka, based on a set of discrete data. The book explains numerous informetric regularities, based only on a decreasing power law as size-frequency function, i.e. Lotka's law. It revives the historical formulation of Alfred Lotka of 1926 and shows the power of this power law, both in classical aspects of informetrics (libraries, bibliographies) and in "new" applications such as social networks (citation or collaboration networks and the Internet).
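Lotka's 1926 law in its classical form says that the number of sources with j items falls off as j to the power -α, with α = 2 historically. A minimal Python sketch of such a Lotkaian size-frequency function; the truncation point jmax is an illustrative choice, not taken from the book:

```python
import math

def lotka_freq(alpha, jmax=100_000):
    # size-frequency function f(j) proportional to j^(-alpha),
    # normalized over j = 1..jmax so that the f(j) sum to 1
    weights = [j ** -alpha for j in range(1, jmax + 1)]
    c = 1.0 / sum(weights)
    return [c * w for w in weights]

f = lotka_freq(2.0)
# With alpha = 2 the normalizing constant approaches 6/pi^2, i.e. roughly
# 61% of all sources have exactly one item.
print(f"f(1) = {f[0]:.4f} (limit 6/pi^2 = {6 / math.pi ** 2:.4f})")
```

This single decreasing-power-law assumption is the one from which the book derives Zipf-type regularities and the concentration measures listed above.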
Nevanlinna-Pick interpolation for time-varying input-output maps: The discrete case.- 0. Introduction.- 1. Preliminaries.- 2. J-unitary operators on ℓ2.- 3. Time-varying Nevanlinna-Pick interpolation.- 4. Solution of the time-varying tangential Nevanlinna-Pick interpolation problem.- 5. An illustrative example.- References.- Nevanlinna-Pick interpolation for time-varying input-output maps: The continuous time case.- 0. Introduction.- 1. Generalized point evaluation.- 2. Bounded input-output maps.- 3. Residue calculus and diagonal expansion.- 4. J-unitary and J-inner operators.- 5. Time-varying Nevanlinna-Pick interpolation.- 6. An example.- References.- Dichotomy of systems and invertibility of linear ordinary differential operators.- 1. Introduction.- 2. Preliminaries.- 3. Invertibility of differential operators on the real line.- 4. Relations between operators on the full line and half line.- 5. Fredholm properties of differential operators on a half line.- 6. Fredholm properties of differential operators on a full line.- 7. Exponentially dichotomous operators.- 8. References.- Inertia theorems for block weighted shifts and applications.- 1. Introduction.- 2. One sided block weighted shifts.- 3. Dichotomies for left systems and two sided systems.- 4. Two sided block weighted shifts.- 5. Asymptotic inertia.- 6. References.- Interpolation for upper triangular operators.- 1. Introduction.- 2. Preliminaries.- 3. Colligations & characteristic functions.- 4. Towards interpolation.- 5. Explicit formulas for ?.- 6. Admissibility and more on general interpolation.- 7. Nevanlinna-Pick interpolation.- 8. Caratheodory-Fejer interpolation.- 9. Mixed interpolation problems.- 10. Examples.- 11. Block Toeplitz & some implications.- 12. Varying coordinate spaces.- 13. References.- Minimality and realization of discrete time-varying systems.- 1. Preliminaries.- 2. Observability and reachability.- 3. Minimality for time-varying systems.- 4. Proofs of the minimality theorems.- 5. Realizations of infinite lower triangular matrices.- 6. The class of systems with constant state space dimension.- 7. Minimality and realization for periodical systems.- References.
This book describes the properties of stochastic models and develops the applied mathematics of stochastic point processes. It is useful to students and research workers in probability and statistics, and also to research workers wishing to apply stochastic point processes.
"Et moi, ..., si j'avais su comment en revenir, je n'y serais point allé." (Jules Verne) "One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled 'discarded nonsense'." (Eric T. Bell) "The series is divergent; therefore we may be able to do something with it." (O. Heaviside) Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the Bell quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
The concept of conditional specification of distributions is not new but, except in normal families, it has not been well developed in the literature. Computational difficulties undoubtedly hindered or discouraged developments in this direction. However, such roadblocks are of diminished importance today. Questions of compatibility of conditional and marginal specifications of distributions are of fundamental importance in modeling scenarios. Models with conditionals in exponential families are particularly tractable and provide useful models in a broad variety of settings.
Covering CUSUMs from an application-oriented viewpoint, while also providing the essential theoretical underpinning, this is an accessible guide for anyone with a basic statistical training. The text is aimed at quality practitioners, teachers and students of quality methodologies, and people interested in analysis of time-ordered data. Further support is available from a Web site containing CUSUM software and data sets.
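The core idea behind CUSUM charts is simple enough to sketch: accumulate deviations from a target value and raise a signal when the cumulative sum crosses a decision interval. A minimal Python version of the standard tabular CUSUM; the data, target, reference value k, and decision interval h are illustrative choices, not taken from the book or its Web site:

```python
def tabular_cusum(data, target, k, h):
    """One-sided upper and lower CUSUMs.

    target: in-control process mean
    k: reference value (slack), often half the mean shift to detect
    h: decision interval; a signal is raised when a CUSUM exceeds h
    """
    s_hi = s_lo = 0.0
    signals = []
    for i, x in enumerate(data):
        s_hi = max(0.0, s_hi + (x - target) - k)  # detects upward shifts
        s_lo = max(0.0, s_lo + (target - x) - k)  # detects downward shifts
        if s_hi > h or s_lo > h:
            signals.append(i)
    return signals

# In-control values around 10, followed by a sustained upward shift
data = [10.1, 9.8, 10.0, 10.2, 9.9, 11.0, 11.2, 10.9, 11.1, 11.3]
print(tabular_cusum(data, target=10.0, k=0.5, h=2.0))
```

The choice of k and h trades off false alarms against detection delay, which is the kind of design question the book's theoretical underpinning addresses.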
High dimensional probability, in the sense that encompasses the topics represented in this volume, began about thirty years ago with research in two related areas: limit theorems for sums of independent Banach space valued random vectors and general Gaussian processes. An important feature in these past research studies has been the fact that they highlighted the essential probabilistic nature of the problems considered. In part, this was because, by working on a general Banach space, one had to discard the extra, and often extraneous, structure imposed by random variables taking values in a Euclidean space, or by processes being indexed by sets in R or R^d. Doing this led to striking advances, particularly in Gaussian process theory. It also led to the creation or introduction of powerful new tools, such as randomization, decoupling, moment and exponential inequalities, chaining, isoperimetry and concentration of measure, which apply to areas well beyond those for which they were created. The general theory of empirical processes, with its vast applications in statistics, the study of local times of Markov processes, certain problems in harmonic analysis, and the general theory of stochastic processes are just several of the broad areas in which Gaussian process techniques and techniques from probability in Banach spaces have made a substantial impact. Parallel to this work on probability in Banach spaces, classical probability and empirical process theory were enriched by the development of powerful results in strong approximations.
The first edition was released in 1996 and has sold close to 2200 copies. This second edition provides an up-to-date, comprehensive treatment of multidimensional scaling (MDS), a statistical technique used to analyze the structure of similarity or dissimilarity data in multidimensional space. The authors have added three chapters and exercise sets. The text is being moved from SSS to SSPP. The book is suitable for courses in statistics for the social or managerial sciences as well as for advanced courses on MDS. All the mathematics required for more advanced topics is developed systematically in the text.
This book offers a straightforward introduction to the mathematical theory of probability. It presents the central results and techniques of the subject in a complete and self-contained account. As a result, the emphasis is on giving results in simple forms with clear proofs, and on eschewing the more powerful forms of theorems which require technically involved proofs. Throughout, there is a wide variety of exercises to illustrate and develop the ideas in the text.
This book is devoted to Corrado Gini, father of the Italian statistical school. It celebrates the 50th anniversary of his death by bearing witness to the continuing extraordinary scientific relevance of his interdisciplinary interests. The book comprises a selection of the papers presented at the conference of the Italian Statistical Society, Statistics and Demography - the Legacy of Corrado Gini, held in Treviso in September 2015. The work covers many topics linked to Gini's scientific legacy, ranging from the theory of statistical inference to multivariate statistical analysis, demography and sociology. In this volume, readers will find many interesting contributions on entropy measures, permutation procedures for the heterogeneity test, robust estimation of skew-normal parameters, the S-weighted estimator, measures of multidimensional performance using Gini's delta, small-sample confidence intervals for Gini's gamma index, Bayesian estimation of the Gini-Simpson index, spatial residential patterns of selected foreign groups, minority segregation processes, dynamic time warping to study cruise tourism, and financial stress spillovers. This book will appeal to all statisticians, demographers, economists, and sociologists interested in the field.
Kiyosi Ito, the founder of stochastic calculus, is one of the few central figures of twentieth-century mathematics who reshaped the mathematical world. Today stochastic calculus is a central research field with applications in several other disciplines, for example physics, engineering, biology, economics and finance. The Abel Symposium 2005 was organized as a tribute to the work of Kiyosi Ito on the occasion of his 90th birthday. Distinguished researchers from all over the world were invited to present the newest developments within the exciting and fast-growing field of stochastic analysis. The present volume combines both papers from the invited speakers and contributions by the presenting lecturers. A special feature is the Memoirs that Kiyosi Ito wrote for this occasion. These are valuable pages for both young and established researchers in the field.
Harmonic analysis and probability have long enjoyed a mutually beneficial relationship that has been rich and fruitful. This monograph, aimed at researchers and students in these fields, explores several aspects of this relationship. The primary focus of the text is the nontangential maximal function and the area function of a harmonic function and their probabilistic analogues in martingale theory. The text first gives the requisite background material from harmonic analysis and discusses known results concerning the nontangential maximal function and area function, as well as the central and essential role these have played in the development of the field. The book next discusses further refinements of traditional results: among these are sharp good-lambda inequalities and laws of the iterated logarithm involving nontangential maximal functions and area functions. Many applications of these results are given. Throughout, the constant interplay between probability and harmonic analysis is emphasized and explained. The text contains some new and many recent results combined in a coherent presentation.
This book is in two volumes, and is intended as a text for introductory courses in probability and statistics at the second or third year university level. It emphasizes applications and logical principles rather than mathematical theory. A good background in freshman calculus is sufficient for most of the material presented. Several starred sections have been included as supplementary material. Nearly 900 problems and exercises of varying difficulty are given, and Appendix A contains answers to about one-third of them. The first volume (Chapters 1-8) deals with probability models and with mathematical methods for describing and manipulating them. It is similar in content and organization to the 1979 edition. Some sections have been rewritten and expanded, for example the discussions of independent random variables and conditional probability. Many new exercises have been added. In the second volume (Chapters 9-16), probability models are used as the basis for the analysis and interpretation of data. This material has been revised extensively. Chapters 9 and 10 describe the use of the likelihood function in estimation problems, as in the 1979 edition. Chapter 11 then discusses frequency properties of estimation procedures, and introduces coverage probability and confidence intervals. Chapter 12 describes tests of significance, with applications primarily to frequency data. The likelihood ratio statistic is used to unify the material on testing, and connect it with earlier material on estimation.
By far the best-selling introduction to statistics for students in the behavioral and social sciences, this text continues to offer straightforward instruction, accuracy, built-in learning aids, and real-world examples. The goal of STATISTICS FOR THE BEHAVIORAL SCIENCES, International Edition is to not only teach the methods of statistics, but also to convey the basic principles of objectivity and logic that are essential for science and valuable in everyday life. Authors Frederick Gravetter and Larry Wallnau help students understand statistical procedures through a conceptual context that explains why the procedures were developed and when they should be used. Students have numerous opportunities to practice statistical techniques through Learning Checks, examples, step-by-step Demonstrations, and problems. A strong ancillary package includes PowerLecture (TM), which contains lecture slides, JoinIn (TM) Student Response System content, and a computerized test bank; Enhanced WebAssign, a complete and easy-to-use homework management system; WebTutor (TM); an Instructor's Manual/TestBank, plus other online and print resources.