This book presents Markov and quantum processes as two sides of a coin called generated stochastic processes. It deals with quantum processes as reversible stochastic processes generated by one-step unitary operators, while Markov processes are irreversible stochastic processes generated by one-step stochastic operators. The characteristic features of quantum processes are oscillations, interference, many stationary states in bounded systems, and possible asymptotic stationary scattering states in open systems, while the characteristic feature of Markov processes is relaxation to a single stationary state. Quantum processes apply to systems where all variables that control reversibility are taken as relevant variables, while Markov processes emerge when some of those variables cannot be followed and are thus irrelevant for the dynamic description. Their absence renders the dynamics irreversible. A further aim is to demonstrate that almost any subdiscipline of theoretical physics can conceptually be put into the context of generated stochastic processes. Classical mechanics and classical field theory are deterministic processes which emerge when fluctuations in relevant variables are negligible. Quantum mechanics and quantum field theory consider genuine quantum processes. Equilibrium and non-equilibrium statistics apply to the regime where relaxing Markov processes emerge from quantum processes by omission of a large number of uncontrollable variables. Systems with many variables often self-organize in such a way that only a few slow variables can serve as relevant variables. Symmetries and topological classes are essential in identifying such relevant variables. The third aim of this book is to provide conceptually general methods of solution which can serve as starting points to find relevant variables and to apply best-practice approximation methods. Such methods are available through generating functionals.
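The contrast between the two kinds of one-step generators can be made concrete in a few lines. The sketch below is my own illustration, not code from the book: a 2-state stochastic matrix relaxes every initial distribution to its single stationary state, while a 2-state unitary (here a real rotation) conserves the norm and oscillates indefinitely.

```python
import math

def matvec(M, v):
    # multiply a 2x2 matrix by a 2-vector
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# Markov process: a one-step stochastic operator (columns sum to 1).
# Repeated application relaxes any initial distribution to the
# unique stationary state, here (2/3, 1/3).
P = [[0.9, 0.2],
     [0.1, 0.8]]
p = [1.0, 0.0]
for _ in range(200):
    p = matvec(P, p)

# Quantum process: a one-step unitary operator (a real rotation).
# The norm is conserved; the occupation probability of state 0
# oscillates instead of relaxing, returning after a full period.
theta = math.pi / 8
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
a = [1.0, 0.0]
occupation = []
for _ in range(16):          # 16 steps of pi/8 = one full rotation
    a = matvec(U, a)
    occupation.append(a[0] ** 2)
```

The stochastic iteration forgets its initial condition (relaxation), while the unitary one dips to zero occupation and comes back (oscillation), which is exactly the qualitative distinction described above.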
The potential reader is a graduate student who has already taken a course in quantum theory and equilibrium statistical physics, including the mathematics of spectral analysis (eigenvalues, eigenvectors, Fourier and Laplace transformation). The reader should be open to a unifying look at several topics.
This book was written to serve as a graduate-level textbook for special topics classes in mathematics, statistics, and economics, to introduce these topics to other researchers, and for use in short courses. It is an introduction to the theory of majorization and related notions, and contains detailed material on economic applications of majorization and the Lorenz order, investigating the theoretical aspects of these two interrelated orderings. Revising and expanding on an earlier monograph, Majorization and the Lorenz Order: A Brief Introduction, the authors provide a straightforward development and explanation of majorization concepts, address the historical development of the topics, and provide up-to-date coverage of families of Lorenz curves. The exposition of multivariate Lorenz orderings sets it apart from existing treatments of these topics. Mathematicians, theoretical statisticians, economists, and other social scientists who already recognize the utility of the Lorenz order in income inequality contexts will find the book useful for its sound development of relevant concepts rigorously linked to both the majorization literature and the even more extensive body of research on economic applications. Barry C. Arnold, PhD, is Distinguished Professor in the Statistics Department at the University of California, Riverside. He is a Fellow of the American Statistical Association, the American Association for the Advancement of Science, and the Institute of Mathematical Statistics, and is an elected member of the International Statistical Institute. He is the author of more than two hundred publications and eight books. Jose Maria Sarabia, PhD, is Professor of Statistics and Quantitative Methods in Business and Economics in the Department of Economics at the University of Cantabria, Spain.
He is the author of more than one hundred and fifty publications and ten books, and is an associate editor of several journals, including TEST, Communications in Statistics, and Journal of Statistical Distributions and Applications.
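For readers new to the two orderings, the core definitions are easy to state computationally. The sketch below is my own illustration of the standard textbook definitions, not code from the book: a vector x majorizes y when, with equal totals, the descending partial sums of x dominate those of y; equivalently, x's Lorenz curve lies below y's.

```python
def lorenz_points(incomes):
    """Cumulative income shares at each population fraction (ascending order)."""
    xs = sorted(incomes)
    total = sum(xs)
    cum, pts = 0.0, [0.0]
    for x in xs:
        cum += x
        pts.append(cum / total)
    return pts

def majorizes(x, y, tol=1e-12):
    """True if x majorizes y: equal totals, and the descending
    partial sums of x dominate those of y."""
    if abs(sum(x) - sum(y)) > tol:
        return False
    xs, ys = sorted(x, reverse=True), sorted(y, reverse=True)
    cx = cy = 0.0
    for a, b in zip(xs, ys):
        cx += a
        cy += b
        if cx < cy - tol:
            return False
    return True

unequal = [0.0, 0.0, 10.0]   # all income held by one person
equal = [10.0 / 3] * 3       # perfectly egalitarian split
# The more unequal vector majorizes the egalitarian one; its Lorenz
# curve [0, 0, 0, 1] lies below the egalitarian [0, 1/3, 2/3, 1].
```

The equivalence between majorization and Lorenz dominance (for equal totals) is the bridge between the two bodies of literature the book links.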
This book includes the texts of the survey lectures given by plenary speakers at the 11th International ISAAC Congress held in Växjö, Sweden, on 14-18 August 2017. It is the purpose of ISAAC to promote analysis, its applications, and its interaction with computation. Analysis is understood here in the broad sense of the word, including differential equations, integral equations, functional analysis, and function theory. With this objective, ISAAC organizes international Congresses for the presentation and discussion of research on analysis. The plenary lectures in the present volume, authored by eminent specialists, are devoted to some exciting recent developments, with topics including: local solvability for subprincipal type operators; fractional-order Laplacians; degenerate complex vector fields in the plane; lower bounds for pseudo-differential operators; a survey on Morrey spaces; localization operators in Signal Theory and Quantum Mechanics. Thanks to the accessible style used, readers only need a basic command of Calculus. This book will appeal to scientists, teachers, and graduate students in Mathematics, in particular Mathematical Analysis, Probability and Statistics, Numerical Analysis and Mathematical Physics.
Probabilistic models have much to offer to philosophy. We continually receive information from a variety of sources: from our senses, from witnesses, from scientific instruments. When considering whether we should believe this information, we assess whether the sources are independent, how reliable they are, and how plausible and coherent the information is. Bovens and Hartmann provide a systematic Bayesian account of these features of reasoning. Simple Bayesian networks allow us to model alternative assumptions about the nature of the information sources. Measurement of the coherence of information is a controversial matter: arguably, the more coherent a set of information is, the more confident we may be that its content is true, other things being equal. The authors offer a new treatment of coherence which respects this claim and shows its relevance to scientific theory choice. Bovens and Hartmann apply this methodology to a wide range of much-discussed issues regarding evidence, testimony, scientific theories and voting. "Bayesian Epistemology" is for anyone working on probabilistic methods in philosophy, and has broad implications for many other disciplines.
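The flavor of the Bayesian machinery can be shown with the simplest possible witness model. The sketch below is my own toy version, far cruder than the Bayesian-network models Bovens and Hartmann actually develop: each witness independently reports correctly with a fixed probability, and Bayes' theorem combines their concurring reports.

```python
def posterior(prior, reliabilities):
    """Posterior for hypothesis H after each witness independently
    reports 'H is true'; each witness reports correctly (whatever
    the truth) with the given probability."""
    num = prior          # accumulates P(reports | H) * P(H)
    den = 1.0 - prior    # accumulates P(reports | not H) * P(not H)
    for q in reliabilities:
        num *= q
        den *= (1.0 - q)
    return num / (num + den)

# One 80%-reliable witness raises a 10% prior to about 31%;
# a second independent 80%-reliable report raises it to 64%.
p1 = posterior(0.1, [0.8])
p2 = posterior(0.1, [0.8, 0.8])
```

Even this crude model exhibits the qualitative point above: agreement among independent, partially reliable sources can drive the posterior well past what any single source warrants.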
This thesis develops a systematic, data-based dynamic modeling framework for industrial processes in keeping with the slowness principle. Using said framework as a point of departure, it then proposes novel strategies for dealing with control monitoring and quality prediction problems in industrial production contexts. The thesis reveals the slowly varying nature of industrial production processes under feedback control, and integrates it with process data analytics to offer powerful prior knowledge that gives rise to statistical methods tailored to industrial data. It addresses several issues of immediate interest in industrial practice, including process monitoring, control performance assessment and diagnosis, monitoring system design, and product quality prediction. In particular, it proposes a holistic and pragmatic design framework for industrial monitoring systems, which delivers effective elimination of false alarms, as well as intelligent self-running by fully utilizing the information underlying the data. One of the strengths of this thesis is its integration of insights from statistics, machine learning, control theory and engineering to provide a new scheme for industrial process modeling in the era of big data.
Statistical Techniques for Transportation Engineering is written with a systematic approach in mind and covers a full range of data analysis topics, from the introductory level (basic probability, measures of dispersion, random variables, discrete and continuous distributions) through more generally used techniques (common statistical distributions, hypothesis testing), to advanced analysis and statistical modeling techniques (regression, ANOVA, and time series). The book also provides worked-out examples and solved problems for a wide variety of transportation engineering challenges.
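As a taste of the hypothesis-testing material, here is a small worked example with made-up numbers (the speeds, the limit, and the assumed known standard deviation are all my own illustrative choices, not data from the book): a one-sample z-test asking whether mean spot speed exceeds a posted 60 km/h.

```python
import math

speeds = [62, 58, 65, 61, 59, 63, 60, 64, 62, 66]  # hypothetical spot speeds, km/h
mu0 = 60.0      # null hypothesis H0: mean speed = 60 km/h
sigma = 3.0     # population std dev, assumed known for a z-test

n = len(speeds)
xbar = sum(speeds) / n                       # sample mean: 62.0
z = (xbar - mu0) / (sigma / math.sqrt(n))    # z-statistic: ~2.11

# One-sided p-value from the standard normal survival function
p_value = 0.5 * math.erfc(z / math.sqrt(2))
reject = p_value < 0.05                      # reject H0 at the 5% level
```

Here the p-value comes out near 0.017, so at the 5% level the test rejects the null that the mean speed equals the limit.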
This book provides a general discussion beneficial to librarians and library school students, and demonstrates the steps of the research process, decisions made in the selection of a statistical technique, how to program a computer to perform number crunching, how to compute those statistical techniques appearing most frequently in the literature of library and information science, and examples from the literature of the uses of different statistical techniques. The book accomplishes the following objectives: to provide an overview of the research process and to show where statistics fit in; to identify journals in library and information science most likely to publish research articles; to identify reference tools that provide access to the research literature; to show how microcomputers can be programmed to engage in number crunching; to introduce basic statistical concepts and terminology; to present basic statistical procedures that appear most frequently in the literature of library and information science and that have application to library decision making; to discuss library decision support systems and show the types of statistical techniques they can perform; and to summarize the major decisions that researchers must address in deciding which statistical techniques to employ.
This book presents state-of-the-art probabilistic methods for the reliability analysis and design of engineering products and processes. It seeks to facilitate practical application of probabilistic analysis and design by providing an authoritative, in-depth, and practical description of what probabilistic analysis and design is and how it can be implemented. The text is packed with many practical engineering examples (e.g., electric power transmission systems, aircraft power generating systems, and mechanical transmission systems) and exercise problems. It is an up-to-date, fully illustrated reference suitable for both undergraduate and graduate engineering students, researchers, and professional engineers who are interested in exploring the fundamentals, implementation, and applications of probabilistic analysis and design methods.
Modeling Uncertainty: An Examination of Stochastic Theory, Methods, and Applications, is a volume undertaken by the friends and colleagues of Sid Yakowitz in his honor. Fifty internationally known scholars have collectively contributed 30 papers on modeling uncertainty to this volume. Each of these papers was carefully reviewed and in the majority of cases the original submission was revised before being accepted for publication in the book. The papers cover a great variety of topics in probability, statistics, economics, stochastic optimization, control theory, regression analysis, simulation, stochastic programming, Markov decision processes, applications in the HIV context, and others. There are papers with a theoretical emphasis and others that focus on applications. A number of papers survey the work in a particular area and in a few papers the authors present their personal view of a topic. It is a book with a considerable number of expository articles, which are accessible to a nonexpert - a graduate student in mathematics, statistics, engineering, and economics departments, or just anyone with some mathematical background who is interested in a preliminary exposition of a particular topic. Many of the papers present the state of the art of a specific area or represent original contributions which advance the present state of knowledge. In sum, it is a book of considerable interest to a broad range of academic researchers and students of stochastic systems.
With a foreword by Major-General Nico Geerts, Commander Netherlands Defence Academy, Breda, The Netherlands. International conflict resolution increasingly involves the use of non-military power and non-kinetic capabilities alongside military capabilities in the face of hybrid threats. In this book, counter-measures to those threats are addressed by academics with both practical and theoretical experience and knowledge, providing strategic and operational insights into non-kinetic conflict resolution and on the use of power to influence, affect, deter or coerce states and non-state actors. This volume in the NL ARMS series deals with the non-kinetic capabilities to address international crises and conflicts and as always views matters from a global perspective. Included are chapters on the promise, practice and challenges of non-kinetic instruments of power, the instrumentality of soft power, information as a power instrument and manoeuvring in the information environment, Russia's use of deception and misinformation in conflict, applying counter-marketing techniques to fight ISIL, using statistics to profile terrorists, and employing tools such as Actor and Audience Analysis. Such diverse subjects as lawfare, the Law of Armed Conflict rules for non-kinetic cyber attacks, navigation warfare, GPS-spoofing, maritime interception operations, and finally, as a prerequisite, innovative ways for intelligence collection in UN Peacekeeping in Mali come up for discussion. The book will provide both professionals such as (foreign) policy makers and those active in the military services, academics at a master level and those with an interest in military law and the law of armed conflict with useful and up-to-date insights into the wide range of subjects that are contained within it. Paul A.L. Ducheine and Frans P.B. Osinga are General Officers and full professors at the Faculty of Military Sciences of the Netherlands Defence Academy in Breda, The Netherlands.
Specific to this volume in the Series:
* Written by academics with both practical and theoretical experience
* Addresses counter-measures to hybrid crises
* Offers both strategic and operational insights into non-kinetic conflict resolution
Markov process theory is basically an extension of ordinary calculus to accommodate functions whose time evolutions are not entirely deterministic. It is a subject that is becoming increasingly important for many fields of science. This book develops the single-variable theory of both continuous and jump Markov processes in a way that should appeal especially to physicists and chemists at the senior and graduate level.
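As a flavor of the jump-process side, the following sketch (my own example in the spirit of the subject, not code from the book) simulates first-order decay N → N−1 as a jump Markov process: the waiting time to the next jump is exponentially distributed with rate k·N, so averaged over runs the survivor count tracks the deterministic n0·e^(−kt).

```python
import random

def simulate_decay(n0, k, t_max, seed=0):
    """Jump Markov process for first-order decay: starting from n0
    particles, each jump removes one particle after an exponential
    waiting time with rate k * (current count)."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    while n > 0:
        t += rng.expovariate(k * n)  # time to next jump
        if t > t_max:
            break                    # observation window ended
        n -= 1
    return n

# Averaged over runs, survivors at t = 1/k approach n0 / e ~ 368.
runs = [simulate_decay(1000, 1.0, 1.0, seed=s) for s in range(50)]
mean_survivors = sum(runs) / len(runs)
```

A single run fluctuates around the deterministic curve; the fluctuations are the stochastic content that ordinary calculus, as noted above, cannot accommodate.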
The monograph compares two approaches that describe the statistical stability phenomenon: one proposed by probability theory, which ignores violations of statistical stability, and another proposed by the theory of hyper-random phenomena, which takes these violations into account. There are five parts. The first describes the phenomenon of statistical stability. The second outlines the mathematical foundations of probability theory. The third develops methods for detecting violations of statistical stability and presents the results of experimental research on actual processes of different physical nature that demonstrate violations of statistical stability over broad observation intervals. The fourth part outlines the mathematical foundations of the theory of hyper-random phenomena. The fifth part discusses the problem of how to provide an adequate description of the world. The monograph should be of interest to a wide readership: from first-year university students majoring in physics, engineering, and mathematics to engineers, post-graduate students, and scientists carrying out research on the statistical laws of natural physical phenomena, developing and using statistical methods for high-precision measurement, prediction, and signal processing over broad observation intervals. To read the book, it is sufficient to be familiar with a standard first university course on mathematics.
This is a comprehensive survey of research on the parabolic Anderson model - the heat equation with random potential, or the random walk in random potential - covering the years 1990-2015. The investigation of this model requires a combination of tools from probability (e.g., large deviations, extreme-value theory) and analysis (e.g., spectral theory for the Laplace operator with potential, variational analysis). We explain the background, the applications, the questions and the connections with other models and formulate the most relevant results on the long-time behavior of the solution, like quenched and annealed asymptotics for the total mass, intermittency, confinement and concentration properties and mass flow. Furthermore, we explain the most successful proof methods and give a list of open research problems. Proofs are not detailed, but concisely outlined and commented; the formulations of some theorems are slightly simplified for better comprehension.
This book collects research papers on the philosophical foundations of probability, causality, spacetime and quantum theory. The papers are related to talks presented in six subsequent workshops organized by The Budapest-Krakow Research Group on Probability, Causality and Determinism. Coverage consists of three parts. Part I focuses on the notion of probability from a general philosophical and formal epistemological perspective. Part II applies probabilistic considerations to address causal questions in the foundations of quantum mechanics. Part III investigates the question of indeterminism in spacetime theories. It also explores some related questions, such as decidability and observation. The contributing authors are all philosophers of science with a strong background in mathematics or physics. They believe that paying attention to the finer formal details often helps avoid pitfalls that exacerbate the philosophical problems at the center of contemporary research. The papers presented here help make explicit the mathematical-structural assumptions that underlie key philosophical argumentations. This formally rigorous and conceptually precise approach will appeal to researchers and philosophers as well as mathematicians and statisticians.
This book deals with the number-theoretic properties of almost all real numbers. It brings together many different types of result never covered within the same volume before, thus showing interactions and common ideas between different branches of the subject. It provides an indispensable compendium of basic results, important theorems and open problems. Starting from the classical results of Borel, Khintchine and Weyl, normal numbers, Diophantine approximation and uniform distribution are all discussed. Questions are generalized to higher dimensions and various non-periodic problems are also considered (for example restricting approximation to fractions with prime numerator and denominator). Finally, the dimensions of some of the exceptional sets of measure zero are considered.
This book presents data privacy protection methods, which have been extensively applied in our current era of big data. However, research into big data privacy is still in its infancy. Given the fact that existing protection methods can result in low data utility and unbalanced trade-offs, personalized privacy protection has become a rapidly expanding research topic. In this book, the authors explore emerging threats and existing privacy protection methods, and discuss in detail both the advantages and disadvantages of personalized privacy protection. Traditional methods, such as differential privacy and cryptography, are discussed using a comparative and intersectional approach, and are contrasted with emerging methods like federated learning and generative adversarial nets. The advances discussed cover various applications, e.g. cyber-physical systems, social networks, and location-based services. Given its scope, the book is of interest to scientists, policy-makers, researchers, and postgraduates alike.
John E. Freund's Mathematical Statistics with Applications, Eighth Edition, provides a calculus-based introduction to the theory and application of statistics, based on comprehensive coverage that reflects the latest in statistical thinking, the teaching of statistics, and current practices.
This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and government organizations.
This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).
This book shows how to develop efficient quantitative methods to characterize neural data and extract information that reveals underlying dynamics and neurophysiological mechanisms. Written by active experts in the field, it contains an exchange of innovative ideas among researchers at both computational and experimental ends, as well as those at the interface. Authors discuss research challenges and new directions in emerging areas with two goals in mind: to collect recent advances in statistics, signal processing, modeling, and control methods in neuroscience; and to welcome and foster innovative or cross-disciplinary ideas along this line of research and discuss important research issues in neural data analysis. Making use of both tutorial and review materials, this book is written for neural, electrical, and biomedical engineers; computational neuroscientists; statisticians; computer scientists; and clinical engineers.
Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:
* Over 160 practical and interesting real-world examples from a variety of fields, including engineering, mathematics, and statistics, to help readers become proficient in theoretical problem solving
* More than 430 unique exercises with select solutions
* Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis
Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
Intelligent Computing for Interactive System Design provides a comprehensive resource on what has become the dominant paradigm in designing novel interaction methods, involving gestures, speech, text, touch and brain-controlled interaction, embedded in innovative and emerging human-computer interfaces. These interfaces support ubiquitous interaction with applications and services running on smartphones, wearables, in-vehicle systems, virtual and augmented reality, robotic systems, the Internet of Things (IoT), and many other domains that are now highly competitive, both in commercial and in research contexts. This book presents the crucial theoretical foundations needed by any student, researcher, or practitioner working on novel interface design, with chapters on statistical methods, digital signal processing (DSP), and machine learning (ML). These foundations are followed by chapters that discuss case studies on smart cities, brain-computer interfaces, probabilistic mobile text entry, secure gestures, personal context from mobile phones, adaptive touch interfaces, and automotive user interfaces. The case studies chapters also highlight an in-depth look at the practical application of DSP and ML methods used for processing of touch, gesture, biometric, or embedded sensor inputs. A common theme throughout the case studies is ubiquitous support for humans in their daily professional or personal activities. In addition, the book provides walk-through examples of different DSP and ML techniques and their use in interactive systems. Common terms are defined, and information on practical resources is provided (e.g., software tools, data resources) for hands-on project work to develop and evaluate multimodal and multi-sensor systems. 
In a series of in-chapter commentary boxes, an expert on the legal and ethical issues explores the emergent deep concerns of the professional community, on how DSP and ML should be adopted and used in socially appropriate ways, to most effectively advance human performance during ubiquitous interaction with omnipresent computers. This carefully edited collection is written by international experts and pioneers in the fields of DSP and ML. It provides a textbook for students and a reference and technology roadmap for developers and professionals working on interaction design on emerging platforms.