This book introduces the fundamentals, models, emulators, and analyses of mem-elements in circuit theory, together with their applications. The book starts by reviewing the literature on mem-elements, their models, and their recent applications. It presents mathematical models, numerical results, circuit simulations, and experimental results for the double-loop hysteresis behavior of mem-elements. The authors introduce a generalized memristor model in the fractional-order domain under different inputs and different designs for emulator-based mem-elements, with circuit and experimental results. The basic concept of memristive relaxation oscillators in circuit theory is also covered. The reader will moreover find in this book information on memristor-based multi-level digital circuits, a memristor-based multi-level multiplier, and memcapacitor-based oscillators and synaptic circuits.
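For orientation, one widely cited baseline is the linear ion-drift memristor model of Strukov et al. (2008); this is standard background, not the generalized fractional-order model the book develops, but it already produces the pinched double-loop hysteresis mentioned above. With normalized state x = w/D in [0, 1]:

v(t) = \bigl( R_{\mathrm{on}}\, x(t) + R_{\mathrm{off}}\, (1 - x(t)) \bigr)\, i(t), \qquad \frac{dx}{dt} = \frac{\mu_v R_{\mathrm{on}}}{D^2}\, i(t).

Driven by a sinusoidal current, the v-i trajectory traces a pinched hysteresis loop whose lobes shrink as the drive frequency increases, the standard fingerprint of a memristive element.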
The articles in this collection are a sampling of some of the research presented during the conference "Stochastic Analysis and Related Topics", held in May 2015 at Purdue University in honor of the 60th birthday of Rodrigo Banuelos. A wide variety of topics in probability theory is covered in these proceedings, including heat kernel estimates, Malliavin calculus, rough-path differential equations, Lévy processes, Brownian motion on manifolds, and spin glasses.
During the last two decades, structural equation modelling (SEM) has emerged as a powerful multivariate data analysis tool in social science research settings, especially in the fields of sociology, psychology, and education. Social science researchers and students benefit greatly from acquiring knowledge and skills in SEM, since the methods can provide a bridge between the theoretical and empirical aspects of behavioural research. Ramlall explains in a rigorous, concise, and practical manner all the vital components embedded in SEM. Focusing on R and Stata to implement and perform various structural equation models, Ramlall examines the types, benefits, and drawbacks of SEM, delving into model specification and identification, fit evaluation, and path diagrams.
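For readers who want the equations behind the acronym, a structural equation model in the common LISREL-style notation (one standard formulation, not necessarily the notation Ramlall adopts) couples a structural model for latent variables with measurement models for their observed indicators:

\eta = B\eta + \Gamma\xi + \zeta, \qquad y = \Lambda_y\, \eta + \varepsilon, \qquad x = \Lambda_x\, \xi + \delta,

where \eta and \xi are the latent endogenous and exogenous variables, y and x their observed indicators, and \zeta, \varepsilon, \delta the error terms.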
This volume is a unique combination of papers that cover critical topics in biostatistics from academic, government, and industry perspectives. Its six sections cover Bayesian methods in biomedical research; diagnostic medicine and classification; innovative clinical trial design; modelling and data analysis; personalized medicine; and statistical genomics. The real-world applications are in clinical trials, diagnostic medicine, and genetics. The peer-reviewed contributions were solicited and selected from some 400 presentations at the annual meeting of the International Chinese Statistical Association (ICSA), held jointly with the International Society for Biopharmaceutical Statistics (ISBS). The conference was held in Bethesda in June 2013, and the material has been subsequently edited and expanded to cover the most recent developments.
The ideas of Fourier have made their way into every branch of mathematics and mathematical physics, from the theory of numbers to quantum mechanics. Fourier Series and Integrals focuses on the extraordinary power and flexibility of Fourier's basic series and integrals and on the astonishing variety of applications in which they are the chief tool. It presents a mathematical account of Fourier's ideas on the circle and the line, on finite commutative groups, and on a few important noncommutative groups. A wide variety of exercises are placed in nearly every section as an integral part of the text.
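For reference, the two central objects, in one common normalization (conventions differ between authors): on the circle,

\hat f(n) = \frac{1}{2\pi} \int_{-\pi}^{\pi} f(\theta)\, e^{-in\theta}\, d\theta, \qquad f(\theta) \sim \sum_{n=-\infty}^{\infty} \hat f(n)\, e^{in\theta},

and on the line,

\hat f(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx, \qquad f(x) = \int_{-\infty}^{\infty} \hat f(\xi)\, e^{2\pi i x \xi}\, d\xi.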
This text presents a wide-ranging and rigorous overview of nearest neighbor methods, one of the most important paradigms in machine learning. Now in one self-contained volume, this book systematically covers key statistical, probabilistic, combinatorial and geometric ideas for understanding, analyzing and developing nearest neighbor methods. Gerard Biau is a professor at Universite Pierre et Marie Curie (Paris). Luc Devroye is a professor at the School of Computer Science at McGill University (Montreal).
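As a concrete illustration of the paradigm (an illustrative sketch, not code from the book), a k-nearest-neighbor classifier predicts by majority vote among the k training points closest to a query:

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=5):
    """Classify x_query by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k smallest distances.
    nearest = np.argsort(dists)[:k]
    # Majority vote among the corresponding labels.
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy usage: two Gaussian blobs labeled 0 and 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(knn_predict(X, y, np.array([2.5, 2.5]), k=5))  # expected: 1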
The Handbook is a definitive reference source and teaching aid for econometricians. It examines models, estimation theory, data analysis and field applications in econometrics. Comprehensive surveys, written by experts, discuss recent developments at a level suitable for professional use by economists, econometricians, statisticians, and in advanced graduate econometrics courses. For more information on the Handbooks in Economics series, please see our home page on http://www.elsevier.nl/locate/hes
The Handbook of Mathematical Economics aims to provide a definitive source, reference, and teaching supplement for the field of mathematical economics. It surveys, as of the late 1970s, the state of the art of mathematical economics. This is a constantly developing field, and all authors were invited to review and appraise the current status and recent developments in their presentations. In addition to its use as a reference, it is intended that this Handbook will assist researchers and students working in one branch of mathematical economics to become acquainted with other branches of the field. Volume I deals with Mathematical Methods in Economics, including reviews of the concepts and techniques that have been most useful for the mathematical development of economic theory. Volume II elaborates on Mathematical Approaches to Microeconomic Theory, including consumer, producer, oligopoly, and duality theory, as well as Mathematical Approaches to Competitive Equilibrium, including such aspects of competitive equilibrium as existence, stability, uncertainty, the computation of equilibrium prices, and the core of an economy.
This book is a comprehensive introduction to simulation and modelling techniques and their application in the management of organisations. The book is rooted in a thorough understanding of systems theory as applied to organisations and focuses on how this theory applies to the econometric models used in managing them. The econometric models in this book employ linear and dynamic programming, graph theory, queuing theory, game theory, and related methods, and are presented and analysed in various fields of application, such as investment management, stock management, strategic decision making, management of production costs and the lifecycle costs of quality and non-quality products, and production quality management.
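To give a flavour of the linear-programming models mentioned above (a toy sketch, not an example from the book; the products and coefficients are invented), here is a two-product production-mix problem solved with SciPy:

from scipy.optimize import linprog

# Maximize profit 40*x1 + 30*x2 subject to machine-hour and labour limits.
# linprog minimizes, so the objective is negated.
c = [-40, -30]
A_ub = [[2, 1],   # machine hours per unit of each product
        [1, 2]]   # labour hours per unit of each product
b_ub = [100, 80]  # available machine and labour hours
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal production plan and the resulting profit

The optimum here is 40 units of the first product and 20 of the second, for a profit of 2200.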
This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cadiz, Spain, on June 11-16, 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics, and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers from around the globe, and contribute to the further development of the field.
This book fills an important gap in studies on D. D. Kosambi. For the first time, the mathematical work of Kosambi is described, collected, and presented in a manner that is accessible to non-mathematicians as well. A number of his papers in these areas, which are difficult to obtain, are made available here. In addition, there are essays by Kosambi that have not been published earlier, as well as some of his lesser-known works. Each of the twenty-four papers is prefaced by a commentary on the significance of the work and, where possible, extracts from technical reviews by other mathematicians.
This volume covers an area of statistics dealing with complex problems in the production of goods and services, maintenance and repair, and management and operations. The opening chapter is by W. Edwards Deming, a pioneer in statistical quality control, who was involved in the quality control movement in Japan and helped the country in its rapid industrial development. He gives a program to keep a country on an ascending path of industrial development. Areas covered in the further 23 chapters of the work include the reliability of hardware and process control software; the concepts and theory of reliability and the statistical inference problems arising therein; and aspects of quality control of manufactured goods.
This work illustrates research conducted over a ten-year timespan and addresses a fundamental issue in reliability theory: the field still appears empirically disorganized, and the book suggests employing a deductive basis in order to develop reliability as a science. The study is in line with the foundational work of Gnedenko. Boris Vladimirovich Gnedenko (1912-1995) was a Soviet mathematician who made significant contributions in various scientific areas. His name is especially associated with studies of dependability, for which he is often recognized as the 'father' of reliability theory. In the last few decades this area has expanded in new directions, such as safety, security, and risk analysis, yet the book 'Mathematical Methods in Reliability Theory', written by Gnedenko with Alexander Soloviev and Yuri Belyaev, still towers as a pillar of the field's configuration and identity. The present book proceeds in the direction opened by the Russian authors; in particular, it identifies different trends in hazard rate functions by means of deductive logic and demonstrations. Further, it arrives at multiple results by means of the entropy function, an original mathematical tool in the reliability domain. As such, it will greatly benefit all specialists in the field who are interested in unconventional solutions.
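For readers new to the area, the two quantities the blurb refers to have conventional definitions (these are the standard textbook forms, not the book's specific results). For a lifetime with density f and distribution function F, the hazard rate and the differential entropy are

h(t) = \frac{f(t)}{1 - F(t)}, \qquad H(f) = -\int_0^{\infty} f(t)\, \ln f(t)\, dt.

A constant hazard rate characterizes the exponential lifetime; increasing and decreasing hazard rates describe aging and burn-in regimes, respectively.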
This is the revised and augmented edition of a now classic book which is an introduction to sub-Markovian kernels on general measurable spaces and their associated homogeneous Markov chains. The first part, an expository text on the foundations of the subject, is intended for post-graduate students. A study of potential theory, the basic classification of chains according to their asymptotic behaviour, and the celebrated Chacon-Ornstein theorem are examined in detail.
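Two of the objects named above, in standard notation (a conventional statement, independent of the book's own presentation): the potential kernel of a chain with transition kernel P,

G(x, A) = \sum_{n=0}^{\infty} P^n(x, A),

and the Chacon-Ornstein ratio ergodic theorem, which asserts that for a positive conservative contraction T of L^1 and f, g \in L^1 with g \ge 0,

\lim_{n \to \infty} \frac{\sum_{k=0}^{n} T^k f}{\sum_{k=0}^{n} T^k g} \quad \text{exists almost everywhere on } \Bigl\{ \sum_{k=0}^{\infty} T^k g > 0 \Bigr\}.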
Contents: 1. Lossless Coding; 2. Universal Coding on Finite Alphabets; 3. Universal Coding on Infinite Alphabets; 4. Model Order Estimation; Notation; Index.
R is open source statistical computing software. Since the R core group was formed in 1997, R has been extended by a very large number of packages with extensive documentation, along with examples freely available on the internet. It offers a large number of statistical and numerical methods, and graphical tools for visualization of extraordinarily high quality. R was recently ranked in 14th place by the Transparent Language Popularity Index and 6th as a scripting language, after PHP, Python, and Perl. The book is designed so that it can be used right away by novices while appealing to experienced users as well. Each article begins with a data example that can be downloaded directly from the R website. Data analysis questions are articulated following the presentation of the data. The necessary R commands are spelled out and executed, and the output is presented and discussed. Other examples of data sets with a different flavor and a different set of commands, but following the theme of the article, are presented as well. Each chapter presents a hands-on experience. R has superb graphical outlays, and the book brings out the essentials in this arena. The end user can benefit immensely by applying the graphics to enhance research findings. The core statistical methodologies, such as regression, survival analysis, and discrete data analysis, are all covered.
Articles from many of the main contributors to recent progress in stochastic analysis are included in this volume, which provides a snapshot of the current state of the area and its ongoing developments. It constitutes the proceedings of the conference on "Stochastic Analysis and Applications" held at the University of Oxford and the Oxford-Man Institute on 23-27 September 2013. The conference honored the 60th birthday of Professor Terry Lyons FLSW FRSE FRS, Wallis Professor of Mathematics, University of Oxford. Terry Lyons is one of the leaders in the field of stochastic analysis; his introduction of the notion of rough paths has revolutionized the field, both in theory and in practice. Stochastic analysis is the branch of mathematics that deals with the analysis of dynamical systems affected by noise. It emerged as a core area of mathematics in the late 20th century and has subsequently developed into an important theory with a wide range of powerful and novel tools, and with impressive applications within and beyond mathematics. Many systems are profoundly affected by stochastic fluctuations, and it is not surprising that the array of applications of stochastic analysis is vast and touches on many aspects of life. The present volume is intended for researchers and Ph.D. students in stochastic analysis and its applications, stochastic optimization and financial mathematics, as well as financial engineers and quantitative analysts.
This book focuses on solving integral equations with difference kernels on finite intervals. The corresponding problem on the semiaxis was previously solved by N. Wiener and E. Hopf, and by M. G. Krein. The problem on finite intervals, though significantly more difficult, may be solved using our method of operator identities. This method is also actively employed in inverse spectral problems, operator factorization, and nonlinear integral equations. Applications of the obtained results to problems of optimal synthesis, light scattering, diffraction, and hydrodynamics are discussed in this book, which also describes how the theory of operators with difference kernels is applied to stable processes and used to solve the famous M. Kac problems on stable processes. In this second edition these results are extensively generalized to cover all Lévy processes. We present a convolution expression for the well-known Itô formula of the generator operator, an expression that has proven to be fruitful. Furthermore, we have added a new chapter on triangular representation, which is closely connected with the previous results and includes a new important class of operators with non-trivial invariant subspaces. Numerous formulations and proofs have been improved, and the bibliography has been updated to reflect more recent additions to the body of literature.
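The generic object of study here, written in one standard form (the book's own normalization may differ), is an equation on a finite interval [0, \omega] whose kernel depends only on the difference of its arguments:

Sf(x) := f(x) + \int_0^{\omega} k(x - t)\, f(t)\, dt = \varphi(x), \qquad 0 \le x \le \omega.

Letting \omega \to \infty recovers the classical Wiener-Hopf problem on the semiaxis mentioned above.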
This book provides a snapshot of representative modeling analyses of coastal hypoxia and its effects. Hypoxia refers to conditions in the water column where dissolved oxygen falls below levels that can support most metazoan marine life (i.e., 2 mg O₂ l⁻¹). The number of hypoxic zones has been increasing at an exponential rate since the 1960s; there are currently more than 600 documented hypoxic zones in the estuarine and coastal waters worldwide. Hypoxia develops as a synergistic product of many physical and biological factors that affect the balance of dissolved oxygen in seawater, including temperature, solar radiation, wind, freshwater discharge, nutrient supply, and the production and decay of organic matter. A number of modeling approaches have been increasingly used in hypoxia research, along with the more traditional observational and experimental studies. Modeling is necessary because of rapidly changing coastal circulation and stratification patterns that affect hypoxia, the large spatial extent over which hypoxia develops, and limitations on our capabilities to directly measure hypoxia over large spatial and temporal scales. This book consists of 15 chapters that are broadly organized around three main topics: (1) modeling of the physical controls on hypoxia, (2) modeling of biogeochemical controls and feedbacks, and (3) modeling of the ecological effects of hypoxia. The final chapter is a synthesis chapter that draws generalities from the earlier chapters, highlights strengths and weaknesses of the current state-of-the-art modeling, and offers recommendations on future directions.
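A minimal caricature of the oxygen-balance idea described above (an invented toy model, not one of the book's models; all parameter values are arbitrary) treats bottom-water dissolved oxygen as reaeration minus respiration:

import numpy as np

def do_balance(days=120, dt=0.1, O0=8.0, O_sat=8.0, k=0.05, R=0.5):
    """Toy dissolved-oxygen budget: dO/dt = k*(O_sat - O) - R.

    k : reaeration/ventilation rate (1/day), weak under strong stratification
    R : respiration from decaying organic matter (mg O2 per liter per day)
    """
    t = np.arange(0, days, dt)
    O = np.empty_like(t)
    O[0] = O0
    for i in range(1, len(t)):  # forward-Euler time stepping
        O[i] = max(0.0, O[i-1] + dt * (k * (O_sat - O[i-1]) - R))
    return t, O

t, O = do_balance()
print(f"final DO: {O[-1]:.2f} mg/L; hypoxic (<2): {O[-1] < 2}")

Even this caricature shows the qualitative mechanism: when ventilation k is weak relative to respiration R, dissolved oxygen is drawn below the 2 mg O₂ l⁻¹ hypoxia threshold.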
Current research results in stochastic and deterministic global optimization, including single and multiple objectives, are explored and presented in this book by leading specialists from various fields. Contributions include applications to multidimensional data visualization, regression, survey calibration, inventory management, timetabling, chemical engineering, energy systems, and competitive facility location. Graduate students, researchers, and scientists in computer science, numerical analysis, optimization, and applied mathematics will be fascinated by the theoretical, computational, and application-oriented aspects of stochastic and deterministic global optimization explored in this book. This volume is dedicated to the 70th birthday of Antanas Zilinskas, a leading world expert in global optimization. Professor Zilinskas's research has concentrated on studying models for the objective function, the development and implementation of efficient algorithms for global optimization with single and multiple objectives, and the application of algorithms for solving real-world practical problems.
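To make the single-objective setting concrete, here is the simplest stochastic global method, pure random search (an illustrative sketch, not an algorithm from the volume; the test function is the standard Rastrigin benchmark):

import numpy as np

def pure_random_search(f, bounds, n_samples=100_000, seed=0):
    """Global minimization by uniform sampling over a box; the best
    sample converges in probability to the global minimum."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    best_x, best_f = None, np.inf
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)   # one uniform draw from the box
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Multimodal test function (Rastrigin); global minimum 0 at the origin.
rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
x, fx = pure_random_search(rastrigin, bounds=[(-5.12, 5.12)] * 2)
print(x, fx)

Methods of the kind studied in this volume improve on this baseline by exploiting models of the objective function rather than sampling blindly.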
You may like...
Existence: Semantics and Syntax, Ileana Comorovski, Klaus Heusinger (Hardcover): R4,422 (Discovery Miles 44 220)
The Morphosyntax of Portuguese and…, Mary A. Kato, Francisco Ordonez (Hardcover): R3,766 (Discovery Miles 37 660)
English Vocabulary Elements - A Course…, William R. Leben, Brett Kessler, … (Hardcover): R2,444 (Discovery Miles 24 440)