The domain of non-extensive thermostatistics has been subject to intensive research over the past twenty years and has matured significantly. Generalised Thermostatistics cuts through the traditionalism of many statistical physics texts by offering a fresh perspective and seeking to remove elements of doubt and confusion surrounding the area. The book is divided into two parts: the first covers topics from conventional statistical physics, adopting the perspective that statistical physics is statistics applied to physics; the second develops the formalism of non-extensive thermostatistics, in which the central role is played by the notion of a deformed exponential family of probability distributions. Presented in a clear, consistent, and deductive manner, the book focuses on theory, part of which was developed by the author himself, but also provides a number of references to application-based texts. Written by a leading contributor to the field, this book will be a useful tool for learning about recent developments in generalized versions of statistical mechanics and thermodynamics, especially through self-study. It is written for researchers in theoretical physics, mathematics and statistical mechanics, as well as graduates in physics, mathematics or engineering; knowledge of elementary notions of statistical physics and a substantial mathematical background are prerequisites.
This book contains selected contributions from geoENV96 - the First European Conference on Geostatistics for Environmental Applications, held in Lisbon in November 1996. It is the first in a planned series of biennial geoENV volumes. The series is intended to show the state of the art of geostatistics in environmental applications with new cases, results and relevant discussions from leading researchers and practitioners around the world. New and important theoretical and practical developments of geostatistics in the environmental field were compiled from three main areas: hydrology, groundwater and groundwater contamination; soil contamination and site remediation; and air pollution, ecology and other applications. The book presents a set of geostatistical tools and approaches used to successfully resolve a variety of specific problems in environmental modelling, especially those resulting from the typical scarcity of spatial sampling, the time component of very dynamic systems, the modelling of various systems of contaminants, the uncertainty assessment of health cost functions, etc. Prominent topics concerning methodological tools and methods, stochastic simulation techniques, models for integrating soft information (seismic and remote sensing images), inverse modelling of groundwater flow, neural network classification, change of support and up-scaling are also included in this book. This publication will be of great interest and practical value to geostatisticians working both in universities and in industry.
This volume presents 27 selected papers on topics that range from statistical applications in business and finance to applications in clinical trials and biomarker analysis. All papers feature original, peer-reviewed content. The editors intentionally selected papers covering many topics so that the volume will serve the whole statistical community and a variety of research interests. The papers represent select contributions to the 21st ICSA Applied Statistics Symposium. The International Chinese Statistical Association (ICSA) Symposium took place between the 23rd and 26th of June, 2012 in Boston, Massachusetts. It was co-sponsored by the International Society for Biopharmaceutical Statistics (ISBS) and the American Statistical Association (ASA). This is the inaugural proceedings volume to share research from the ICSA Applied Statistics Symposium.
This book covers the basic statistical and analytical techniques of computer intrusion detection. It is aimed at both statisticians looking to become involved in the data analysis aspects of computer security and computer scientists looking to expand their toolbox of techniques for detecting intruders. The book is self-contained, assuming no expertise in either computer security or statistics. It begins with a description of the basics of TCP/IP, followed by chapters dealing with network traffic analysis, network monitoring for intrusion detection, host-based intrusion detection, and computer viruses and other malicious code. Each section develops the necessary tools as needed. There is an extensive discussion of visualization as it relates to network data and intrusion detection. The book also contains a large bibliography covering the statistical, machine learning, and pattern recognition literature related to network monitoring and intrusion detection. David Marchette is a scientist at the Naval Surface Warfare Center in Dahlgren, Virginia. He has worked at Navy labs for 15 years, doing research in pattern recognition, computational statistics, and image analysis. He has been a fellow by courtesy in the mathematical sciences department of the Johns Hopkins University since 2000. He has been working in computer intrusion detection for several years, focusing on statistical methods for anomaly detection and visualization. Dr. Marchette received a Masters in Mathematics from the University of California, San Diego in 1982 and a Ph.D. in Computational Sciences and Informatics from George Mason University in 1996.
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences, in accordance with the needs of particular applications. During the last twenty-five years, an increase in research activity in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
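For reference, and as a standard restatement rather than part of the publisher's description, the two definitions invoked above can be written as

\text{$f$ is convex} \iff \operatorname{epi} f = \{(x,t) : f(x) \le t\} \text{ is a convex set},
\qquad
\text{$f$ is quasiconvex} \iff \{x : f(x) \le \alpha\} \text{ is convex for every } \alpha \in \mathbb{R}.

Every convex function is quasiconvex, but not conversely; for instance, $f(x) = \sqrt{|x|}$ has interval (hence convex) lower level sets on $\mathbb{R}$ yet is not convex.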
Praise for the First Edition "If you ... want an up-to-date, definitive reference written by authors who have contributed much to this field, then this book is an essential addition to your library." --Journal of the American Statistical Association A COMPREHENSIVE REVIEW OF MODERN EXPERIMENTAL DESIGN Experiments: Planning, Analysis, and Optimization, Third Edition provides a complete discussion of modern experimental design for product and process improvement--the design and analysis of experiments and their applications for system optimization, robustness, and treatment comparison. While maintaining the same easy-to-follow style as the previous editions, this book continues to present an integrated system of experimental design and analysis that can be applied across various fields of research including engineering, medicine, and the physical sciences. New chapters provide modern updates on practical optimal design and on computer experiments, including an explanation of computer simulations as an alternative to physical experiments. Each chapter begins with a real-world example of an experiment followed by the methods required to design that type of experiment. The chapters conclude with an application of the methods to the experiment, bridging the gap between theory and practice. The authors modernize accepted methodologies while refining many cutting-edge topics including robust parameter design, analysis of non-normal data, analysis of experiments with complex aliasing, multilevel designs, minimum aberration designs, and orthogonal arrays. The third edition includes: information on the design and analysis of computer experiments; a discussion of practical optimal design of experiments; an introduction to conditional main effect (CME) analysis and definitive screening designs (DSDs); and new exercise problems. This book includes valuable exercises and problems, allowing the reader to gauge their progress and retention of the book's subject matter as they complete each chapter. Drawing on examples from their combined years of working with industrial clients, the authors present many cutting-edge topics in a single, easily accessible source. Extensive case studies, including goals, data, and experimental designs, are also included, and the book's data sets can be found on a related FTP site, along with additional supplemental material. Chapter summaries provide a succinct outline of discussed methods, and extensive appendices direct readers to resources for further study. Experiments: Planning, Analysis, and Optimization, Third Edition is an excellent book for design of experiments courses at the upper-undergraduate and graduate levels. It is also a valuable resource for practicing engineers and statisticians.
These three volumes comprise the proceedings of the US/Japan Conference, held in honour of Professor H. Akaike, on 'The Frontiers of Statistical Modeling: An Informational Approach'. The major theme of the conference was the implementation of statistical modeling through an informational approach to complex, real-world problems. Volume 1 contains papers which deal with the Theory and Methodology of Time Series Analysis, together with the text of the banquet talk by E. Parzen and the keynote lecture of H. Akaike. Volume 2 is devoted to the general topic of Multivariate Statistical Modeling, and Volume 3 contains the papers relating to Engineering and Scientific Applications. The volumes will be of interest to all scientists whose work involves statistics.
This volume presents selections of Peter J. Bickel's major papers, along with comments on their novelty and impact on the subsequent development of statistics as a discipline. Each of the eight parts concerns a particular area of research and provides new commentary by experts in the area. The parts range from Rank-Based Nonparametrics to Function Estimation and Bootstrap Resampling. Peter's amazing career encompasses the majority of statistical developments in the last half-century, or about half of the entire history of the systematic development of statistics. This volume shares insights on these exciting statistical developments with future generations of statisticians. The compilation of supporting material about Peter's life and work helps readers understand the environment in which his research was conducted, and will also inspire readers in their own research-based pursuits. This volume includes new photos of Peter Bickel, his biography, publication list, and a list of his students. These give the reader a more complete picture of Peter Bickel as a teacher, a friend, a colleague, and a family man.
This book discusses dynamical systems that are typically driven by stochastic dynamic noise. It is written by two statisticians essentially for statistically inclined readers, although readers whose primary interests are in deterministic systems will find some of the methodology explained in this book of interest. The statistical approach adopted in this book differs in many ways from the deterministic approach to dynamical systems. Even the very basic notion of initial-value sensitivity requires careful development in the new setting provided. This book covers, in varying depth, many of the contributions made by statisticians in the past twenty years or so towards our understanding of estimation, the Lyapunov-like index, nonparametric regression, and many other topics, many of which are motivated by their dynamical system counterparts but have now acquired a distinct statistical flavour. Kung-Sik Chan is a professor at the University of Iowa, Department of Statistics and Actuarial Science. He is an elected member of the International Statistical Institute. He has served on the editorial boards of the Journal of Business and Economic Statistics and Statistica Sinica. He received a Faculty Scholar Award from the University of Iowa in 1996. Howell Tong holds the Chair of Statistics at the London School of Economics and the University of Hong Kong. He is a foreign member of the Norwegian Academy of Science and Letters, an elected member of the International Statistical Institute and a Council member of its Bernoulli Society, an elected fellow of the Institute of Mathematical Statistics, and an honorary fellow of the Institute of Actuaries (London). He was the Founding Dean of the Graduate School and sometime Acting Pro-Vice-Chancellor (Research) at the University of Hong Kong. He has served on the editorial boards of several international journals, including Biometrika, the Journal of the Royal Statistical Society (Series B), Statistica Sinica, and others. He is a guest professor of the Academy of Mathematical and System Sciences of the Chinese Academy of Sciences and received a National Natural Science Prize (China) in the category of Mathematics and Mechanics (Class II) in 2001. He has also held visiting professorships at various universities, including Imperial College London, ETH Zurich, the Fourier University in Grenoble, the Wall Institute at the University of British Columbia, Vancouver, and the Chinese University of Hong Kong.
This book describes a system of mathematical models and methods that can be used to analyze real economic and managerial decisions and to improve their effectiveness. Application areas include: management of development and operation budgets, assessment and management of economic systems using an energy entropy approach, equation of exchange rates and forecasting foreign exchange operations, evaluation of innovative projects, monitoring of governmental programs, risk management of investment processes, decisions on the allocation of resources, and identification of competitive industrial clusters. The proposed methods and models were tested on the example of Kazakhstan's economy, but the generated solutions will be useful for applications at other levels and in other countries. "Regarding your book 'Mathematical Methods and Models in Economics', I am impressed because now it is time when 'econometrics' is becoming more appreciated by economists and by schools that are the hosts or employers of modern economists. ... Your presented results really impressed me." John F. Nash, Jr., Princeton University, Nobel Memorial Prize in Economic Sciences "The book is within my scope of interest because of its novelty and practicality. First, there is a need for realistic modeling of complex systems, both natural and artificial, which include computer and economic systems. There has been an ongoing effort in developing models dealing with complexity and incomplete knowledge. Consequently, it is clear to recognize the contribution of Mutanov to encapsulate economic modeling with emphasis on budgeting and innovation. Secondly, the method proposed by Mutanov has been verified by applying it to the case of the Republic of Kazakhstan, with her vibrant emerging economy. Thirdly, Chapter 5 of the book is of particular interest for the computer technology community because it deals with innovation. In summary, the book of Mutanov should become one of the outstanding recognized pragmatic guides for dealing with innovative systems." Andrzej Rucinski, University of New Hampshire "This book is unique in its theoretical findings and practical applicability. The book is an illuminating study based on an applied mathematical model which uses methods such as linear programming and input-output analysis. Moreover, this work demonstrates the author's great insight and academic brilliance in the fields of finance, technological innovations and marketing vis-a-vis the market economy. From both theoretical and practical standpoints, this work is indeed a great achievement." Yeon Cheon Oh, President of Seoul National University
The aim of this book is to report on the progress realized in probability theory in the field of dynamic random walks and to present applications in computer science, mathematical physics and finance. Each chapter contains didactical material as well as more advanced technical sections. A few appendices will help refresh memories (if necessary!).
Multivariate polynomials are a main tool in approximation. The book begins with an introduction to the general theory by presenting the most important facts on multivariate interpolation, quadrature, orthogonal projections and their summation, all treated from a constructive point of view and embedded in the theory of positive linear operators. Against this background, the book gives the first comprehensive introduction to the recently developed theory of generalized hyperinterpolation. As an application, the book gives a quick introduction to tomography. Several parts of the book are based on rotation principles, which are presented at the beginning of the book, together with all other basic facts needed.
This book outlines Bayesian statistical analysis in great detail, from the development of a model through the process of making statistical inference. The key feature of this book is that it covers models that are most commonly used in social science research - including the linear regression model, generalized linear models, hierarchical models, and multivariate regression models - and it thoroughly develops each real-data example in painstaking detail.
This book discusses the theory, methods, and applications of flow of funds analysis. The book integrates the basic principles of economic statistics, financial accounts, international finance, econometric models, and financial network analysis, providing a systematic and comprehensive introduction to the interconnection between these research fields. It thus provides the reader with the intellectual groundwork indispensable for understanding the workings and interactions of today's globalized financial markets. The main focus of the book is how to observe the flow of funds in macroeconomics, how to measure the global flow of funds (GFF), and how to use GFF data to carry out an analysis. Based on the statistical framework for measuring GFF under the System of National Accounts, the book identifies the systematic relationship of financial linkages among economic sectors and with the rest of the world while integrating data sources that include stock data, geographically broken down by country-region, and selected financial instruments. It sets out the GFF concept and constructs a GFF matrix (metadata) on a from-whom-to-whom basis within a country-by-country pattern. Lastly, an established GFF matrix table is used to conduct an empirical study including an econometric model and financial network analysis.
Real-life problems are often quite complicated in form and nature and, for centuries, many different mathematical concepts, ideas and tools have been developed to formulate these problems theoretically and then to solve them either exactly or approximately. This book aims to gather a collection of papers dealing with several different problems arising from many disciplines and some modern mathematical approaches to handle them. In this respect, the book offers a wide overview of many of the current trends in Mathematics as valuable formal techniques in capturing and exploiting the complexity involved in real-world situations. Several researchers, colleagues, friends and students of Professor Maria Luisa Menendez have contributed to this volume to pay tribute to her and to recognize the diverse contributions she made to the fields of Mathematics and Statistics and to the profession in general. She had a sweet and strong personality, and instilled great values and work ethics in her students through her dedication to teaching and research. Even though the academic community lost her prematurely, she will continue to provide inspiration to many students and researchers worldwide through her published work.
The changes of populations are determined by fertility, mortality and migration. On the national level, international migration is a factor of increasing demographic, economic, social and political importance. This book addresses the debate on the impact of international migration and economic activity on population and labour force resources in future. It presents a study conducted for 27 European countries, looking 50 years ahead (2002-2052). An extended discussion of theories and factors underlying the assumed evolution of the components of change and economic activity is included as well as a detailed analysis of the historical trends. These theoretical and empirical considerations lead to defining scenarios of future mortality, fertility, economic activity and international migration, which have been fed into a projection model, producing various future population dynamics and labour force trajectories. In addition, simulations have been made to estimate the size of replacement migration needed to maintain selected demographic and labour market parameters in the countries of Europe. The results presented in this book allow researchers, governments and policy makers to evaluate to what extent various migration and labour market policies may be instrumental in achieving the desired population and labour size and structures. The secondary purpose of this volume is to reveal the methodology and argumentation lying behind a complex population forecasting and simulation exercise, which is not done frequently, but is critical for the assessment of the forecasts and also valuable from a purely didactic point of view.
The focus of this book is on the birth and historical development of permutation statistical methods from the early 1920s to the near present. Beginning with the seminal contributions of R.A. Fisher, E.J.G. Pitman, and others in the 1920s and 1930s, permutation statistical methods were initially introduced to validate the assumptions of classical statistical methods. Permutation methods have advantages over classical methods in that they are optimal for small data sets and non-random samples, are data-dependent, and are free of distributional assumptions. Permutation probability values may be exact, or estimated via moment- or resampling-approximation procedures. Because permutation methods are inherently computationally-intensive, the evolution of computers and computing technology that made modern permutation methods possible accompanies the historical narrative. Permutation analogs of many well-known statistical tests are presented in a historical context, including multiple correlation and regression, analysis of variance, contingency table analysis, and measures of association and agreement. A non-mathematical approach makes the text accessible to readers of all levels.
Often a statistical analysis involves use of a set of alternative models for the data. A "model-selection criterion" is a formula which provides a figure of merit for the alternative models. Generally the alternative models will involve different numbers of parameters. Model-selection criteria take into account both the goodness-of-fit of a model and the number of parameters used to achieve that fit. 1.1. SETS OF ALTERNATIVE MODELS Thus the focus in this paper is on data-analytic situations in which there is consideration of a set of alternative models. Choice of a subset of explanatory variables in regression, the degree of a polynomial regression, the number of factors in factor analysis, or the number of clusters in cluster analysis are examples of such situations. 1.2. MODEL SELECTION VERSUS HYPOTHESIS TESTING In exploratory data analysis or in a preliminary phase of inference, an approach based on model-selection criteria can offer advantages over tests of hypotheses. The model-selection approach avoids the problem of specifying error rates for the tests. With model selection the focus can be on simultaneous competition between a broad class of competing models rather than on consideration of a sequence of simpler and simpler models.
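To make the notion of such a criterion concrete, a standard example (not taken from the volume itself) is Akaike's information criterion,

\mathrm{AIC} = -2\ln\hat{L} + 2k,

where $\hat{L}$ is the maximized likelihood of a model and $k$ is the number of parameters it uses; among the alternative models the one with the smallest AIC is preferred, the $-2\ln\hat{L}$ term rewarding goodness of fit while the $2k$ term penalizes additional parameters.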
An increasing number of statistical problems and methods involve infinite-dimensional aspects. This is due to the progress of technologies which allow us to store more and more information while modern instruments are able to collect data much more effectively due to their increasingly sophisticated design. This evolution directly concerns statisticians, who have to propose new methodologies while taking into account such high-dimensional data (e.g. continuous processes, functional data, etc.). The numerous applications (micro-arrays, paleo-ecological data, radar waveforms, spectrometric curves, speech recognition, continuous time series, 3-D images, etc.) in various fields (biology, econometrics, environmetrics, the food industry, medical sciences, paper industry, etc.) make researching this statistical topic very worthwhile. This book gathers important contributions on the functional and operatorial statistics fields.
The author investigates athermal fluctuations from the viewpoint of statistical mechanics in this thesis. Stochastic methods are theoretically very powerful in describing the fluctuation of thermodynamic quantities in small systems at the level of a single trajectory and have recently been developed on the basis of stochastic thermodynamics. This thesis proposes, for the first time, a systematic framework to describe athermal fluctuation, developing stochastic thermodynamics for non-Gaussian processes, whereas thermal fluctuations are mainly addressed from the viewpoint of Gaussian stochastic processes in most conventional studies. First, the book provides an elementary introduction to stochastic processes and stochastic thermodynamics. The author derives a Langevin-like equation with non-Gaussian noise as a minimal stochastic model for athermal systems, and its analytical solution, obtained by developing systematic expansions, is shown as the main result. Furthermore, the author presents a thermodynamic framework for such non-Gaussian fluctuations and studies some thermodynamic phenomena, i.e. heat conduction and energy pumping, which show distinct characteristics from conventional thermodynamics. The theory introduced in the book provides a systematic foundation for describing the dynamics of athermal fluctuations quantitatively and for analyzing their thermodynamic properties on the basis of stochastic methods.
A unique, practical guide for industry professionals who need to improve product quality and reliability in repairable systems Owing to its vital role in product quality, reliability has been intensely studied in recent decades. Most of this research, however, addresses systems that are nonrepairable and therefore discarded upon failure. Statistical Methods for the Reliability of Repairable Systems fills the gap in the field, focusing exclusively on an important yet long-neglected area of reliability. Written by two highly recognized members of the reliability and statistics community, this new work offers a unique, systematic treatment of probabilistic models used for repairable systems as well as the statistical methods for analyzing data generated from them. Liberally supplemented with examples as well as exercises boasting real data, the book clearly explains the difference between repairable and nonrepairable systems and helps readers develop an understanding of stochastic point processes. Data analysis methods are discussed for both single and multiple systems and include graphical methods, point estimation, interval estimation, hypothesis tests, goodness-of-fit tests, and reliability prediction. Complete with extensive graphs, tables, and references, Statistical Methods for the Reliability of Repairable Systems is an excellent working resource for industry professionals involved in producing reliable systems and a handy reference for practitioners and researchers in the field.
This is the proceedings of the "8th IMACS Seminar on Monte Carlo Methods" held from August 29 to September 2, 2011 in Borovets, Bulgaria, and organized by the Institute of Information and Communication Technologies of the Bulgarian Academy of Sciences in cooperation with the International Association for Mathematics and Computers in Simulation (IMACS). Included are 24 papers which cover all topics presented in the sessions of the seminar: stochastic computation and complexity of high dimensional problems, sensitivity analysis, high-performance computations for Monte Carlo applications, stochastic metaheuristics for optimization problems, sequential Monte Carlo methods for large-scale problems, semiconductor devices and nanostructures. The history of the IMACS Seminar on Monte Carlo Methods goes back to April 1997, when the first MCM Seminar was organized in Brussels: 1st IMACS Seminar, 1997, Brussels, Belgium; 2nd IMACS Seminar, 1999, Varna, Bulgaria; 3rd IMACS Seminar, 2001, Salzburg, Austria; 4th IMACS Seminar, 2003, Berlin, Germany; 5th IMACS Seminar, 2005, Tallahassee, USA; 6th IMACS Seminar, 2007, Reading, UK; 7th IMACS Seminar, 2009, Brussels, Belgium; 8th IMACS Seminar, 2011, Borovets, Bulgaria.
This book presents current research on Ulam stability for functional equations and inequalities. Contributions from renowned scientists emphasize fundamental and new results, methods and techniques. Detailed examples are given for the theories to further understanding at the graduate level for students in mathematics, physics, and engineering. Key topics covered in this book include: quasi means; approximate isometries; functional equations in hypergroups; stability of functional equations; the Fischer-Muszely equation; Haar meager sets and Haar null sets; dynamical systems; functional equations in probability theory; stochastic convex ordering; the Dhombres functional equation; and nonstandard analysis and Ulam stability. This book is dedicated to the memory of Stanislaw Marcin Ulam, who in 1940 posed the fundamental problem concerning approximate homomorphisms of groups, which has provided the stimulus for studies in the stability of functional equations and inequalities.
Gian-Carlo Rota was born in Vigevano, Italy, in 1932. He died in Cambridge, Massachusetts, in 1999. He had several careers, most notably as a mathematician, but also as a philosopher and a consultant to the United States government. His mathematical career was equally varied. His early mathematical studies were at Princeton (1950 to 1953) and Yale (1953 to 1956). In 1956, he completed his doctoral thesis under the direction of Jacob T. Schwartz. This thesis was published as the paper "Extension theory of differential operators I", the first paper reprinted in this volume. Rota's early work was in analysis, more specifically, in operator theory, differential equations, ergodic theory, and probability theory. In the 1960's, Rota was motivated by problems in fluctuation theory to study some operator identities of Glen Baxter (see [7]). Together with other problems in probability theory, this led Rota to study combinatorics. His series of papers, "On the foundations of combinatorial theory", led to a fundamental re-evaluation of the subject. Later, in the 1990's, Rota returned to some of the problems in analysis and probability theory which motivated his work in combinatorics. This was his intention all along, and his early death robbed mathematics of his unique perspective on linkages between the discrete and the continuous. Glimpses of his new research programs can be found in [2,3,6,9,10].
This book offers a comprehensive guide to the modelling of operational risk using possibility theory. It provides a set of methods for measuring operational risks under a certain degree of vagueness and impreciseness, as encountered in real-life data. It shows how possibility theory and indeterminate uncertainty-encompassing degrees of belief can be applied in analysing the risk function, and describes the parametric g-and-h distribution associated with extreme value theory as an interesting candidate in this regard. The book offers a complete assessment of fuzzy methods for determining both value at risk (VaR) and subjective value at risk (SVaR), together with a stability estimation of VaR and SVaR. Based on the simulation studies and case studies reported on here, the possibilistic quantification of risk performs consistently better than the probabilistic model. Risk is evaluated by integrating two fuzzy techniques: the fuzzy analytic hierarchy process and the fuzzy extension of techniques for order preference by similarity to the ideal solution. Because of its specialized content, it is primarily intended for postgraduates and researchers with a basic knowledge of algebra and calculus, and can be used as a reference guide for research-level courses on fuzzy sets, possibility theory and mathematical finance. The book also offers a useful source of information for banking and finance professionals investigating different risk-related aspects.
You may like...
Subdifferentials - Theory and… | Anatoly G. Kusraev, Semen S. Kutateladze | Hardcover | R3,426 | Discovery Miles 34 260
Multi-scale Analysis for Random Quantum… | Victor Chulaevsky, Yuri Suhov | Hardcover | R3,216 | Discovery Miles 32 160
Negative Symptom and Cognitive Deficit… | Richard S.E. Keefe, Joseph P. McEvoy | Hardcover
Photoacoustic and Photothermal… | Surya N. Thakur, Virendra N. Rai, … | Paperback | R4,694 | Discovery Miles 46 940
Structure-Preserving Doubling Algorithms… | Tsung-Ming Huang, Ren-Cang Li, … | Paperback | R1,744 | Discovery Miles 17 440
Connectionist Psycholinguistics | Morten H. Christiansen, Nick Chater | Hardcover | R2,820 | Discovery Miles 28 200