This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and renewal theory. Many new introductory problems and exercises have also been added. This book combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The book begins with three chapters that develop probability theory and introduce the axioms of probability, random variables, and joint distributions. The next two chapters introduce limit theorems and simulation. Also included is a chapter on statistical inference with a focus on Bayesian statistics, which is an important, though often neglected, topic for undergraduate-level texts. Markov chains in discrete and continuous time are also discussed within the book. More than 400 examples are interspersed throughout to help illustrate concepts and theory and to assist readers in developing an intuitive sense of the subject. Readers will find many of the examples to be both entertaining and thought provoking. This is also true for the carefully selected problems that appear at the end of each chapter.
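As a brief illustration of two of the topics listed above (these are standard textbook definitions, not excerpts from the book), the empirical distribution function of a sample X_1, ..., X_n and the Kolmogorov-Smirnov statistic for testing it against a hypothesized distribution function F are

\[ F_n(x) = \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{X_i \le x\}, \qquad D_n = \sup_x \left| F_n(x) - F(x) \right|, \]

with large values of D_n providing evidence against the hypothesis that the data were drawn from F.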
Integrable models have a fascinating history with many important discoveries that dates back to the famous Kepler problem of planetary motion. Nowadays it is well recognised that integrable systems play a ubiquitous role in many research areas ranging from quantum field theory, string theory, solvable models of statistical mechanics, black hole physics, quantum chaos and the AdS/CFT correspondence, to pure mathematics, such as representation theory, harmonic analysis, random matrix theory and complex geometry. Starting with the Liouville theorem and finite-dimensional integrable models, this book covers the basic concepts of integrability including elements of the modern geometric approach based on Poisson reduction, classical and quantum factorised scattering and various incarnations of the Bethe Ansatz. Applications of integrability methods are illustrated in great detail using the concrete examples of the Calogero-Moser-Sutherland and Ruijsenaars-Schneider models, the Heisenberg spin chain and the one-dimensional Bose gas interacting via a delta-function potential. The book treats intermediate and advanced topics in sufficient detail to make them clearly comprehensible.
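For orientation (a standard statement of the result, not quoted from the text), a Hamiltonian system with n degrees of freedom is integrable in the Liouville sense if it admits n functionally independent conserved quantities F_1 = H, F_2, ..., F_n in involution with respect to the Poisson bracket,

\[ \{F_i, F_j\} = 0, \qquad i, j = 1, \dots, n, \]

in which case the Liouville theorem guarantees that the equations of motion can be solved by quadratures.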
This book is a course in methods and models rooted in physics and used in modelling economic and social phenomena. It covers the discipline of econophysics, which creates an interface between physics and economics. Besides the main theme, it touches on the theory of complex networks and simulations of social phenomena in general.
Originating from the 42nd conference on Boundary Elements and other Mesh Reduction Methods (BEM/MRM), the research presented in this book consists of high-quality papers that report on advances in techniques that reduce or eliminate the type of meshes associated with such methods as finite elements or finite differences. The maturity of BEM since 1978 has resulted in a substantial number of industrial applications which demonstrate the accuracy, robustness and ease of use of the technique. Their range still needs to be widened, taking into account the potentialities of the Mesh Reduction techniques in general. As design, analysis and manufacture become more integrated, the chances are that the users will be less aware of the capabilities of the analytical techniques that are at the core of the process. This reinforces the need to retain expertise in certain specialised areas of numerical methods, such as BEM/MRM, to ensure that all new tools perform satisfactorily in the integrated process. The papers in this volume help to expand the range of applications as well as the type of materials in response to industrial and professional requirements. Some of the topics include: Hybrid foundations; Meshless and mesh reduction methods; Structural mechanics; Solid mechanics; Heat and mass transfer; Electrical engineering and electromagnetics; Fluid flow modelling; Damage mechanics and fracture; Dynamics and vibrations analysis.
The Boussinesq equation is the first model of surface waves in shallow water that takes into account both nonlinearity and dispersion, whose interaction as the reason for wave stability is known as the Boussinesq paradigm. This balance bears solitary waves that behave like quasi-particles. At present, several Boussinesq-like equations are known. Most of the known analytical and numerical solutions, however, relate to the one-dimensional case, while almost nothing is known so far for the multidimensional cases; an exception is the solutions of the Kadomtsev-Petviashvili equation. The difficulties originate from the lack of known analytic initial conditions and from the nonintegrability in the multidimensional case. Another problem is which kind of nonlinearity will preserve the temporal stability of localized solutions. The system of coupled nonlinear Schroedinger equations, also known as the vector Schroedinger equation, is a soliton-supporting dynamical system. It is considered a model of light propagation in Kerr isotropic media. Along with that, the phenomenology of the equation opens the prospect of investigating the quasi-particle behavior of interacting solitons. The initial polarization of the vector Schroedinger equation and its evolution follow from the vector nature of the model. Exact (analytical) solutions are usually available only for simpler models, while for the vector Schroedinger equation such solutions are not known. This determines the role of numerical schemes and approaches. The vector Schroedinger equation is a springboard for combining reduced integrability and conservation laws at the discrete level. The experimental observation and measurement of ultrashort pulses in waveguides is difficult, which is the reason and stimulus to create mathematical models for computer simulations, as well as reliable algorithms for treating the governing equations. Along with the nonintegrability, one more problem appears here: the multidimensionality and the need to split and linearize the operators in an appropriate way.
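As an illustration of the equations referred to above (written in one common normalization, not reproduced from the book, and with sign and scaling conventions that vary between authors), the classical Boussinesq equation balances a quadratic nonlinearity against fourth-order dispersion,

\[ u_{tt} = u_{xx} + \beta\,(u^2)_{xx} + \gamma\,u_{xxxx}, \]

while the coupled (vector) nonlinear Schroedinger system for two polarization components takes the Manakov-type form

\[ i\,\partial_t \psi_j + \partial_x^2 \psi_j + 2\left(|\psi_1|^2 + |\psi_2|^2\right)\psi_j = 0, \qquad j = 1, 2. \]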
Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are familiar neither with time series analysis nor with state space methods. The only background required in order to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is provided to refresh the reader's knowledge. A few sections also assume familiarity with matrix algebra; however, these sections may be skipped without losing the flow of the exposition.
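As a minimal sketch of an unobserved components model of the kind the book treats (the notation here is generic, not taken from the text), the local level model decomposes the observations y_t into a latent level mu_t plus noise,

\[ y_t = \mu_t + \varepsilon_t, \quad \varepsilon_t \sim N(0, \sigma_\varepsilon^2), \qquad \mu_{t+1} = \mu_t + \eta_t, \quad \eta_t \sim N(0, \sigma_\eta^2), \]

and the Kalman filter then delivers estimates of the unobserved level given the data.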
This book is specially designed to refresh and elevate the level of understanding of the foundational background in probability and distributional theory required to be successful in a graduate-level statistics program. Advanced undergraduate students and introductory graduate students from a variety of quantitative backgrounds will benefit from the transitional bridge that this volume offers, from a more generalized study of undergraduate mathematics and statistics to the career-focused, applied education at the graduate level. In particular, it focuses on growing fields that will be of potential interest to future M.S. and Ph.D. students, as well as advanced undergraduates heading directly into the workplace: data analytics, statistics and biostatistics, and related areas.
Science and engineering students depend heavily on concepts of mathematical modeling. In an age where almost everything is done on a computer, author Clive Dym believes that students need to understand and "own" the underlying mathematics that computers are doing on their behalf. His goal for Principles of Mathematical Modeling, Second Edition, is to engage the student reader in developing a foundational understanding of the subject that will serve them well into their careers.
This book demonstrates some of the ways in which Microsoft Excel® may be used to solve numerical problems in the field of physics.
This volume shares and makes accessible new research lines and recent results in several branches of theoretical and mathematical physics, among them Quantum Optics, Coherent States, Integrable Systems, SUSY Quantum Mechanics, and Mathematical Methods in Physics. In addition to a selection of the contributions presented at the "6th International Workshop on New Challenges in Quantum Mechanics: Integrability and Supersymmetry", held in Valladolid, Spain, 27-30 June 2017, several high-quality contributions from other authors are also included. The conference gathered 60 participants from many countries working in different fields of Theoretical Physics, and was dedicated to Prof. Veronique Hussin, an internationally recognized expert in many branches of Mathematical Physics who has been making remarkable contributions to this field since the 1980s. The reader will find interesting reviews on the main topics from internationally recognized experts in each field, as well as other original contributions, all of which deal with recent applications or discoveries in the aforementioned areas.
New Edition of a Classic Guide to Statistical Applications in the Biomedical Sciences. In the last decade, there have been significant changes in the way statistics is incorporated into biostatistical, medical, and public health research. Addressing the need for a modernized treatment of these statistical applications, Basic Statistics, Fourth Edition presents relevant, up-to-date coverage of research methodology using careful explanations of basic statistics and how they are used to address practical problems that arise in medical and public health settings. Through concise and easy-to-follow presentations, readers will learn to interpret and examine data by applying common statistical tools, such as sampling, random assignment, and survival analysis. Continuing the tradition of its predecessor, this new edition outlines a thorough discussion of different kinds of studies and guides readers through the important, related decision-making processes, such as determining what information is needed and planning the collection process. The book equips readers with the knowledge to carry out these practices by explaining the various types of studies that are commonly conducted in the fields of medicine and public health, and how the level of evidence varies depending on the area of research. Data screening and data entry into statistical programs are explained and accompanied by illustrations of statistical analyses and graphs. Additional features of the Fourth Edition include: a new chapter on data collection that outlines the initial steps in planning biomedical and public health studies; a new chapter on nonparametric statistics that includes a discussion and application of the Sign test, the Wilcoxon Signed Rank test, and the Wilcoxon Rank Sum test and its relationship to the Mann-Whitney U test; an updated introduction to survival analysis that includes the Kaplan-Meier method for graphing the survival function and a brief introduction to tests for comparing survival functions; incorporation of modern statistical software, such as SAS, Stata, SPSS, and Minitab, into the presented discussion of data analysis; and updated references at the end of each chapter. Basic Statistics, Fourth Edition is an ideal book for courses on biostatistics, medicine, and public health at the upper-undergraduate and graduate levels. It is also appropriate as a reference for researchers and practitioners who would like to refresh their fundamental understanding of statistical techniques.
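As an illustration of the relationship mentioned above (a standard identity, not an excerpt from the book), the Wilcoxon rank sum W of the first sample, of size n_1, and the Mann-Whitney statistic U are linked by

\[ U = W - \frac{n_1(n_1 + 1)}{2}, \]

so the two tests are equivalent and lead to the same conclusions.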
The main purpose of this book is not only to present recent studies and advances in the field of social science research, but also to stimulate discussion on related practical issues concerning statistics, mathematics, and economics. Accordingly, a broad range of tools and techniques that can be used to solve problems on these topics are presented in detail in this book, which offers an ideal reference work for all researchers interested in effective quantitative and qualitative tools. The content is divided into three major sections. The first, which is titled "Social work", collects papers on problems related to the social sciences, e.g. social cohesion, health, and digital technologies. Papers in the second part, "Education and teaching issues," address qualitative aspects, education, learning, violence, diversity, disability, and ageing, while the book's final part, "Recent trends in qualitative and quantitative models for socio-economic systems and social work", features contributions on both qualitative and quantitative issues. The book is based on a scientific collaboration, in the social sciences, mathematics, statistics, and economics, among experts from the "Pablo de Olavide" University of Seville (Spain), the "University of Defence" of Brno (Czech Republic), the "G. D'Annunzio" University of Chieti-Pescara (Italy) and "Alexandru Ioan Cuza University" of Iasi (Romania). The contributions, which have been selected using a peer-review process, examine a wide variety of topics related to the social sciences in general, while also highlighting new and intriguing empirical research conducted in various countries. Given its scope, the book will appeal, in equal measure, to sociologists, mathematicians, statisticians and philosophers, and more generally to scholars and specialists in related fields.
In two volumes, this book presents a detailed, systematic treatment of electromagnetics with application to the propagation of transient electromagnetic fields (including ultrawideband signals and ultrashort pulses) in dispersive attenuative media. The development in this expanded, updated, and reorganized new edition is mathematically rigorous, progressing from classical theory to the asymptotic description of pulsed wave fields in Debye and Lorentz model dielectrics, Drude model conductors, and composite model semiconductors. It will be of use to researchers as a resource on electromagnetic radiation and wave propagation theory with applications to ground and foliage penetrating radar, medical imaging, communications, and safety issues associated with ultrawideband pulsed fields. With meaningful exercises, and an authoritative selection of topics, it can also be used as a textbook to prepare graduate students for research. Volume 2 presents a detailed asymptotic description of plane wave pulse propagation in dielectric, conducting, and semiconducting materials as described by the classical Lorentz model of dielectric resonance, the Rocard-Powles-Debye model of orientational polarization, and the Drude model of metals. The rigorous description of the signal velocity of a pulse in a dispersive material is presented in connection with the question of superluminal pulse propagation. The second edition contains new material on the effects of spatial dispersion on precursor formation, and pulse transmission into a dispersive half space and into multilayered media. Volume 1 covers spectral representations in temporally dispersive media.
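For orientation only (these forms follow common usage under a particular time-harmonic convention; they are not quoted from the text), the single-resonance Lorentz model, the simple Debye model of orientational polarization, and the Drude model of metals give relative permittivities of the form

\[ \epsilon_{\mathrm{L}}(\omega) = 1 - \frac{\omega_p^2}{\omega^2 - \omega_0^2 + 2i\delta\omega}, \qquad \epsilon_{\mathrm{D}}(\omega) = \epsilon_\infty + \frac{\epsilon_s - \epsilon_\infty}{1 - i\omega\tau}, \qquad \epsilon_{\mathrm{Dr}}(\omega) = 1 - \frac{\omega_p^2}{\omega(\omega + i\gamma)}, \]

where \omega_p is a plasma frequency, \omega_0 a resonance frequency, \delta and \gamma damping constants, \tau a relaxation time, and \epsilon_s, \epsilon_\infty the static and high-frequency permittivities.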
This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author's view, the objective of predictive modeling is to extract "best estimate" values for model parameters and predicted results, together with "best estimate" uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information. The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen, "cost functional" (usually a quadratic functional that represents the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) Predictive Modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle to eliminate the need for relying on minimizing user-chosen functionals, thus generalizing the "data adjustment" and/or the "4D-VAR" data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a "model validation metric" which quantifies the consistency (agreement/disagreement) between measurements and computations. This "model validation metric" (or "consistency indicator") is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters. Traditional methods for computing response sensitivities are hampered by the "curse of dimensionality," which makes them impractical for applications to large-scale systems that involve many imprecisely known parameters. Reducing the computational effort required for precisely calculating the response sensitivities is paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as shown in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for predicted parameters and responses along with best-estimate reduced uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated, "best estimate" predictive modeling tools for designing new technologies and facilities, while also improving on existing ones.
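As a generic illustration of the user-chosen quadratic cost functional mentioned above (this is a common 3D-VAR-type form, not the BERRU formulation itself), conventional data assimilation minimizes

\[ J(\alpha) = \tfrac{1}{2}\,(\alpha - \alpha_0)^{\mathsf T} \mathbf{C}_\alpha^{-1} (\alpha - \alpha_0) + \tfrac{1}{2}\,\left[\mathbf{r}_m - \mathbf{r}_c(\alpha)\right]^{\mathsf T} \mathbf{C}_m^{-1} \left[\mathbf{r}_m - \mathbf{r}_c(\alpha)\right], \]

where \alpha denotes the model parameters with prior values \alpha_0 and covariance \mathbf{C}_\alpha, and \mathbf{r}_m, \mathbf{r}_c are the measured and computed responses with measurement covariance \mathbf{C}_m; the BERRU methodology described above replaces the minimization of such a user-chosen functional with a maximum-entropy argument.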
Classical Mechanics teaches readers how to solve physics problems; in other words, how to put math and physics together to obtain numerical or algebraic results and then interpret these results physically. These skills are important and will be needed in more advanced science and engineering courses. More important than developing problem-solving and physical-interpretation skills, however, the main purpose of this multi-volume series is to survey the basic concepts of classical mechanics and to provide the reader with a solid understanding of the foundational content knowledge of classical mechanics. Classical Mechanics: Conservation laws and rotational motion covers the conservation of energy and the conservation of momentum, which are crucial concepts in any physics course. It also introduces the concepts of center-of-mass and rotational motion.
This book addresses the concepts of unstable flow solutions, convective instability and absolute instability, with reference to simple (or toy) mathematical models, which are mathematically straightforward despite their purely abstract character. Within this paradigm, the book introduces the basic mathematical tools: the Fourier transform, normal modes, wavepackets and their dynamics, before reviewing the fundamental ideas behind the mathematical modelling of fluid flow and heat transfer in porous media. The author goes on to discuss the fundamentals of the Rayleigh-Benard instability and other thermal instabilities of convective flows in porous media, and then analyses various examples of transition from convective to absolute instability in detail, with an emphasis on the formulation, deduction of the dispersion relation and study of the numerical data regarding the threshold of absolute instability. The clear descriptions of the analytical and numerical methods needed to obtain these parametric threshold data enable readers to apply them in different or more general cases. This book is of interest to postgraduates and researchers in mechanical and thermal engineering, civil engineering, geophysics, applied mathematics, fluid mechanics, and energy technology.
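As a sketch of the standard machinery behind such analyses (the notation is generic, not taken from the book), small perturbations are expanded in normal modes proportional to e^{i(kx - \omega t)}, and substitution into the linearized equations yields a dispersion relation D(k, \omega) = 0. Convective instability requires some real wavenumber k with growth rate Im \omega(k) > 0, while the threshold of absolute instability is commonly located, following a Briggs-Bers type of argument, at a saddle point k_0 of \omega(k) satisfying

\[ \frac{d\omega}{dk}(k_0) = 0, \qquad \operatorname{Im}\,\omega(k_0) = 0, \]

so that at threshold a wavepacket neither grows nor decays at a fixed spatial position.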
This book uses art photography as a point of departure for learning about physics, while also using physics as a point of departure for asking fundamental questions about the nature of photography as an art. Although not a how-to manual, the topics center around hands-on applications, sometimes illustrated by photographic processes that are inexpensive and easily accessible to students (including a versatile new process developed by the author, and first described in print in this series). A central theme is the connection between the physical interaction of light and matter on the one hand, and the artistry of the photographic processes and their results on the other. One half of Energy and Color focuses on the physics of energy, power, illuminance, and intensity of light, and how these relate to the photographic exposure, including a detailed example that follows the emission of light from the sun all the way through to the formation of the image in the camera. These concepts are described both in the traditional manner and using very-low-sensitivity photography as an example, which brings the physical concepts to the fore in a visible way, whereas they are often hidden with ordinary high-speed photographic detectors. Energy and Color also considers color in terms of the spectrum of light, how it interacts with the subject, and how the camera's light detector interacts with the image focused upon it. But of equal concern are the only partially understood and sometimes unexpected ways in which the human eye/brain interprets this spectral stimulus as color. The volume covers basic photographic subjects such as shutter, aperture, ISO, metering and exposure value, but also, given their relation to the larger themes of the book, less familiar topics such as the Jones-Condit equation, Lambertian versus isotropic reflections, reflection and response curves, and the opponent-process model of color perception. Although written at a beginning undergraduate level, the topics are chosen for their role in a more general discussion of the relation between science and art that is of interest to readers of all backgrounds and levels of expertise.
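As a small worked example of one of the basic subjects mentioned above (using the usual convention; the formula is not quoted from the book), the exposure value combines the aperture, expressed as the f-number N, with the shutter time t in seconds as

\[ \mathrm{EV} = \log_2\!\left(\frac{N^2}{t}\right), \]

so that, for instance, f/8 at 1/125 s gives EV = log_2(64 x 125), which is approximately 13; all aperture-shutter combinations with the same EV admit the same amount of light.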
Blast Mitigation: Experimental and Numerical Studies covers both experimental and numerical aspects of material and structural response to dynamic blast loads and their mitigation. The authors present the most up-to-date understanding from laboratory studies and computational analysis for researchers working in the field of blast loadings and their effect on material and structural failure; develop designs for lighter and highly efficient structural members for blast energy absorption; discuss the vulnerability of underground structures; present methods for damping blast overpressures; discuss structural post-blast collapse; and give attention to underwater explosion and implosion effects on submerged infrastructure and mitigation measures for this environment.
Covering a broad range of topics, this text provides a comprehensive survey of the modeling of chaotic dynamics and complexity in the natural and social sciences. Its attention to models in both the physical and social sciences and the detailed philosophical approach make this a unique text in the midst of many current books on chaos and complexity. Including an extensive index and bibliography along with numerous examples and simplified models, this is an ideal course text.
The development of man's understanding of planetary motions is the crown jewel of Newtonian mechanics. This book offers a concise but self-contained handbook-length treatment of this historically important topic for students at about the third-year level of an undergraduate physics curriculum. After opening with a review of Kepler's three laws of planetary motion, it proceeds to analyze the general dynamics of "central force" orbits in spherical coordinates, how elliptical orbits satisfy Newton's gravitational law and how the geometry of ellipses relates to physical quantities such as energy and momentum. Exercises are provided and derivations are set up in such a way that readers can gain analytic practice by filling in missing steps. A brief bibliography lists sources for readers who wish to pursue further study on their own.
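For reference (these are standard results of the Kepler problem, not excerpts from the book), the bound orbit of a body of mass m about a much larger mass M under Newton's law of gravitation is the ellipse

\[ r(\theta) = \frac{a(1 - e^2)}{1 + e\cos\theta}, \qquad E = -\frac{G M m}{2a}, \qquad e = \sqrt{1 + \frac{2 E L^2}{G^2 M^2 m^3}}, \]

so the semi-major axis a is fixed by the total energy E, and the eccentricity e by the energy together with the angular momentum L, which is exactly the link between ellipse geometry and physical quantities described above.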
The relaxation method has undergone intensive development over many decades, and this new edition of this comprehensive text reflects in particular the main achievements of the past 20 years. Moreover, many further improvements and extensions are included, both in the direction of optimal control and optimal design and in numerics and applications in materials science, along with an updated treatment of the abstract parts of the theory.