This book provides the mathematical foundations of the theory of hyperhamiltonian dynamics, together with a discussion of physical applications. In addition, some open problems are discussed. Hyperhamiltonian mechanics represents a generalization of Hamiltonian mechanics, in which the role of the symplectic structure is taken by a hyperkähler one (thus there are three Kähler/symplectic forms satisfying quaternionic relations). This has proved to be of use in the description of physical systems with spin, including those which do not admit a Hamiltonian formulation. The book is the first monograph on the subject, which has previously been treated only in research papers.
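As general background on the structure named above (standard definitions, not quoted from the book): a hyperkähler manifold carries a metric g and three complex structures obeying the quaternion relations, each paired with its own Kähler/symplectic form:

```latex
J_1^2 = J_2^2 = J_3^2 = J_1 J_2 J_3 = -\mathrm{Id}, \qquad
\omega_a(X, Y) = g(J_a X, Y), \quad a = 1, 2, 3 .
```

The three forms $\omega_a$ are the "three Kähler/symplectic forms" the blurb refers to; replacing the single symplectic form of Hamiltonian mechanics by this triple is the starting point of the hyperhamiltonian formalism.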
This is the revised and enlarged 2nd edition of the authors' original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics, and electrical engineering and information theory.
Fluid turbulence is often referred to as 'the unsolved problem of classical physics'. Yet, paradoxically, its mathematical description resembles quantum field theory. The present book addresses the idealised problem posed by homogeneous, isotropic turbulence, in order to concentrate on the fundamental aspects of the general problem. It is written from the perspective of a theoretical physicist, but is designed to be accessible to all researchers in turbulence, both theoretical and experimental, and from all disciplines. The book is in three parts, and begins with a very simple overview of the basic statistical closure problem, along with a summary of current theoretical approaches. This is followed by a precise formulation of the statistical problem, along with a complete set of mathematical tools (as needed in the rest of the book), and a summary of the generally accepted phenomenology of the subject. Part 2 deals with current issues in phenomenology, including the role of Galilean invariance, the physics of energy transfer, and the fundamental problems inherent in numerical simulation. Part 3 deals with renormalization methods, with an emphasis on the taxonomy of the subject, rather than on lengthy mathematical derivations. The book concludes with some discussion of current lines of research and is supplemented by three appendices containing detailed mathematical treatments of the effect of isotropy on correlations, the properties of Gaussian distributions, and the evaluation of coefficients in statistical theories.
The Pacific Symposium on Biocomputing (PSB) 2011 is an international, multidisciplinary conference for the presentation and discussion of current research in the theory and application of computational methods in problems of biological significance. Presentations are rigorously peer reviewed and are published in an archival proceedings volume. PSB 2011 will be held on January 3-7, 2011 in Kohala Coast, Hawaii. Tutorials and workshops will be offered prior to the start of the conference. PSB 2011 will bring together top researchers from the US, Asia Pacific, and around the world to exchange research results and address pertinent issues in all aspects of computational biology. It is a forum for the presentation of work in databases, algorithms, interfaces, visualization, modeling, and other computational methods, as applied to biological problems, with emphasis on applications in data-rich areas of molecular biology. The PSB has been designed to be responsive to the need for critical mass in sub-disciplines within biocomputing. For that reason, it is the only meeting whose sessions are defined dynamically each year in response to specific proposals. PSB sessions are organized by leaders of research in biocomputing's "hot topics". In this way, the meeting provides an early forum for serious examination of emerging methods and approaches in this rapidly evolving field.
This book presents several new findings in the field of turbulent duct flows, which are important for a range of industrial applications. It presents both high-quality experiments and cutting-edge numerical simulations, providing a level of insight and rigour rarely found in PhD theses. The scientific advancements concern the effect of the Earth's rotation on large duct flows, the experimental confirmation of marginal turbulence in a pressure-driven square duct flow (previously only predicted in simulations), the identification of similar marginal turbulence in wall-driven flows using simulations (for the first time by any means) and, on a separate but related topic, a comprehensive experimental study on the phenomenon of drag reduction via polymer additives in turbulent duct flows. In turn, the work on drag reduction resulted in a correlation that provides a quantitative prediction of drag reduction based on a single, measurable material property of the polymer solution, regardless of the flow geometry or concentration. The first correlation of its kind, it represents an important advancement from both a scientific and practical perspective.
This volume presents current research in functional analysis and its applications to a variety of problems in mathematics and mathematical physics. The book contains over forty carefully refereed contributions to the conference "Functional Analysis in Interdisciplinary Applications" (Astana, Kazakhstan, October 2017). Topics covered include the theory of functions and functional spaces; differential equations and boundary value problems; the relationship between differential equations, integral operators and spectral theory; and mathematical methods in physical sciences. Presenting a wide range of topics and results, this book will appeal to anyone working in the subject area, including researchers and students interested in learning more about different aspects and applications of functional analysis.
Neutrinos continue to be the most mysterious and, arguably, the most fascinating particles of the Standard Model as their intrinsic properties such as absolute mass scale and CP properties are unknown. The open question of the absolute neutrino mass scale will be addressed with unprecedented accuracy by the Karlsruhe Tritium Neutrino (KATRIN) experiment, currently under construction. This thesis focusses on the spectrometer part of KATRIN and background processes therein. Various background sources such as small Penning traps, as well as nuclear decays from single radon atoms are fully characterized here for the first time. Most importantly, however, it was possible to reduce the background in the spectrometer by more than five orders of magnitude by eliminating Penning traps and by developing a completely new background reduction method by stochastically heating trapped electrons using electron cyclotron resonance (ECR). The work beautifully demonstrates that the obstacles and challenges in measuring the absolute mass scale of neutrinos can be met successfully if novel experimental tools (ECR) and novel computing methods (KASSIOPEIA) are combined to allow almost background-free tritium β-spectroscopy.
Density Functional Theory (DFT) has firmly established itself as the workhorse for atomic-level simulations of condensed phases, pure or composite materials and quantum chemical systems. This work offers a rigorous and detailed introduction to the foundations of this theory, up to and including such advanced topics as orbital-dependent functionals as well as both time-dependent and relativistic DFT. Given the many ramifications of contemporary DFT, the text concentrates on the self-contained presentation of the basics of the most widely used DFT variants: this implies a thorough discussion of the corresponding existence theorems and effective single particle equations, as well as of key approximations utilized in implementations. The formal results are complemented by selected quantitative results, which primarily aim at illustrating the strengths and weaknesses of particular approaches or functionals. The structure and content of this book allow a tutorial and modular self-study approach: the reader will find that all concepts of many-body theory which are indispensable for the discussion of DFT - such as the single-particle Green's function or response functions - are introduced step by step, along with the actual DFT material. The same applies to basic notions of solid state theory, such as the Fermi surface of inhomogeneous, interacting systems. In fact, even the language of second quantization is introduced systematically in an Appendix for readers without formal training in many-body theory.
The book provides readers with an understanding of the mutual conditioning of spacetime, interactions and matter. The spacetime manifold is regarded as a reservoir for the parametrization of operation Lie groups or subgroup classes of Lie groups. With basic operation groups or Lie algebras, all physical structures can be interpreted in terms of corresponding realizations or representations, and physical properties are related to their eigenvalues or invariants. As an explicit example, an operational spacetime is proposed, called electroweak spacetime, parametrizing the classes of the internal hypercharge-isospin group in the general linear group in two complex dimensions, i.e., the Lorentz cover group extended by the causal (dilation) and phase group. Its representations and invariants are investigated with the aim of connecting them, qualitatively and numerically, with the properties of interactions and particles as they arise in the representations of its tangent Minkowski spaces.
Regularized equations of motion can improve numerical integration for the propagation of orbits, and simplify the treatment of mission design problems. This monograph discusses standard techniques and recent research in the area. While each scheme is derived analytically, its accuracy is investigated numerically. Algebraic and topological aspects of the formulations are studied, as well as their application to practical scenarios such as spacecraft relative motion and new low-thrust trajectories.
Throughout my whole career, including my student years, I have had a feeling that learning and teaching electromagnetism, especially the macroscopic Maxwell equations (M-eqs), is difficult. In order to make good use of these equations, it seemed necessary to rely on certain empirical knowledge and model-dependent concepts, rather than on pure logic. Many of my friends, colleagues and the physicists I have met on various occasions have expressed similar impressions. This is not the case with the microscopic M-eqs and quantum mechanics, which do not make us feel reluctant to teach, probably because of their clear logical structure. What makes us hesitate to teach is probably that we have to explain what we ourselves do not completely understand. Logic is an essential element in physics, as well as in mathematics, so it does not matter for physicists to experience difficulties at the initial phase, as long as the logical structure is clear. As the well-known principles of physics say, "a good theory should be logically consistent and explain relevant experiments." Our feeling about the macroscopic M-eqs may be related to some incompleteness of their logical structure.
The production of heavy quarks in high-energy experiments offers a rich field to study, both experimentally and theoretically. Due to the additional quark mass, the description of these processes in the framework of perturbative QCD is much more demanding than it is for those involving only massless partons. In the last two decades, a large amount of precision data has been collected by the deep inelastic HERA experiment. In order to make full use of these data, a more precise theoretical description of charm quark production in deep inelastic scattering is needed. This work deals with the first calculation of fixed moments of the NNLO heavy flavor corrections to the proton structure function F2 in the limit of a small charm-quark mass. The correct treatment of these terms will allow not only a more precise analysis of the HERA data, but starting from there also a more precise determination of the parton distribution functions and the strong coupling constant, which is an essential input for LHC physics. The complexity of this calculation requires the application and development of technical and mathematical methods, which are also explained here in detail.
This new approach to real analysis stresses the use of the subject with respect to applications, i.e., how the principles and theory of real analysis can be applied in a variety of settings in subjects ranging from Fourier series and polynomial approximation to discrete dynamical systems and nonlinear optimization. Users will be prepared for more intensive work in each topic through these applications and their accompanying exercises. This book is appropriate for math enthusiasts with a prior knowledge of both calculus and linear algebra.
Quantum mechanics forms the foundation of all modern physics, including atomic, nuclear, and molecular physics, the physics of the elementary particles, and condensed matter physics. Modern astrophysics also relies heavily on quantum mechanics. Quantum theory is needed to understand the basis for new materials, new devices, the nature of light coming from stars, the laws which govern the atomic nucleus, and the physics of biological systems. As a result, the subject of this book is a required course for most physics graduate students. While there are many books on the subject, this book targets specifically graduate students and it is written with modern advances in various fields in mind. Many examples treated in the various chapters as well as the emphasis of the presentation in the book are designed from the perspective of such problems. For example, the book begins by putting the Schrödinger equation on a spatially discrete lattice, and the continuum limit is also discussed, inspired by Hamiltonian lattice gauge theories. The latter and advances in quantum simulations motivated the inclusion of the path integral formulation. This formulation is applied to the imaginary-time evolution operator to project the exact ground state of the harmonic oscillator as is done in quantum simulations. As an example of how to take advantage of symmetry in quantum mechanics, one-dimensional periodic potentials are discussed, inspired by condensed matter physics. Atoms and molecules are discussed within a mean-field-like treatment (Hartree-Fock), along with how to go beyond it. Motivated by the recent intense activity in condensed matter and atomic physics to study the Hubbard model, the electron correlations in the hydrogen molecule are taken into account by solving the two-site Hubbard model analytically.
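The lattice formulation mentioned above can be illustrated in a few lines: discretizing the one-dimensional Schrödinger equation with a central difference turns the Hamiltonian into a tridiagonal matrix whose eigenvalues approximate the continuum spectrum. This is a generic illustrative sketch (harmonic potential, hbar = m = 1), not code from the book.

```python
import numpy as np

# Discretize -(1/2) psi'' + V psi = E psi on N lattice points with spacing a.
# The kinetic term becomes tridiagonal via the second-order central difference;
# V(x) = x^2 / 2 is an illustrative harmonic potential.
N, L = 400, 20.0
a = L / N
x = (np.arange(N) - N / 2) * a
V = 0.5 * x**2

# H[i,i] = 1/a^2 + V_i,  H[i,i+1] = H[i,i-1] = -1/(2 a^2)
H = np.diag(1.0 / a**2 + V)
H += np.diag(-0.5 / a**2 * np.ones(N - 1), 1)
H += np.diag(-0.5 / a**2 * np.ones(N - 1), -1)

E = np.linalg.eigvalsh(H)
# The low-lying spectrum approaches the continuum values 1/2, 3/2, 5/2, ...
print(E[:3])
```

Refining the lattice (smaller a) drives the low-lying eigenvalues toward the exact continuum result, which is precisely the continuum limit the blurb alludes to.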
Using the canonical Hamiltonian quantization of quantum electrodynamics, the photons emerge as the quanta of the normal modes, in the same way as the phonons emerge in the treatment of the normal modes of the coupled array of atoms. This is used later to treat the interaction of radiation with atomic matter.
"Stochastic Tools in Mathematics and Science" covers basic stochastic tools used in physics, chemistry, engineering and the life sciences. The topics covered include conditional expectations, stochastic processes, Brownian motion and its relation to partial differential equations, Langevin equations, the Liouville and Fokker-Planck equations, as well as Markov chain Monte Carlo algorithms, renormalization, basic statistical mechanics, and generalized Langevin equations and the Mori-Zwanzig formalism. The applications include sampling algorithms, data assimilation, prediction from partial data, spectral analysis, and turbulence. The book is based on lecture notes from a class that has attracted graduate and advanced undergraduate students from mathematics and from many other science departments at the University of California, Berkeley. Each chapter is followed by exercises. The book will be useful for scientists and engineers working in a wide range of fields and applications. For this new edition the material has been thoroughly reorganized and updated, and new sections on scaling, sampling, filtering and data assimilation, based on recent research, have been added. There are additional figures and exercises. Review of earlier edition: "This is an excellent concise textbook which can be used for self-study by graduate and advanced undergraduate students and as a recommended textbook for an introductory course on probabilistic tools in science." Mathematical Reviews, 2006
This book focuses on the development of a theory of info-dynamics to support the theory of info-statics in the general theory of information. It establishes the rational foundations of information dynamics and how these foundations relate to the general socio-natural dynamics from the primary to the derived categories in the universal existence and from the potential to the actual in the ontological space. It also shows how these foundations relate to the general socio-natural dynamics from the potential to the possible to give rise to the possibility space with possibilistic thinking; from the possible to the probable to give rise to the probability space with probabilistic thinking; and from the probable to the actual to give rise to the space of knowledge with paradigms of thought in the epistemological space. The theory is developed to explain the general dynamics through various transformations in quality-quantity space in relation to the nature of information flows at each variety transformation. The theory explains the past-present-future connectivity of the evolving information structure in a manner that illuminates the transformation problem and its solution in the never-ending information production within matter-energy space under socio-natural technologies, connecting to the theory of info-statics, which in turn presents explanations to the transformation problem and its solution. The theoretical framework is developed with analytical tools based on the principle of opposites, systems of actual-potential polarities, and negative-positive dualities under different time-structures, with the use of category theory, the fuzzy paradigm of thought and game theory in the fuzzy-stochastic cost-benefit space. The rational foundations are enhanced with categorial analytics.
The value of the theory of info-dynamics is demonstrated in the explanatory and prescriptive structures of the transformations of varieties and categorial varieties at each point of time and over time from parent-offspring sequences. It constitutes a general explanation of dynamics of information-knowledge production through info-processes and info-processors induced by a socio-natural infinite set of technologies in the construction-destruction space.
With this thesis the author contributes to the development of a non-mainstream but long-standing approach to electroweak symmetry breaking based on an analogy with superconductivity. Electroweak symmetry breaking is assumed to be caused by dynamically generated masses of typical fermions, i.e., of quarks and leptons, which in turn assumes a new dynamics between quarks and leptons. Primarily, this dynamics is designed to generate fermion masses; electroweak symmetry breaking then follows as an automatic consequence. After a summary of the topic, the first main part of the thesis addresses the question as to whether the masses of known quarks and leptons provide sufficiently strong sources of electroweak symmetry breaking. It is demonstrated that neutrino masses subject to the seesaw mechanism are indispensable ingredients. The other two parts of the thesis are dedicated to the presentation of two particular models: The first model is based on the new strong Yukawa dynamics and serves as a platform for studying the ability to reproduce fermion masses. The second, more realistic model introduces a flavor gauge dynamics and its phenomenological consequences are studied. Even though this type of model attracted only limited interest in the past, following the discovery of the Standard-Model-like Higgs particle it is regaining its relevance.
Gauge Field theory in Natural Geometric Language addresses the need to clarify basic mathematical concepts at the crossroad between gravitation and quantum physics. Selected mathematical and theoretical topics are exposed within a brief, integrated approach that exploits standard and non-standard notions, as well as recent advances, in a natural geometric language in which the role of structure groups can be regarded as secondary even in the treatment of the gauge fields themselves. In proposing an original bridge between physics and mathematics, this text will appeal not only to mathematicians who wish to understand some of the basic ideas involved in quantum particle physics, but also to physicists who are not satisfied with the usual mathematical presentations of their field.
Quantum trajectory theory is largely employed in theoretical quantum optics and quantum open system theory and is closely related to the conceptual formalism of quantum mechanics (quantum measurement theory). However, even research articles show that not all the features of the theory are well known or completely exploited. We wrote this monograph mainly for researchers in theoretical quantum optics and related fields with the aim of giving a self-contained and solid presentation of a part of quantum trajectory theory (the diffusive case) together with some significant applications (mainly with purposes of illustration of the theory, but which in part have been recently developed). Another aim of the monograph is to introduce post-graduate or PhD students to this subject. To help them, in the most mathematical and conceptual chapters, summaries are given to fix ideas. Moreover, as stochastic calculus is usually not in the background of studies in physics, we added Appendix A to introduce these concepts. The book is written also for mathematicians with interests in quantum theories. Quantum trajectory theory is a piece of modern theoretical physics which needs an interplay of various mathematical subjects, such as functional analysis and probability theory (stochastic calculus), and offers to mathematicians a beautiful field for applications, giving suggestions for new mathematical developments.
This book presents the proceedings of the international conference Particle Systems and Partial Differential Equations I, which took place at the Centre of Mathematics of the University of Minho, Braga, Portugal, from the 5th to the 7th of December, 2012. The purpose of the conference was to bring together world leaders to discuss their topics of expertise and to present some of their latest research developments in those fields. Among the participants were researchers in probability, partial differential equations and kinetic theory. The aim of the meeting was to present to a varied public the subject of interacting particle systems, its motivation from the viewpoint of physics and its relation with partial differential equations or kinetic theory, and to stimulate discussions and possibly new collaborations among researchers with different backgrounds. The book contains lecture notes written by François Golse on the derivation of hydrodynamic equations (compressible and incompressible Euler and Navier-Stokes) from the Boltzmann equation, and several short papers written by some of the participants in the conference. Among the topics covered by the short papers are hydrodynamic limits; fluctuations; phase transitions; motions of shocks and anti-shocks in exclusion processes; large number asymptotics for systems with self-consistent coupling; quasi-variational inequalities; unique continuation properties for PDEs and others. The book will benefit probabilists, analysts and mathematicians who are interested in statistical physics, stochastic processes, partial differential equations and kinetic theory, along with physicists.
This book, based on a graduate course given by the authors, is a pedagogic and self-contained introduction to the renormalization group with special emphasis on the functional renormalization group. The functional renormalization group is a modern formulation of the Wilsonian renormalization group in terms of formally exact functional differential equations for generating functionals. In Part I the reader is introduced to the basic concepts of the renormalization group idea, requiring only basic knowledge of equilibrium statistical mechanics. More advanced methods, such as diagrammatic perturbation theory, are introduced step by step. Part II then gives a self-contained introduction to the functional renormalization group. After a careful definition of various types of generating functionals, the renormalization group flow equations for these functionals are derived. This procedure is shown to encompass the traditional method of the mode elimination steps of the Wilsonian renormalization group procedure. Then, approximate solutions of these flow equations using expansions in powers of irreducible vertices or in powers of derivatives are given. Finally, in Part III the exact hierarchy of functional renormalization group flow equations for the irreducible vertices is used to study various aspects of non-relativistic fermions, including the so-called BCS-BEC crossover, thereby making the link to contemporary research topics.
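The "formally exact functional differential equations" mentioned above have a compact standard form. As general background (the widely used Wetterich form of the exact flow equation, stated here from the literature rather than quoted from the book), with scale-dependent effective average action Γ_k, regulator R_k and second functional derivative Γ_k^{(2)}:

```latex
\partial_k \Gamma_k[\phi] \;=\; \frac{1}{2}\,\mathrm{Tr}\!\left[
  \left(\Gamma_k^{(2)}[\phi] + R_k\right)^{-1} \partial_k R_k \right] .
```

Truncating this exact hierarchy, for example by expanding in powers of irreducible vertices or in derivatives, yields the approximate flow equations discussed in Part II of the book.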
ELEMENTARY LINEAR ALGEBRA, 8E, INTERNATIONAL METRIC EDITION's clear, careful, and concise presentation of material helps you fully understand how mathematics works. The author balances theory with examples, applications, and geometric intuition for a complete, step-by-step learning system. To engage you in the material, a new design highlights the relevance of the mathematics and makes the book easier to read. Data and applications reflect current statistics and examples, demonstrating the link between theory and practice. The companion website LarsonLinearAlgebra.com offers free access to multiple study tools and resources. CalcChat.com offers free step-by-step solutions to the odd-numbered exercises in the text.
This thesis presents a study of the scalar sector in the standard model (SM), as well as various searches for an extended scalar sector in theories beyond the SM (BSM). The first part of the thesis details the search for an SM Higgs boson decaying to taus, and produced by gluon fusion, vector boson fusion, or associated production with a vector boson, leading to evidence for decays of the Higgs boson to taus. In turn, the second part highlights several searches for an extended scalar sector, with scalar boson decays to taus. In all of the analyses presented, at least one scalar boson decays to a pair of taus. The results draw on data collected by the Compact Muon Solenoid (CMS) detector during proton-proton collisions with a center-of-mass energy of 7 or 8 TeV.
This book provides a quantitative framework for the analysis of conflict dynamics and for estimating the economic costs associated with civil wars. The author develops modified Lotka-Volterra equations to model conflict dynamics, to yield realistic representations of battle processes, and to allow us to assess prolonged conflict traps. The economic costs of civil wars are evaluated with the help of two alternative methods: Firstly, the author employs a production function to determine how the destruction of human and physical capital stocks undermines economic growth in the medium term. Secondly, he develops a synthetic control approach, where the cost is obtained as the divergence of actual economic activity from a hypothetical path in the absence of civil war. The difference between the two approaches gives an indication of the adverse externalities impinging upon the economy in the form of institutional destruction. By using detailed time-series regarding battle casualties, local socio-economic indicators, and capital stock destruction during the Greek Civil War (1946-1949), a full-scale application of the above framework is presented and discussed.
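The modified Lotka-Volterra approach described above can be sketched with a minimal two-force attrition model: each side grows at an intrinsic rate and is attrited in proportion to encounters with the other side. All coefficients and initial force levels below are hypothetical illustrations, not the author's calibrated values for the Greek Civil War.

```python
import numpy as np

# Illustrative modified Lotka-Volterra conflict dynamics:
#   dx/dt = x * (r_x - a*y)   (side x: growth minus attrition by y)
#   dy/dt = y * (r_y - b*x)   (side y: growth minus attrition by x)
# integrated with a fixed-step classical Runge-Kutta (RK4) scheme.
def rk4_step(f, state, dt):
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

r_x, a, r_y, b = 0.05, 0.002, 0.04, 0.001   # hypothetical coefficients

def rhs(s):
    x, y = s
    return np.array([x * (r_x - a * y), y * (r_y - b * x)])

state = np.array([100.0, 80.0])             # hypothetical initial force levels
for _ in range(1000):
    state = rk4_step(rhs, state, 0.1)
print(state)  # force levels after 100 time units
```

Depending on the coefficients, such systems settle, oscillate, or lock into the prolonged attrition regimes that the book analyses as conflict traps.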