Most networks and databases that humans have to deal with contain a large, albeit finite, number of units. Their structure, which maintains the functional consistency of the components, is essentially non-random and calls for a precise quantitative description of the relations between nodes (or data units) and all network components. This book is an introduction, for both graduate students and newcomers to the field, to the theory of graphs and random walks on such graphs. It reviews methods based on random walks and diffusions for exploring the structure of finite connected graphs and databases (Markov chain analysis). This provides the necessary basis for a consistent discussion of applications as diverse as electric resistance networks, estimation of land prices, urban planning, linguistic databases, music, and gene expression regulatory networks.
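As a minimal illustration of the Markov chain analysis the blurb mentions (the 4-node graph below is a toy example of my own, not one from the book): the simple random walk on a connected, non-bipartite graph converges to a stationary distribution proportional to node degrees.

```python
import numpy as np

# Toy 4-node connected graph given as an adjacency matrix (hypothetical example).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Transition matrix of the simple random walk: P[i, j] = A[i, j] / deg(i).
deg = A.sum(axis=1)
P = A / deg[:, None]

# For an undirected graph the stationary distribution is degree-proportional:
# pi_i = deg(i) / sum of degrees.
pi = deg / deg.sum()
assert np.allclose(pi @ P, pi)  # pi is a left eigenvector of P with eigenvalue 1

# Iterating the chain from any start converges to pi (the graph is connected
# and non-bipartite, so the chain is ergodic).
p = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(200):
    p = p @ P
print(np.round(p, 4))  # → [0.2 0.3 0.3 0.2]
```

The degree-proportional stationary law is what makes random walks useful as structure probes: hitting and commute times computed from this chain encode the graph's effective resistances.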
This book provides a quantitative framework for the analysis of conflict dynamics and for estimating the economic costs associated with civil wars. The author develops modified Lotka-Volterra equations to model conflict dynamics, to yield realistic representations of battle processes, and to allow us to assess prolonged conflict traps. The economic costs of civil wars are evaluated with the help of two alternative methods: first, the author employs a production function to determine how the destruction of human and physical capital stocks undermines economic growth in the medium term; second, he develops a synthetic control approach, in which the cost is obtained as the divergence of actual economic activity from a hypothetical path in the absence of civil war. The difference between the two approaches gives an indication of the adverse externalities impinging upon the economy in the form of institutional destruction. Using detailed time series on battle casualties, local socio-economic indicators, and capital stock destruction during the Greek Civil War (1946-1949), a full-scale application of the above framework is presented and discussed.
This unified volume is a collection of invited chapters presenting recent developments in the field of data analysis, with applications to reliability and inference, data mining, bioinformatics, lifetime data, and neural networks. The book is a useful reference for graduate students, researchers, and practitioners in statistics, mathematics, engineering, economics, social science, bioengineering, and bioscience.
Existing books on infinite-dimensional complex analysis focus on problems in locally convex spaces. This book is the first to cover the theory without the convexity condition, showing that this is a genuinely new, important and interesting field.
This book is about the role of knowledge in information systems. Knowledge is usually articulated and exchanged through human language(s). In this sense, language can be seen as the most natural vehicle to convey our concepts, whose meanings are usually intermingled, grouped and organized according to shared criteria, from simple perceptions ("every tree has a stem") and common sense ("unsupported objects fall") to complex social conventions ("a tax is a fee charged by a government on a product, income, or activity"). But what is natural for a human being turns out to be extremely difficult for machines: machines need to be instilled with knowledge and suitably equipped with logical and statistical algorithms to reason over it. Computers cannot represent the external world and communicate their representations as effectively as humans do; ontologies and NLP have been invented to address this problem. In particular, integrating ontologies with (possibly multilingual) computational lexical resources is an essential requirement for making human meanings understandable by machines. This book explores the advancements in this integration, from the most recent steps in building the necessary infrastructure, i.e. the Semantic Web, to the different knowledge contents that can be analyzed, encoded and transferred through it (multimedia, emotions, events, etc.). The work aims at presenting the progress in the field of integrating ontologies and lexicons: together, they constitute the essential technology for adequately representing, eliciting and exchanging knowledge contents in information systems, web services, text processing and several other domains of application.
Neutrinos continue to be the most mysterious and, arguably, the most fascinating particles of the Standard Model as their intrinsic properties such as absolute mass scale and CP properties are unknown. The open question of the absolute neutrino mass scale will be addressed with unprecedented accuracy by the Karlsruhe Tritium Neutrino (KATRIN) experiment, currently under construction. This thesis focusses on the spectrometer part of KATRIN and background processes therein. Various background sources such as small Penning traps, as well as nuclear decays from single radon atoms are fully characterized here for the first time. Most importantly, however, it was possible to reduce the background in the spectrometer by more than five orders of magnitude by eliminating Penning traps and by developing a completely new background reduction method by stochastically heating trapped electrons using electron cyclotron resonance (ECR). The work beautifully demonstrates that the obstacles and challenges in measuring the absolute mass scale of neutrinos can be met successfully if novel experimental tools (ECR) and novel computing methods (KASSIOPEIA) are combined to allow almost background-free tritium β-spectroscopy.
Many technological applications exploit a variety of magnetic structures, or magnetic phases, to produce and optimise solid-state functionality. However, most research advances are restricted to a reduced number of phases owing to computational and resource constraints. This thesis presents an ab-initio theory to efficiently describe complex magnetic phases and their temperature-dependent properties. The central assumption is that magnetic phases evolve slowly compared with the underlying electronic structure from which they emerge. By describing how the electronic structure adapts to the type and extent of magnetic order, a theory able to describe multi-spin correlations and their effect on the magnetism at finite temperature is obtained. It is shown that multi-spin correlations are behind the temperature and magnetic field dependence of the diverse magnetism in the heavy rare earth elements. Magnetically frustrated Mn-based materials and the effect of strain are also investigated. These studies demonstrate that the performance of solid-state refrigeration can be enhanced by multi-spin effects.
Covering a range of subjects from operator theory and classical harmonic analysis to Banach space theory, this book contains survey and expository articles by leading experts in their corresponding fields, and features fully refereed, high-quality papers exploring new results and trends in spectral theory, mathematical physics, geometric function theory, and partial differential equations. Graduate students and researchers in analysis will find inspiration in the articles collected in this volume, which emphasize the remarkable connections between harmonic analysis and operator theory. Another shared research interest of the contributors of this volume lies in the area of applied harmonic analysis, where a new notion called chromatic derivatives has recently been introduced in communication engineering. The material for this volume is based on the 13th New Mexico Analysis Seminar held at the University of New Mexico, April 3-4, 2014, and on several special sessions of the Western Spring Sectional Meeting at the University of New Mexico, April 4-6, 2014. During the event, participants honored the memory of Cora Sadosky, a great mathematician who had recently passed away and who made significant contributions to the field of harmonic analysis. Cora was an exceptional mathematician and human being. She was a world expert in harmonic analysis and operator theory, publishing over fifty-five research papers and authoring a major textbook in the field. Participants of the conference included new and senior researchers, recent doctorates, and leading experts in the area.
The aim of this book is to analyse historical problems related to the use of mathematics in physics as well as to the use of physics in mathematics, and to investigate "Mathematical Physics" as precisely the new discipline which is concerned with this dialectical link itself. So the main question is: "When and why did the tension between mathematics and physics, explicitly practised at least since Galileo, evolve into such a new scientific theory?" The authors explain the various ways in which this science allowed an advanced mathematical modelling in physics on the one hand, and the invention of new mathematical ideas on the other hand. Of course this problem is related to the links between institutions, universities, schools for engineers, and industries, and so it has social implications as well. The link by which physical ideas had influenced the world of mathematics was not new in the 19th century, but it came to a kind of maturity at that time. Recently, much historical research has been done into mathematics and physics and their relation in this period. The purpose of the Symposium and this book is to gather and re-evaluate the current thinking on this subject. It brings together contributions from leading experts in the field, and gives much-needed insight into the subject of mathematical physics from a historical point of view.
In 1917, Johann Radon published his fundamental work, in which he introduced what is now called the Radon transform. Including important contributions by several experts, this book reports on ground-breaking developments related to the Radon transform in the century since then, and also discusses novel mathematical research topics and applications for the next century.
The quantum and relativity theories of physics are considered to underpin all of science in an absolute sense. This monograph argues against this proposition primarily on the basis of the two theories' incompatibility and of some untenable philosophical implications of the quantum model. Elementary matter is assumed in both theories to occur as zero-dimensional point particles. In relativity theory this requires the space-like region of the underlying Minkowski space-time to be rejected as unphysical, despite its precise mathematical characterization. In quantum theory it leads to an incomprehensible interpretation of the wave nature of matter in terms of a probability function and the equally obscure concept of wave-particle duality. The most worrisome aspect about quantum mechanics as a theory of chemistry is its total inability, despite unsubstantiated claims to the contrary, to account for the fundamental concepts of electron spin, molecular structure, and the periodic table of the elements. A remedy of all these defects by reformulation of both theories as nonlinear wave models in four-dimensional space-time is described.
"Stochastic Tools in Mathematics and Science" covers basic stochastic tools used in physics, chemistry, engineering and the life sciences. The topics covered include conditional expectations, stochastic processes, Brownian motion and its relation to partial differential equations, Langevin equations, the Liouville and Fokker-Planck equations, as well as Markov chain Monte Carlo algorithms, renormalization, basic statistical mechanics, and generalized Langevin equations and the Mori-Zwanzig formalism. The applications include sampling algorithms, data assimilation, prediction from partial data, spectral analysis, and turbulence. The book is based on lecture notes from a class that has attracted graduate and advanced undergraduate students from mathematics and from many other science departments at the University of California, Berkeley. Each chapter is followed by exercises. The book will be useful for scientists and engineers working in a wide range of fields and applications. For this new edition the material has been thoroughly reorganized and updated, and new sections on scaling, sampling, filtering and data assimilation, based on recent research, have been added. There are additional figures and exercises. Review of earlier edition: "This is an excellent concise textbook which can be used for self-study by graduate and advanced undergraduate students and as a recommended textbook for an introductory course on probabilistic tools in science." Mathematical Reviews, 2006
This book is devoted to biased sampling problems (also called choice-based sampling in econometrics parlance) and over-identified parameter estimation problems. Biased sampling problems appear in many areas of research, including medicine, epidemiology and public health, the social sciences and economics. The book addresses a range of important topics, including case and control studies, causal inference, missing data problems, meta-analysis, renewal process and length-biased sampling problems, capture and recapture problems, case-cohort studies, exponential tilting genetic mixture models, and more. The goal of this book is to make it easier for Ph.D. students and new researchers to get started in this research area. It will be of interest to all those who work in the health, biological, social and physical sciences, as well as those who are interested in survey methodology and other areas of statistical science, among others.
This is the seventh volume in a series on the general topics of supersymmetry, supergravity, black objects (including black holes) and the attractor mechanism. The present volume is based on lectures held in March 2013 at the INFN-Laboratori Nazionali di Frascati during the Breaking of Supersymmetry and Ultraviolet Divergences in Extended Supergravity Workshop (BUDS 2013), organized by Stefano Bellucci, with the participation of prestigious speakers including P. Aschieri, E. Bergshoeff, M. Cederwall, T. Dennen, P. Di Vecchia, S. Ferrara, R. Kallosh, A. Karlsson, M. Koehn, B. Ovrut, A. Van Proeyen, and G. Ruppeiner. Special attention is devoted to discussing topics related to the cancellation of ultraviolet divergences in extended supergravity and Born-Infeld-like actions. All talks were followed by extensive discussions and subsequent reworking of the various contributions, a feature which is reflected in the unique "flavor" of this volume.
The book provides readers with an understanding of the mutual conditioning of spacetime, interactions and matter. The spacetime manifold is viewed as a reservoir for the parametrization of operation Lie groups or subgroup classes of Lie groups. With basic operation groups or Lie algebras, all physical structures can be interpreted in terms of corresponding realizations or representations, and physical properties are related to eigenvalues or invariants. An explicit example of operational spacetime, called electroweak spacetime, is proposed: it parametrizes the classes of the internal hypercharge-isospin group in the general linear group in two complex dimensions, i.e., the Lorentz cover group extended by the causal (dilation) and phase groups. Its representations and invariants are investigated with the aim of connecting them, qualitatively and numerically, with the properties of interactions and particles as they arise in the representations of its tangent Minkowski spaces.
The application of statistical methods to physics is essential. This unique book on statistical physics offers an advanced approach with numerous applications to the modern problems students are confronted with. The text therefore contains more concepts and methods in statistics than the student would need for statistical mechanics alone. Methods from mathematical statistics and stochastics for the analysis of data are discussed as well. The book is divided into two parts, focusing first on the modeling of statistical systems and then on the analysis of these systems. Problems with hints for solution help students to deepen their knowledge. The third edition has been updated and enlarged with new sections deepening the knowledge about data analysis. Moreover, a customized set of problems with solutions is accessible on the Web at extras.springer.com.
Quantum mechanics forms the foundation of all modern physics, including atomic, nuclear, and molecular physics, the physics of the elementary particles, and condensed matter physics. Modern astrophysics also relies heavily on quantum mechanics. Quantum theory is needed to understand the basis for new materials, new devices, the nature of light coming from stars, the laws which govern the atomic nucleus, and the physics of biological systems. As a result, the subject of this book is a required course for most physics graduate students. While there are many books on the subject, this book specifically targets graduate students and is written with modern advances in various fields in mind. Many examples treated in the various chapters, as well as the emphasis of the presentation in the book, are designed from the perspective of such problems. For example, the book begins by putting the Schroedinger equation on a spatial discrete lattice and the continuum limit is also discussed, inspired by Hamiltonian lattice gauge theories. The latter and advances in quantum simulations motivated the inclusion of the path integral formulation. This formulation is applied to the imaginary-time evolution operator to project the exact ground state of the harmonic oscillator as is done in quantum simulations. As an example of how to take advantage of symmetry in quantum mechanics, one-dimensional periodic potentials are discussed, inspired by condensed matter physics. Atoms and molecules are discussed within a mean-field-like treatment (Hartree-Fock), along with how to go beyond it. Motivated by the recent intense activity in condensed matter and atomic physics to study the Hubbard model, the electron correlations in the hydrogen molecule are taken into account by solving the two-site Hubbard model analytically.
Using the canonical Hamiltonian quantization of quantum electrodynamics, the photons emerge as the quanta of the normal modes, in the same way as the phonons emerge in the treatment of the normal modes of the coupled array of atoms. This is used later to treat the interaction of radiation with atomic matter.
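The two-site Hubbard model mentioned above does admit a closed-form solution, which can be cross-checked numerically. The values t = 1 and U = 4 below are illustrative, and a standard sign convention for the S_z = 0 basis is assumed (the spectrum does not depend on it).

```python
import numpy as np

# Two-site Hubbard model at half filling, hopping t and on-site repulsion U,
# written in the S_z = 0 basis {|up,dn>, |dn,up>, |updn,0>, |0,updn>}.
t, U = 1.0, 4.0
H = np.array([[0.0, 0.0, -t,  -t ],
              [0.0, 0.0,  t,   t ],
              [-t,   t,   U,  0.0],
              [-t,   t,  0.0,  U ]])

# Numerical ground-state energy versus the textbook closed form
# E0 = (U - sqrt(U^2 + 16 t^2)) / 2 obtained by diagonalizing the
# 2x2 block coupling the covalent singlet to the symmetric ionic state.
E0_numeric = np.linalg.eigvalsh(H).min()
E0_exact = (U - np.sqrt(U**2 + 16 * t**2)) / 2
print(E0_numeric, E0_exact)
assert np.isclose(E0_numeric, E0_exact)
```

The interpolation this formula captures, from the non-interacting limit E0 = -2t at U = 0 to the Heisenberg-like scale -4t**2/U at large U, is precisely how the model exposes electron correlation in the hydrogen molecule.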
With this thesis the author contributes to the development of a non-mainstream but long-standing approach to electroweak symmetry breaking based on an analogy with superconductivity. Electroweak symmetry breaking is assumed to be caused by dynamically generated masses of typical fermions, i.e., of quarks and leptons, which in turn presupposes a new dynamics between quarks and leptons. Primarily, this dynamics is designed to generate fermion masses, and electroweak symmetry breaking follows as an automatic consequence. After a summary of the topic, the first main part of the thesis addresses the question of whether the masses of known quarks and leptons provide sufficiently strong sources of electroweak symmetry breaking. It is demonstrated that neutrino masses subject to the seesaw mechanism are indispensable ingredients. The other two parts of the thesis are dedicated to the presentation of two particular models: the first is based on the new strong Yukawa dynamics and serves as a platform for studying the ability to reproduce fermion masses; the second, more realistic model introduces a flavor gauge dynamics, and its phenomenological consequences are studied. Although models of this type have attracted some interest in the past, they are regaining relevance following the discovery of the Standard-Model-like Higgs particle.
This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI 13) and Mathematical Methods from Brain Connectivity (MMBC 13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, in September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments in mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity, and offers new perspectives and insights on current research challenges for those already in the field. It will be of interest to researchers and practitioners in computer science, MR physics, and applied mathematics.
This book provides a comprehensive guide to analyzing and solving optimal design problems in continuous media by means of the so-called sub-relaxation method. Though the underlying ideas are borrowed from other, more classical approaches, here they are used and organized in a novel way, yielding a distinct perspective on how to approach this kind of optimization problem. Starting with a discussion of the background motivation, the book broadly explains the sub-relaxation method in general terms, helping readers to grasp, from the very beginning, the driving idea and where the text is heading. In addition to the analytical content of the method, it examines practical issues like optimality and numerical approximation. Though the primary focus is on the development of the method for the conductivity context, the book's final two chapters explore several extensions of the method to other problems, as well as formal proofs. The text can be used for a graduate course in optimal design, though the method requires some familiarity with the main analytical issues associated with this type of problem. This can be addressed with the help of the provided bibliography.
There is no recent elementary introduction to the theory of discrete dynamical systems that stresses the topological background of the topic. This book fills that gap: it treats the theory as 'applied general topology' and covers all the important concepts needed to understand the recent literature. The book is addressed primarily to graduate students. The prerequisites are modest: a certain mathematical maturity and a course in general topology are sufficient.
This new approach to real analysis stresses the use of the subject with respect to applications, i.e., how the principles and theory of real analysis can be applied in a variety of settings in subjects ranging from Fourier series and polynomial approximation to discrete dynamical systems and nonlinear optimization. Users will be prepared for more intensive work in each topic through these applications and their accompanying exercises. This book is appropriate for math enthusiasts with a prior knowledge of both calculus and linear algebra.
This book presents recent results on the modelling of space plasmas with Kappa distributions and their interpretation. Hot and dilute space plasmas most often do not reach thermal equilibrium, their dynamics being essentially conditioned by the kinetic effects of plasma particles, i.e., electrons, protons, and heavier ions. Deviations from thermal equilibrium shown by these plasma particles are often described by Kappa distributions. Although well known, these distributions remain controversial as tools for the statistical characterization and physical interpretation of non-equilibrium plasmas. The results of the Kappa modelling presented here mark significant progress with respect to all these aspects and open new perspectives for understanding the high-resolution data collected by the new generation of telescopes and spacecraft missions. The book is directed at the large plasma astrophysics community, including graduate students and specialists from associated disciplines, given the palette of proposed topics, reaching from applications to the solar atmosphere and the solar wind, via linear and quasilinear modelling of multi-species plasmas and the waves within them, to the fundamental physics of non-equilibrium plasmas.
Examining recent mathematical developments in the study of Fredholm operators, spectral theory and block operator matrices, with a rigorous treatment of classical Riesz theory of polynomially compact operators, this volume covers both abstract and applied developments in the study of spectral theory. These topics are intimately related to the stability of underlying physical systems and play a crucial role in many branches of mathematics as well as numerous interdisciplinary applications. By studying classical Riesz theory of polynomially compact operators in order to establish the existence results of the second kind operator equations, this volume will assist the reader working to describe the spectrum, multiplicities and localization of the eigenvalues of polynomially compact operators.