This book focuses on the modeling and mathematical analysis of stochastic dynamical systems, along with their simulation. The collected chapters review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as the mRNAs and proteins that take part in the biochemical reactions driving cellular processes. When trying to describe such biological processes, traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depends on whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of stochastic reaction-diffusion models, while in the latter, one can describe the processes using Markov jump processes and stochastic differential equations. Stochastic Processes, Multiscale Modeling, and Numerical Methods for Computational Cellular Biology will appeal to graduate students and researchers in the fields of applied mathematics, biophysics, and cellular biology.
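The Markov jump process framework mentioned above is commonly simulated with Gillespie's stochastic simulation algorithm. As an illustration (a minimal sketch, not taken from the book), here is a birth-death model of mRNA copy number, assuming constant production at rate k_prod and first-order degradation at rate k_deg per molecule:

```python
import random

def gillespie_birth_death(k_prod=10.0, k_deg=1.0, x0=0, t_max=50.0, seed=1):
    """Minimal Gillespie SSA for a birth-death process:
    production at rate k_prod, degradation at rate k_deg * x."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_max:
        rates = [k_prod, k_deg * x]      # propensity of each reaction
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)      # exponential waiting time to next event
        # choose which reaction fires, with probability proportional to its propensity
        if rng.random() * total < rates[0]:
            x += 1                       # production event
        else:
            x -= 1                       # degradation event
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death()
print(counts[-1])  # copy number fluctuates around k_prod / k_deg = 10
```

At low copy numbers like these, individual trajectories fluctuate strongly around the deterministic steady state, which is exactly the regime where the deterministic rate-equation description breaks down.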
A 30-article volume, introducing an active and attractive part of algebra that has gained much from its position at the crossroads of mathematics over the years. The papers stimulate the reader to consider and actively investigate the topics and problems they contain.
Quantum machine learning investigates how quantum computers can be used for data-driven prediction and decision making. The book summarises and conceptualises ideas of this relatively young discipline for an audience of computer scientists and physicists from graduate level upwards. It aims at providing a starting point for those new to the field, showcasing a toy example of a quantum machine learning algorithm and providing a detailed introduction to the two parent disciplines. For more advanced readers, the book discusses topics such as data encoding into quantum states, quantum algorithms and routines for inference and optimisation, as well as the construction and analysis of genuine "quantum learning models". A special focus lies on supervised learning and applications for near-term quantum devices.
This monograph aims to promote original mathematical methods to determine the invariant measure of two-dimensional random walks in domains with boundaries. Such processes arise in numerous applications and are of interest in several areas of mathematical research, such as Stochastic Networks, Analytic Combinatorics, and Quantum Physics. This second edition consists of two parts. Part I is a revised upgrade of the first edition (1999), with additional recent results on the group of a random walk. The theoretical approach given therein has been developed by the authors since the early 1970s. By using Complex Function Theory, Boundary Value Problems, Riemann Surfaces, and Galois Theory, completely new methods are proposed for solving functional equations of two complex variables, which can also be applied to characterize the Transient Behavior of the walks, as well as to find explicit solutions to the one-dimensional Quantum Three-Body Problem, or to tackle a new class of Integrable Systems. Part II borrows special case-studies from queueing theory (in particular, the famous problem of Joining the Shorter of Two Queues) and enumerative combinatorics (Counting, Asymptotics). Researchers and graduate students should find this book very useful.
Semidefinite programming (SDP) is one of the most exciting and active research areas in optimization. It has attracted, and continues to attract, researchers with very diverse backgrounds, including experts in convex programming, linear algebra, numerical optimization, combinatorial optimization, control theory, and statistics. This tremendous research activity has been prompted by the discovery of important applications in combinatorial optimization and control theory, the development of efficient interior-point algorithms for solving SDP problems, and the depth and elegance of the underlying optimization theory. The Handbook of Semidefinite Programming offers an advanced and broad overview of the current state of the field. It contains nineteen chapters written by leading experts on the subject. The chapters are organized in three parts: Theory, Algorithms, and Applications and Extensions.
This thesis addresses the surprising features of zero-temperature statics and dynamics of several spin glass models, including correlations between soft spins that arise spontaneously during avalanches, and the discovery of localized states that involve the presence of two-level systems. It also presents the only detailed historiographical research on the spin glass theory. Despite the extreme simplicity of their definition, spin glasses display a wide variety of non-trivial behaviors that are not yet fully understood. In this thesis the author sheds light on some of these, focusing on both the search for phase transitions under perturbations of Hamiltonians and the zero-temperature properties and responses to external stimuli. After introducing spin glasses and useful concepts on phase transitions and numerics, the results of two massive Monte Carlo campaigns on three-dimensional systems are presented: The first of these examines the de Almeida-Thouless transition, and proposes a new finite-size scaling ansatz, which accelerates the convergence to the thermodynamic limit. The second reconstructs the phase diagram of the Heisenberg spin glass with random exchange anisotropy.
FOAM. This acronym has been used for over fifty years at Rensselaer to designate an upper-division course entitled Foundations of Applied Mathematics. This course was started by George Handelman in 1956, when he came to Rensselaer from the Carnegie Institute of Technology. His objective was to closely integrate mathematical and physical reasoning, and in the process enable students to obtain a qualitative understanding of the world we live in. FOAM was soon taken over by a young faculty member, Lee Segel. About this time a similar course, Introduction to Applied Mathematics, was introduced by Chia-Ch'iao Lin at the Massachusetts Institute of Technology. Together Lin and Segel, with help from Handelman, produced one of the landmark textbooks in applied mathematics, Mathematics Applied to Deterministic Problems in the Natural Sciences. This was originally published in 1974, and republished in 1988 by the Society for Industrial and Applied Mathematics, in their Classics Series. This textbook comes from the author teaching FOAM over the last few years. In this sense, it is an updated version of the Lin and Segel textbook.
The study of shape optimization problems encompasses a wide spectrum of academic research with numerous applications to the real world. In this work these problems are treated from both the classical and modern perspectives and target a broad audience of graduate students in pure and applied mathematics, as well as engineers requiring a solid mathematical basis for the solution of practical problems.

Key topics and features:

* Presents a foundational introduction to shape optimization theory
* Studies certain classical problems: the isoperimetric problem and the Newton problem involving the best aerodynamical shape, as well as optimization problems over classes of convex domains
* Treats optimal control problems under a general scheme, giving a topological framework, a survey of "gamma"-convergence, and problems governed by ODEs
* Examines shape optimization problems with Dirichlet and Neumann conditions on the free boundary, along with the existence of classical solutions
* Studies optimization problems for obstacles and eigenvalues of elliptic operators
* Poses several open problems for further research
* Substantial bibliography and index

Driven by good examples and illustrations and requiring only a standard knowledge of the calculus of variations, differential equations, and functional analysis, the book can serve as a text for a graduate course in computational methods of optimal design and optimization, as well as an excellent reference for applied mathematicians addressing functional shape optimization problems.
When a new, extraordinary and outstanding theory is stated, it has to face criticism and skepticism, because it goes beyond the usual concepts. Fractional calculus, though not new, was not discussed or developed for a long time, particularly for lack of application to real-life problems. It is extraordinary because it does not deal with 'ordinary' differential calculus. It is outstanding because it can now be applied to situations where existing theories fail to give satisfactory results. In this book not only are mathematical abstractions discussed in a lucid manner, with physical, mathematical and geometrical explanations, but several practical applications are also given, particularly for system identification and description and then for efficient control. Standard physical laws, such as those of transport theory, electrodynamics, equations of motion, elasticity, viscosity, and several others, are based on 'ordinary' calculus. In this book these physical laws are generalized in fractional calculus contexts, taking into account the effect of heterogeneity in a transport background, space containing traps or islands, irregular distributions of charges, a non-ideal spring with mass connected to a point-mass ball, materials exhibiting both viscous and elastic behavior, system relaxation with and without memory, and the physics of random delay in computer networks, among several others, thereby mapping the reality of nature closely. The concepts of fractional and complex order differentiation and integration are elaborated mathematically, physically and geometrically, with examples. The practical utility of local fractional differentiation for enhancing the character of singularity at a phase transition, or for characterizing the irregularity measure of a response function, is deliberated. Practical results of viscoelastic experiments, fractional order control experiments, the design of fractional controllers and practical circuit synthesis for fractional order elements are elaborated in this book.
The book also maps the theory of classical integer order differential equations to fractional calculus contexts, and deals in detail with the conflicting and demanding initialization issues required in classical techniques. The book presents a modern approach to solving the 'solvable' system of fractional and other differential equations, linear and non-linear, without perturbation or transformations, but by applying the physical principle of action-and-opposite-reaction, giving 'approximately exact' series solutions. Historically, Sir Isaac Newton and Gottfried Wilhelm Leibniz independently discovered calculus in the middle of the 17th century. In recognition of this remarkable discovery, J. von Neumann remarked, "...the calculus was the first achievement of modern mathematics and it is difficult to overestimate its importance. I think it defines more unequivocally than anything else the inception of modern mathematical analysis which is its logical development, still constituting the greatest technical advance in exact thinking." The 21st century has thus started to 'think exactly' for advancement in science and technology through the growing application of fractional calculus, and this century has started speaking the language which nature understands best.
This book explores computational fluid dynamics in the context of the human nose, allowing readers to gain a better understanding of its anatomy and physiology, and integrating recent advances in clinical rhinology, otolaryngology and respiratory physiology research. It focuses on advanced research topics, such as virtual surgery, AI-assisted clinical applications and therapy, as well as the latest computational modeling techniques, controversies, challenges and future directions in simulation using CFD software. Presenting perspectives and insights from computational experts and clinical specialists (ENT) combined with technical details of the computational modeling techniques from engineers, this unique reference book will give direction to and inspire future research in this emerging field.
From Catastrophe to Chaos: A General Theory of Economic Discontinuities presents an unusual perspective on economics and economic analysis. Current economic theory largely depends upon assuming that the world is fundamentally continuous. However, an increasing amount of economic research has been done using approaches that allow for discontinuities, such as catastrophe theory, chaos theory, synergetics, and fractal geometry. The spread of such approaches across a variety of disciplines of thought has constituted a virtual intellectual revolution in recent years. This book reviews the applications of these approaches in various subdisciplines of economics and draws upon past economic thinkers to develop an integrated view of economics as a whole from the perspective of inherent discontinuity.
Reviews of Plasma Physics, Volume 22, contains two reviews. The first, Cooperative Effects in Plasmas, by the late B. B. Kadomtsev, is based on the second edition of the author's book in Russian, which originated from his written lectures for students of the Moscow Institute of Physics and Technology. Kadomtsev intended to publish the book in English and even initiated the translation himself. The book represents a review of the typical plasma cooperative phenomena that determine the behavior of laboratory and astrophysical plasmas, and is characterized by lively language. The first three sections of the review deal with linear and nonlinear phenomena in fluids without a magnetic field. An additional subsection, 'Solitons', has been added to the third section. The next two sections address regular nonlinear phenomena in a plasma in a magnetic field. The second review, by S. V. Bulanov et al., is connected with the contents of the first. The physics of the laser-plasma interaction, including such nonlinear processes as wave breaking, the acceleration of charged particles, electromagnetic wave self-focusing, and relativistic soliton and vortex generation, is considered analytically and illustrated using computer simulations.
In this book, applications of cooperative game theory that arise from combinatorial optimization problems are described. It is well known that the mathematical modeling of various real-world decision-making situations gives rise to combinatorial optimization problems. For situations where more than one decision-maker is involved, classical combinatorial optimization theory does not suffice, and it is here that cooperative game theory can make an important contribution. If a group of decision-makers decide to undertake a project together in order to increase the total revenue or decrease the total costs, they face two problems. The first is how to execute the project in an optimal way so as to increase revenue. The second is how to divide the revenue attained among the participants. It is with this second problem that cooperative game theory can help: its solution concepts can be applied to arrive at revenue allocation schemes. In this book the types of problems described above are examined. Although the choice of topics is application-driven, the book also discusses theoretical questions that arise from the situations studied. For all the games described, attention is paid to the appropriateness of several game-theoretic solution concepts in the particular contexts considered. The computational complexity of the game-theoretic solution concepts in the situation at hand is also considered.
This book focuses on solving different types of time-varying problems. It presents various Zhang dynamics (ZD) models by defining various Zhang functions (ZFs) in real and complex domains. It then provides theoretical analyses of such ZD models and illustrates their results. It also uses simulations to substantiate their efficacy and show the feasibility of the presented ZD approach (i.e., different ZFs leading to different ZD models), which is further applied to the repetitive motion planning (RMP) of redundant robots, showing its application potential.
This book represents a milestone in the progression of Data Envelopment Analysis (DEA). It is the first reference text which includes a comprehensive review and comparative discussion of the basic DEA models. The development is anchored in a unified mathematical and graphical treatment and includes the most important modeling extensions. In addition, this is the first book that addresses the actual process of conducting DEA analyses, including combining DEA and parametric techniques. The book has three other distinctive features. It traces the applications-driven evolution and diffusion of DEA models and extensions across disciplinary boundaries. It includes a comprehensive bibliography to serve as a source of references as well as a platform for further developments. And, finally, the power of DEA analysis is demonstrated through fifteen novel applications which should serve as an inspiration for future applications and extensions of the methodology. The origin of this book was a Conference on New Uses of DEA in Management and Public Policy, which was held at the IC2 Institute of the University of Texas at Austin on September 27-29, 1989. The conference was made possible through NSF Grant #SES-8722504 (A. Charnes and W. W. Cooper, co-PIs) and the support of the IC2 Institute.
Dynamic Modeling for Business Management applies dynamic modeling to business management, using accessible modeling techniques that are demonstrated starting with fundamental processes and advancing to more complex business models. Discussions of modeling emphasize its practical use for decision making and implementing change for measurable results. Readers will learn about both manufacturing and service-oriented business processes using hands-on lessons. They will then be able to manipulate additional models to try out their knowledge and address issues specific to their own businesses and interests. All of the models used in the book along with demo versions of ithink and Berkeley Madonna software are included with the book on a CD-ROM. Some of the topics covered include workflow management, supply-chain management, and business strategy.
This book introduces new techniques for cellular image feature extraction, pattern recognition and classification. The authors use antinuclear antibodies (ANAs) in patient serum as the subjects and the Indirect Immunofluorescence (IIF) technique as the imaging protocol to illustrate the applications of the described methods. Throughout the book, the authors provide evaluations of the proposed methods on two publicly available human epithelial (HEp-2) cell datasets: the ICPR2012 dataset from the ICPR'12 HEp-2 cell classification contest and the ICIP2013 training dataset from the ICIP'13 Competition on cells classification by fluorescent image analysis. The reading of imaging results is significantly influenced by the reader's qualifications and the reading system, causing high intra- and inter-laboratory variance. The authors present a low-order LP21 fiber mode for optical single-cell manipulation and for imaging staining patterns of HEp-2 cells. A focused four-lobed mode distribution is stable and effective in optical tweezer applications, including selective cell pick-up, pairing, grouping or separation, as well as rotation of cell dimers and clusters. Both the translational dragging force and the rotational torque in the experiments are in good accordance with the theoretical model. With a simple all-fiber configuration and low peak irradiation of targeted cells, instrumentation of this optical chuck technology will provide a powerful tool in ANA-IIF laboratories. Chapters focus on the optical, mechanical and computing systems for the clinical trials; computer programs for the GUI and control of the optical tweezers are also discussed. Local features are then transformed into more discriminative local distance vectors by searching for local neighbors of each local feature in the class-specific manifolds. Encoding and pooling the local distance vectors leads to a salient image representation; combined with traditional coding methods, this approach achieves higher classification accuracy.
Then, a rotation-invariant textural feature, Pairwise Local Ternary Patterns with Spatial Rotation Invariance (PLTP-SRI), is examined. It is invariant to image rotations, while remaining robust to noise and weak illumination. By adding a spatial pyramid structure, this method captures spatial layout information. While the proposed PLTP-SRI feature extracts local features, the BoW framework builds a global image representation; it is reasonable to combine the two to achieve impressive classification performance, as the combined feature takes advantage of both kinds of features in different aspects. Finally, the authors design a Co-occurrence Differential Texton (CoDT) feature to represent the local image patches of HEp-2 cells. The CoDT feature reduces information loss by avoiding quantization, while it utilizes the spatial relations among the differential micro-texton features, thereby increasing the discriminative power. A generative model adaptively characterizes the CoDT feature space of the training data. Furthermore, a discriminative representation of the HEp-2 cell images is derived from the adaptively partitioned feature space, so the resulting representation is adapted to the classification task. By cooperating with a linear Support Vector Machine (SVM) classifier, this framework can exploit the advantages of both generative and discriminative approaches for cellular image classification. The book is written for researchers who would like to develop their own programs, and working MATLAB code is included for all the important algorithms presented. It can also be used as a reference book for graduate students and senior undergraduates in the areas of biomedical imaging, image feature extraction, pattern recognition and classification. Academics, researchers, and professionals will find this to be an exceptional resource.
Digital Noise Monitoring of Defect Origin is for both academics and professionals in the fields of engineering, biological sciences, physical science, and automation with particular emphasis on power engineering, oil-and-gas extraction, and aviation among others. The focus of the book is on determining defect origins. The author divides the process into the stages of monitoring the defect origin, identification of the defect and its stages, and control of the defect. The significance of this work is also connected to the possibility of using the noise as a data carrier for creating technologies that detect the initial stage of changes in objects.
This book presents methodologies for analysing large data sets produced by the direct numerical simulation (DNS) of turbulence and combustion. It describes the development of models that can be used to analyse large eddy simulations, and highlights both the most common techniques and newly emerging ones. The chapters, written by internationally respected experts, invite readers to consider DNS of turbulence and combustion from a formal, data-driven standpoint, rather than one led by experience and intuition. This perspective allows readers to recognise the shortcomings of existing models, with the ultimate goal of quantifying and reducing model-based uncertainty. In addition, recent advances in machine learning and statistical inferences offer new insights on the interpretation of DNS data. The book will especially benefit graduate-level students and researchers in mechanical and aerospace engineering, e.g. those with an interest in general fluid mechanics, applied mathematics, and the environmental and atmospheric sciences.
This thesis presents the measurement of the Higgs boson cross section in the diphoton decay channel. The measurement relies on proton-proton collision data at a center-of-mass energy of √s = 13 TeV recorded by the ATLAS experiment at the Large Hadron Collider (LHC). The collected data correspond to the full Run-2 dataset with an integrated luminosity of 139 fb⁻¹. The measured cross sections are used to constrain anomalous Higgs boson interactions in the Effective Field Theory (EFT) framework. The results presented in this thesis represent a reduction by a factor of 2 of the various photon and jet energy scale and resolution systematic uncertainties with respect to the previous ATLAS publication. The thesis details the calibration of electron and photon energies in ATLAS, in particular the measurement of the presampler energy scale and the estimation of its systematic uncertainty. This calibration was used to perform a measurement of the Higgs boson mass in the H → γγ and H → 4ℓ channels using the 36 fb⁻¹ dataset.
This book calls attention to the social dimension of economics and stresses the need for an ethical yardstick, which can only be provided by an interdisciplinary approach to the economy: socio-economics. Current thought claims to account for ethics by portraying economics as both positive and normative. The positive aspect of economics is based on observable fact. This is typified by the neoclassical school, which takes as its main premise the hypothesis of individual economic rationality, resulting in economic decisions based on efficient market processes. The normative aspect of economics involves value judgements. Economic theory acknowledges that economic agents are free to express value judgements, and if one point of view is to prevail, it can only be the majority view based on an equitable democratic process. Accordingly, strict application of the principles of the market economy and political democracy should eliminate the need for a separate ethical approach to economics. Despite this conclusion, recent years have witnessed the need to introduce ethical considerations into economics. For one, the distinction between the normative and positive aspects of economics, and their linking with politics and economics respectively, are gross oversimplifications. In addition, numerous market failures make it difficult to accept efficient markets as the determinant of all economic decisions. Finally, democratic processes are hard to maintain due to society's inability to develop a sense of solidarity. This book provides the 'ethical yardstick' necessary in analyzing economic decisions, institutions and policies, discussed in relation to environmental protection.
"Decision Systems and Non-stochastic Randomness" is the first systematic presentation and mathematical formalization (including existence theorems) of the statistical regularities of non-stochastic randomness. The results presented in this book extend the capabilities of probability theory by providing mathematical techniques that allow for the description of uncertain events that do not fit standard stochastic models. The book demonstrates how non-stochastic regularities can be incorporated into decision theory and information theory, offering an alternative to the subjective probability approach to uncertainty and the unified approach to the measurement of information. This book is intended for statisticians, mathematicians, engineers, economists or other researchers interested in non-stochastic modeling and decision theory.
This volume presents an eclectic mix of original research articles in areas covering the analysis of ordered data, stochastic modeling and biostatistics. These areas were featured in a conference held at the University of Texas at Dallas from March 7 to 9, 2014 in honor of Professor H. N. Nagaraja's 60th birthday and his distinguished contributions to statistics. The articles were written by leading experts who were invited to contribute to the volume from among the conference participants. The volume is intended for all researchers with an interest in order statistics, distribution theory, analysis of censored data, stochastic modeling, time series analysis, and statistical methods for the health sciences, including statistical genetics.
In the fall of 1992, Professor Dr. M. Altar, chairman of the newly established department of Management with Computer Science at the Romanian-American University in Bucharest (a private university), introduced in the curriculum a course on Differential Equations and Optimal Control, asking us to teach such a course. It was an interesting challenge, since for the first time we had to teach such a mathematical course for students with economic background and interests. It was a natural idea to start by looking at economic models which were described by differential equations and for which problems in decision making did arise. Since many such models were described in discrete time, we decided to develop in parallel the theory of differential equations and that of discrete-time systems, and also control theory in continuous and discrete time. The present book is the result of our teaching experience with this course. It is an enlarged version of the actual lectures where, depending on the background of the students, not all proofs could be given in detail. We would like to express our gratitude to the Board of the Romanian-American University, personally to the Rector, Professor Dr. Ion Smedescu, for support, encouragement and readiness to accept advanced ideas in the curriculum. The authors express their warmest thanks to Mrs. Monica Stan-Necula for the excellent processing of the manuscript.
This book traces the life of Cholesky (1875-1918) and gives his family history. After an introduction to topography, an English translation of an unpublished paper by him, in which he explained his method for linear systems, is given, studied and placed in its historical context. His other works, including two books, are also described, as well as his involvement in teaching at a superior school by correspondence. The story of this school and its founder, Leon Eyrolles, is addressed. Then, an important unpublished book of Cholesky on graphical calculation is analyzed in detail and compared to similar contemporary publications. A biography of Ernest Benoit, who wrote the first paper in which Cholesky's method is explained, is provided. Various documents, highlighting the life and personality of Cholesky, end the book.
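Cholesky's method for linear systems, now known as the Cholesky decomposition, factors a symmetric positive-definite matrix A as A = L·Lᵀ with L lower-triangular, after which the system Ax = b reduces to two triangular solves. A minimal illustrative sketch (not taken from the book):

```python
def cholesky(A):
    """Factor a symmetric positive-definite matrix A as A = L * L^T,
    with L lower-triangular (Cholesky's method)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = (A[i][i] - s) ** 0.5   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky(A)
print(L)  # → [[2.0, 0.0], [1.0, 1.4142135623730951]]
```

Because only the lower triangle is computed and each entry is used as soon as it is formed, the method needs roughly half the arithmetic of Gaussian elimination on a symmetric system, which is why it was valuable for the hand computations of Cholesky's geodetic work.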