This book is about the drift, diffusion, and reaction of ions moving through gases under the influence of an external electric field, the gas temperature, and the number density. While this field was established late in the 19th century, experimental and theoretical studies of ion and electron swarms continue to be important in such varied fields as atomic and molecular physics, aeronomy and atmospheric chemistry, gaseous electronics, plasma processing, and laser physics. This book follows in the rigorous tradition of well-known older books on the subject, while at the same time providing a much-needed overview of modern developments with a focus on theory. Graduate students and researchers new to this field will find this book an indispensable guide, particularly those involved with ion mobility spectrometry and the use of ion transport coefficients to test and improve ab initio ion-neutral interaction potentials. Established researchers and academics will find in this book a modern companion to the classic references.
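For orientation, the low-field transport described above is usually summarized by the mobility relation (a standard definition from the ion-swarm literature, not a result specific to this book):

\[
\mathbf{v}_d = K\,\mathbf{E}, \qquad K_0 = K\,\frac{p}{p_0}\,\frac{T_0}{T}, \qquad p_0 = 760\ \text{Torr},\ \ T_0 = 273.15\ \text{K},
\]

where \(\mathbf{v}_d\) is the drift velocity, \(K\) the mobility, and \(K_0\) the reduced mobility quoted in ion mobility spectrometry; the ratio \(E/N\) of field strength to gas number density is the natural parameter for comparing experiments.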
This book demonstrates some of the ways in which Microsoft Excel® may be used to solve numerical problems in the field of physics.
Dirac operators play an important role in several domains of mathematics and physics, for example: index theory, elliptic pseudodifferential operators, electromagnetism, particle physics, and the representation theory of Lie groups. In this essentially self-contained work, the basic ideas underlying the concept of Dirac operators are explored. Starting with Clifford algebras and the fundamentals of differential geometry, the text focuses on two main properties, namely, conformal invariance, which determines the local behavior of the operator, and the unique continuation property dominating its global behavior. Spin groups and spinor bundles are covered, as well as the relations with their classical counterparts, orthogonal groups and Clifford bundles. The chapters on Clifford algebras and the fundamentals of differential geometry can be used as an introduction to the above topics, and are suitable for senior undergraduate and graduate students. The other chapters are also accessible at this level so that this text requires very little previous knowledge of the domains covered. The reader will benefit, however, from some knowledge of complex analysis, which gives the simplest example of a Dirac operator. More advanced readers---mathematical physicists, physicists and mathematicians from diverse areas---will appreciate the fresh approach to the theory as well as the new results on boundary value theory.
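To make the remark about complex analysis concrete (a standard illustration, not an excerpt from the book), the Cauchy-Riemann operator is the simplest Dirac operator: it is a first-order square root of the Laplacian,

\[
D = \partial_x + i\,\partial_y, \qquad \bar{D}\,D = (\partial_x - i\,\partial_y)(\partial_x + i\,\partial_y) = \partial_x^2 + \partial_y^2 = \Delta,
\]

and the solutions of \(Df = 0\) are exactly the holomorphic functions (the Cauchy-Riemann equations).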
This volume shares and makes accessible new research lines and recent results in several branches of theoretical and mathematical physics, among them Quantum Optics, Coherent States, Integrable Systems, SUSY Quantum Mechanics, and Mathematical Methods in Physics. In addition to a selection of the contributions presented at the "6th International Workshop on New Challenges in Quantum Mechanics: Integrability and Supersymmetry", held in Valladolid, Spain, 27-30 June 2017, several high-quality contributions from other authors are also included. The conference gathered 60 participants from many countries working in different fields of Theoretical Physics, and was dedicated to Prof. Veronique Hussin, an internationally recognized expert in many branches of Mathematical Physics who has been making remarkable contributions to this field since the 1980s. The reader will find interesting reviews on the main topics from internationally recognized experts in each field, as well as other original contributions, all of which deal with recent applications or discoveries in the aforementioned areas.
This self-contained monograph presents an overview of fuzzy operator theory in mathematical analysis. Concepts, principles, methods, techniques, and applications of fuzzy operator theory are unified in this book to provide an introduction to graduate students and researchers in mathematics, applied sciences, physics, engineering, optimization, and operations research. New approaches to fuzzy operator theory and fixed point theory with applications to fuzzy metric spaces, fuzzy normed spaces, partially ordered fuzzy metric spaces, fuzzy normed algebras, and non-Archimedean fuzzy metric spaces are presented. Surveys are provided on: Basic theory of fuzzy metric and normed spaces and its topology, fuzzy normed and Banach spaces, linear operators, fundamental theorems (open mapping and closed graph), applications of contractions and fixed point theory, approximation theory and best proximity theory, fuzzy metric type space, topology and applications.
This book introduces the reader to solving partial differential equations (PDEs) numerically using element-based Galerkin methods. Although it draws on a solid theoretical foundation (e.g. the theory of interpolation, numerical integration, and function spaces), the book's main focus is on how to build the method, what the resulting matrices look like, and how to write algorithms for coding Galerkin methods. In addition, the spotlight is on tensor-product bases, which means that only line elements (in one dimension), quadrilateral elements (in two dimensions), and cubes (in three dimensions) are considered. The types of Galerkin methods covered are: continuous Galerkin methods (i.e., finite/spectral elements), discontinuous Galerkin methods, and hybridized discontinuous Galerkin methods using both nodal and modal basis functions. In addition, examples are included (which can also serve as student projects) for solving hyperbolic and elliptic partial differential equations, including both scalar PDEs and systems of equations.
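As a minimal sketch of the element-by-element assembly the blurb refers to (illustrative only, not code from the book; the function names and discretization choices are this sketch's own), the following solves -u'' = f on the unit interval with linear line elements and homogeneous Dirichlet boundary conditions:

```python
# A minimal 1D continuous Galerkin (linear finite element) sketch for
# -u'' = f on [0, 1] with u(0) = u(1) = 0, assembled element by element.
import numpy as np

def solve_poisson_1d(f, n_elements=32):
    n_nodes = n_elements + 1
    x = np.linspace(0.0, 1.0, n_nodes)
    h = x[1] - x[0]

    K = np.zeros((n_nodes, n_nodes))   # global stiffness matrix
    b = np.zeros(n_nodes)              # global load vector

    # Local stiffness matrix of one line element with linear basis functions.
    k_local = (1.0 / h) * np.array([[1.0, -1.0],
                                    [-1.0, 1.0]])

    for e in range(n_elements):
        nodes = [e, e + 1]
        # Scatter the local stiffness contribution into the global matrix.
        for a in range(2):
            for c in range(2):
                K[nodes[a], nodes[c]] += k_local[a, c]
        # Trapezoidal-rule load: each element node receives f(x_node) * h / 2.
        for a in range(2):
            b[nodes[a]] += f(x[nodes[a]]) * h / 2.0

    # Impose homogeneous Dirichlet conditions by restricting to interior nodes.
    interior = slice(1, -1)
    u = np.zeros(n_nodes)
    u[interior] = np.linalg.solve(K[interior, interior], b[interior])
    return x, u

if __name__ == "__main__":
    # f = pi^2 sin(pi x) has the exact solution u = sin(pi x).
    x, u = solve_poisson_1d(lambda s: np.pi**2 * np.sin(np.pi * s))
    print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```

With 32 elements the maximum error against the exact solution sin(pi x) is roughly 10^-3, consistent with the second-order accuracy of linear elements.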
This book explores several key issues in beam phase space dynamics in plasma-based wakefield accelerators. It reveals the phase space dynamics of ionization-based injection methods by identifying two key phase mixing processes. Subsequently, the book proposes a two-color laser ionization injection scheme for generating high-quality beams, and assesses it using particle-in-cell (PIC) simulations. To eliminate emittance growth when the beam propagates between plasma accelerators and traditional accelerator components, a method using longitudinally tailored plasma structures as phase space matching components is proposed. Based on the aspects above, a preliminary design study on X-ray free-electron lasers driven by plasma accelerators is presented. Lastly, an important type of numerical noise, the numerical Cherenkov instabilities in particle-in-cell codes, is systematically studied.
What are the physical mechanisms that underlie the efficient generation and transfer of energy at the nanoscale? Nature seems to know the answer to this question, having optimised the process of photosynthesis in plants over millions of years of evolution. It is conceivable that humans could mimic this process using synthetic materials, and organic semiconductors have attracted a lot of attention in this respect. Once an organic semiconductor absorbs light, bound pairs of electrons with positively charged holes, termed 'excitons', are formed. Excitons behave as fundamental energy carriers, hence understanding the physics behind their efficient generation and transfer is critical to realising the potential of organic semiconductors for light-harvesting and other applications, such as LEDs and transistors. However, this problem is extremely challenging since excitons can interact very strongly with photons. Moreover, simultaneously with the exciton motion, organic molecules can vibrate in hundreds of possible ways, having a very strong effect on energy transfer. The description of these complex phenomena is often beyond the reach of standard quantum mechanical methods which rely on the assumption of weak interactions between excitons, photons and vibrations. In this thesis, Antonios Alvertis addresses this problem through the development and application of a variety of different theoretical methods to the description of these strong interactions, providing pedagogical explanations of the underlying physics. A comprehensive introduction to organic semiconductors is followed by a review of the background theory that is employed to approach the relevant research questions, and the theoretical results are presented in close connection with experiment, yielding valuable insights for experimentalists and theoreticians alike.
This book studies the vulnerability of wireless communications under line-of-sight (LoS) and non-LoS correlated fading environments. The authors provide theoretical and practical physical layer security (PLS) analyses for several technologies and networks, such as fifth-generation (5G) networks, Internet of Things (IoT) applications, and non-orthogonal multiple access (NOMA), covering various practical scenarios and developing the theory needed to validate the proposed applications. The book presents PLS under correlated fading environments, 5G wireless networks, and NOMA networks; provides end-to-end analyses of the combined effects of channel correlation and outdated CSI on PLS; and includes contributions on PLS research written by global experts in academia and industry.
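As a point of reference for the metrics used in such analyses (a textbook relation for the Gaussian wiretap channel, not a result from this book), the secrecy capacity with legitimate-link SNR \(\gamma_B\) and eavesdropper SNR \(\gamma_E\) is

\[
C_s = \bigl[\log_2(1+\gamma_B) - \log_2(1+\gamma_E)\bigr]^{+}, \qquad [x]^{+} = \max(x, 0),
\]

and correlated fading and outdated CSI enter the analysis through the joint statistics of \(\gamma_B\) and \(\gamma_E\).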
This book proposes a number of promising models and methods for adaptive segmentation, swarm partition, permissible segmentation, and transform properties, as well as techniques for spatio-temporal video segmentation and interpretation, online fuzzy clustering of data streams, and fuzzy systems for information retrieval. The main focus is on the spatio-temporal segmentation of visual information. Sets of meaningful and manageable image or video parts, defined by visual interest or attention to higher-level semantic issues, are often vital to the efficient and effective processing and interpretation of viewable information. Developing robust methods for spatial and temporal partition represents a key challenge in computer vision and computational intelligence as a whole. This book is intended for students and researchers in the fields of machine learning and artificial intelligence, especially those whose work involves image processing and recognition, video parsing, and content-based image/video retrieval.
New Edition of a Classic Guide to Statistical Applications in the Biomedical Sciences. In the last decade, there have been significant changes in the way statistics is incorporated into biostatistical, medical, and public health research. Addressing the need for a modernized treatment of these statistical applications, Basic Statistics, Fourth Edition presents relevant, up-to-date coverage of research methodology using careful explanations of basic statistics and how they are used to address practical problems that arise in the medical and public health settings. Through concise and easy-to-follow presentations, readers will learn to interpret and examine data by applying common statistical tools, such as sampling, random assignment, and survival analysis. Continuing the tradition of its predecessor, this new edition outlines a thorough discussion of different kinds of studies and guides readers through the important, related decision-making processes such as determining what information is needed and planning the collection process. The book equips readers with the knowledge to carry out these practices by explaining the various types of studies that are commonly conducted in the fields of medicine and public health, and how the level of evidence varies depending on the area of research. Data screening and data entry into statistical programs are explained and accompanied by illustrations of statistical analyses and graphs. Additional features of the Fourth Edition include: a new chapter on data collection that outlines the initial steps in planning biomedical and public health studies; a new chapter on nonparametric statistics that includes a discussion and application of the Sign test, the Wilcoxon Signed Rank test, and the Wilcoxon Rank Sum test and its relationship to the Mann-Whitney U test; an updated introduction to survival analysis that includes the Kaplan-Meier method for graphing the survival function and a brief introduction to tests for comparing survival functions; the incorporation of modern statistical software, such as SAS, Stata, SPSS, and Minitab, into the presented discussion of data analysis; and updated references at the end of each chapter. Basic Statistics, Fourth Edition is an ideal book for courses on biostatistics, medicine, and public health at the upper-undergraduate and graduate levels. It is also appropriate as a reference for researchers and practitioners who would like to refresh their fundamental understanding of statistical techniques.
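As a quick illustration of the nonparametric tests listed above (not an example from the book; the data below are simulated purely for demonstration), the same comparisons can be run in a few lines of Python with SciPy:

```python
# Minimal sketch of the Wilcoxon signed rank and Mann-Whitney U tests
# on simulated paired and independent samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
before = rng.normal(120, 10, size=30)          # e.g. a measurement before treatment
after = before - rng.normal(5, 8, size=30)     # paired measurement after treatment
group_a = rng.normal(50, 12, size=25)          # two independent groups
group_b = rng.normal(55, 12, size=25)

# Paired data: Wilcoxon signed rank test on the differences.
w_stat, w_p = stats.wilcoxon(before, after)

# Independent groups: Mann-Whitney U test (equivalent to the Wilcoxon rank sum test).
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

print(f"Wilcoxon signed rank: W = {w_stat:.1f}, p = {w_p:.3f}")
print(f"Mann-Whitney U:       U = {u_stat:.1f}, p = {u_p:.3f}")
```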
This book demonstrates that different rudder configurations have different hydrodynamic characteristics, which are influenced by the profile, the parameters, and the specific configuration. The author proposes new regression formulas to help naval architects quickly estimate the rudder-induced forces and moments in maneuvering. Furthermore, the author proposes and validates an integrated maneuvering model for both seagoing ships and inland vessels. Using the proposed regression formulas and maneuvering model, the specific impacts of rudder configurations on inland vessel maneuverability are studied. In turn, the book demonstrates the application of Reynolds-Averaged Navier-Stokes (RANS) simulations to obtain rudder hydrodynamic characteristics, and the integration of the RANS results into maneuvering models as an accurate estimation of rudder forces and moments needed to quantify the impacts of rudder configurations on ships' maneuvering performance. In addition, the author proposes new criteria for the prediction and evaluation of inland vessel maneuverability. Simulations of ships with various rudder configurations are presented, in order to analyze the impacts of rudder configurations on ship maneuverability in different classic and proposed test maneuvers. Offering essential guidance on the effects of rudders for inland vessel maneuverability, and helping practical engineers make informed design choices, the book is of interest to researchers and academics in the field of naval engineering, as well as students of naval architecture. Industrial practitioners working on ship design may also find it beneficial.
This textbook treats graph colouring as an algorithmic problem, with a strong emphasis on practical applications. The author describes and analyses some of the best-known algorithms for colouring graphs, focusing on whether these heuristics can provide optimal solutions in some cases; how they perform on graphs where the chromatic number is unknown; and whether they can produce better solutions than other algorithms for certain types of graphs, and why. The introductory chapters explain graph colouring, complexity theory, bounds and constructive algorithms. The author then shows how advanced graph colouring techniques can be applied to classic real-world operational research problems such as designing seating plans, sports scheduling, and university timetabling. He includes many examples, suggestions for further reading, and historical notes, and the book is supplemented by an online suite of downloadable code. The book is of value to researchers, graduate students, and practitioners in the areas of operations research, theoretical computer science, optimization, and computational intelligence. The reader should have elementary knowledge of sets, matrices, and enumerative combinatorics.
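For readers who want a feel for the constructive algorithms mentioned above, here is a minimal first-fit greedy colouring sketch (illustrative only, not the author's published code or the book's downloadable suite):

```python
# First-fit greedy colouring: assign each vertex the smallest colour
# not already used by one of its coloured neighbours.
def greedy_colouring(adjacency):
    """adjacency: dict mapping each vertex to an iterable of its neighbours."""
    colour = {}
    for v in adjacency:                      # colour vertices in the given order
        used = {colour[u] for u in adjacency[v] if u in colour}
        c = 0
        while c in used:                     # smallest colour unused by neighbours
            c += 1
        colour[v] = c
    return colour

if __name__ == "__main__":
    # A 5-cycle needs 3 colours; the heuristic finds a proper colouring.
    c5 = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
    print(greedy_colouring(c5))
```

The number of colours such a heuristic uses depends strongly on the vertex ordering, which is exactly the kind of behaviour the book analyses.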
This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author's view, the objective of predictive modeling is to extract "best estimate" values for model parameters and predicted results, together with "best estimate" uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information. The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen, "cost functional" (usually a quadratic functional that represents the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) Predictive Modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle to eliminate the need for relying on minimizing user-chosen functionals, thus generalizing the "data adjustment" and/or the "4D-VAR" data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a "model validation metric" which quantifies the consistency (agreement/disagreement) between measurements and computations. This "model validation metric" (or "consistency indicator") is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters. Traditional methods for computing response sensitivities are hampered by the "curse of dimensionality," which makes them impractical for applications to large-scale systems that involve many imprecisely known parameters. Reducing the computational effort required for precisely calculating the response sensitivities is paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as shown in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for predicted parameters and responses along with best-estimate reduced uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated, "best estimate" predictive modeling tools for designing new technologies and facilities, while also improving on existing ones.
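For orientation, the user-chosen quadratic "cost functional" referred to above typically takes the generic data-adjustment/3D-VAR form (a standard expression, not the author's BERRU formulation, which replaces this minimization by the maximum entropy principle):

\[
J(\boldsymbol{\alpha}) = \tfrac{1}{2}\,(\boldsymbol{\alpha}-\boldsymbol{\alpha}_0)^{\mathsf T}\mathbf{C}_{\alpha}^{-1}(\boldsymbol{\alpha}-\boldsymbol{\alpha}_0) + \tfrac{1}{2}\,\bigl(\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_m\bigr)^{\mathsf T}\mathbf{C}_{r}^{-1}\bigl(\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_m\bigr),
\]

where \(\boldsymbol{\alpha}\) denotes the model parameters with prior values \(\boldsymbol{\alpha}_0\) and covariance \(\mathbf{C}_{\alpha}\), and \(\mathbf{r}(\boldsymbol{\alpha})\) and \(\mathbf{r}_m\) are the computed and measured responses with measurement covariance \(\mathbf{C}_{r}\).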
Chemical Modelling: Applications and Theory comprises critical literature reviews of all aspects of molecular modelling. Molecular modelling in this context refers to modelling the structure, properties and reactions of atoms, molecules and materials. Each chapter provides a selective review of recent literature, incorporating sufficient historical perspective for the non-specialist to gain an understanding. With chemical modelling covering such a wide range of subjects, this Specialist Periodical Report serves as the first port of call to any chemist, biochemist, materials scientist or molecular physicist needing to acquaint themselves with major developments in the area.
This multi-volume handbook is the most up-to-date and comprehensive reference work in the field of fractional calculus and its numerous applications. This sixth volume collects authoritative chapters covering several applications of fractional calculus in control theory, including fractional controllers, design methods and toolboxes, and a large number of engineering applications of control.
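For readers new to the area, the prototypical controller treated in this literature is the fractional-order PI^λD^μ controller, whose transfer function generalizes the classical PID (a standard definition, not specific to this volume):

\[
C(s) = K_p + K_i\, s^{-\lambda} + K_d\, s^{\mu}, \qquad \lambda,\ \mu > 0,
\]

with the classical PID recovered for \(\lambda = \mu = 1\); the fractional orders \(\lambda\) and \(\mu\) are the additional tuning degrees of freedom that fractional design methods exploit.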
This book addresses the concepts of unstable flow solutions, convective instability and absolute instability, with reference to simple (or toy) mathematical models, which remain mathematically tractable despite their purely abstract character. Within this paradigm, the book introduces the basic mathematical tools, Fourier transform, normal modes, wavepackets and their dynamics, before reviewing the fundamental ideas behind the mathematical modelling of fluid flow and heat transfer in porous media. The author goes on to discuss the fundamentals of the Rayleigh-Bénard instability and other thermal instabilities of convective flows in porous media, and then analyses various examples of the transition from convective to absolute instability in detail, with an emphasis on the formulation, the deduction of the dispersion relation, and the study of the numerical data regarding the threshold of absolute instability. The clear descriptions of the analytical and numerical methods needed to obtain these parametric threshold data enable readers to apply them in different or more general cases. This book is of interest to postgraduates and researchers in mechanical and thermal engineering, civil engineering, geophysics, applied mathematics, fluid mechanics, and energy technology.
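As a compact reminder of the machinery involved (standard definitions, not results quoted from the book), normal-mode perturbations proportional to \(e^{i(kx-\omega t)}\) satisfy a dispersion relation \(D(k,\omega)=0\); the base flow is convectively unstable when \(\operatorname{Im}\omega(k) > 0\) for some real wavenumber \(k\), and absolutely unstable when the saddle point \(k_0\) selected by the zero-group-velocity (pinching) condition yields a growing response,

\[
\left.\frac{d\omega}{dk}\right|_{k_0} = 0, \qquad \operatorname{Im}\omega(k_0) > 0 .
\]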
Since 1950, the Highway Capacity Manual has been a standard used in the planning, design, analysis, and operation of virtually any highway traffic facility in the United States. It has also been widely used around the globe and has inspired the development of similar manuals in other countries. This book is Volume II of a series on the conceptual and research origins of the methodologies found in the Highway Capacity Manual. It focuses on the most complex points in a traffic system: signalized and unsignalized intersections, and the concepts and methodologies developed over the years to model their operations. It also includes an overview of the fundamental concepts of capacity and level of service, particularly as applied to intersections. The historical roots of the manual and its contents are important to understanding current methodologies, and improving them in the future. As such, this book is a valuable resource for current and future users of the Highway Capacity Manual, as well as researchers and developers involved in advancing the state-of-the-art in the field.
The third edition of this by now classic reference on the rigorous analysis of symmetry breaking in both classical and quantum field theories adds new topics of relevance, in particular the effect of dynamical Coulomb delocalization, by which boundary conditions give rise to volume effects and to an energy/mass gap in the Goldstone spectrum (plasmon spectrum, Anderson superconductivity, Higgs phenomenon). The book closes with a discussion of the physical meaning of global and local gauge symmetries and their breaking, with attention to the effect of gauge group topology in QCD. From the reviews of the first edition: "It is remarkable to see how much material can actually be presented in a rigorous way (incidentally, many of the results presented are due to Strocchi himself), yet this is largely ignored, the original heuristic derivations being, as a rule, more popular. [...] At each step he strongly emphasizes the physical meaning and motivation of the various notions introduced [...] a book that fills a conspicuous gap in the literature, and does it rather well. It could also be a good basis for a graduate course in mathematical physics." (J.-P. Antoine, Physicalia 28/2, 2006) "Despite many accounts in popular textbooks and a widespread belief, the phenomenon is rather subtle, requires an infinite set of degrees of freedom and an advanced mathematical setting of the system under investigation. [...] The mathematically oriented graduate student will certainly benefit from this thorough, rigorous and detailed investigation." (G. Roepstorff, Zentralblatt MATH, Vol. 1075, 2006) From the reviews of the second edition: "This second edition of Strocchi's Symmetry Breaking presents a complete, generalized and highly rigorous discussion of the subject, based on a formal analysis of conditions necessary for the mechanism of spontaneous symmetry breaking to occur in classical systems, as well as in quantum systems. [...] This book is specifically recommended for mathematical physicists interested in a deeper and rigorous understanding of the subject, and it should be mandatory for researchers studying the mechanism of spontaneous symmetry breaking." (S. Hajjawi, Mathematical Reviews, 2008)
This book uses art photography as a point of departure for learning about physics, while also using physics as a point of departure for asking fundamental questions about the nature of photography as an art. Although not a how-to manual, the topics center around hands-on applications, sometimes illustrated by photographic processes that are inexpensive and easily accessible to students (including a versatile new process developed by the author, and first described in print in this series). A central theme is the connection between the physical interaction of light and matter on the one hand, and the artistry of the photographic processes and their results on the other. One half of Energy and Color focuses on the physics of energy, power, illuminance, and intensity of light, and how these relate to the photographic exposure, including a detailed example that follows the emission of light from the sun all the way through to the formation of the image in the camera. These concepts are described both in the traditional manner and using very-low-sensitivity photography as an example, which brings the physical concepts to the fore in a visible way, whereas they are often hidden with ordinary high-speed photographic detectors. Energy and Color also considers color in terms of the spectrum of light, how it interacts with the subject, and how the camera's light detector interacts with the image focused upon it. But of equal concern are the only partially understood and sometimes unexpected ways in which the human eye/brain interprets this spectral stimulus as color. The volume covers basic photographic subjects such as shutter, aperture, ISO, metering and exposure value, but also, given their relation to the larger themes of the book, less familiar topics such as the Jones-Condit equation, Lambertian versus isotropic reflections, reflection and response curves, and the opponent-process model of color perception. Although written at a beginning undergraduate level, the topics are chosen for their role in a more general discussion of the relation between science and art that is of interest to readers of all backgrounds and levels of expertise.
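As one concrete piece of the exposure bookkeeping mentioned above (a standard photographic relation, not the book's own derivation), the exposure value combines the f-number \(N\) and the shutter time \(t\) in seconds as

\[
\mathrm{EV} = \log_2\!\frac{N^2}{t},
\]

so that, for example, f/8 at 1/125 s gives \(\mathrm{EV} = \log_2(64 \times 125) \approx 13\); metering then ties this number to scene luminance and ISO through the meter's calibration constant.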
This book analyzes a range of new developments in various fields concerning the concepts of chaos and complexity theory. The proceedings of the 7th International Symposium on Chaos, Complexity and Leadership feature newly developed concepts involving various research methodologies for identifying chaos and complexity in different fields of the sciences and leadership. In addition, the book explores chaotic and complex systems from all fields of knowledge in order to argue for the broad compatibility between these fields. Particular emphasis is placed on exploring non-linearity in order to open a discussion on new approaches to and perspectives on chaos, complexity and leadership. Readers will find coverage of important events that have recently taken place in our world, regardless of whether they were social, political, economic or scientific in nature. The book explores diverse aspects of and issues related to the effects of chaos and complexity in the world; discusses the application of nonlinear dynamics in order to arrive at transformational policies; and offers projections of tomorrow's world using an interdisciplinary approach. Though primarily intended for readers with an interest in nonlinear science, thanks to its focus on the application of chaos and complexity to other disciplines, the book appeals to a broad readership.
This book presents a generalised computational model for the degradation of resorbable composites, using analytic expressions to represent the interwoven phenomena present during degradation. It then combines this modelling framework with a comprehensive database of quantitative degradation data mined from existing literature and from novel experiments, to provide new insights into the interrelated factors controlling degradation. Resorbable composites made of biodegradable polyesters and calcium-based ceramics have significant therapeutic potential as tissue engineering scaffolds, as temporary implants and as drug-loaded matrices for controlled release. However, their degradation is complex and the rate of resorption depends on multiple connected factors such as the shape and size of the device, polymer chemistry and molecular weight, particle phase, size, volume fraction, distribution and pH-dependent dissolution properties. Understanding and ultimately predicting the degradation of resorbable composites is of central importance if we are to fully unlock the promise of these materials.
This book focuses on the calculus of variations, including fundamental theories and applications. This textbook is intended for graduate and higher-level college and university students, introducing them to the basic concepts and calculation methods used in the calculus of variations. It covers the preliminaries, variational problems with fixed boundaries, sufficient conditions of extrema of functionals, problems with undetermined boundaries, variational problems of conditional extrema, variational problems in parametric forms, variational principles, direct methods for variational problems, variational principles in mechanics and their applications, and variational problems of functionals with vector, tensor and Hamiltonian operators. Many of the contributions are based on the authors' research, addressing topics such as the extension of the connotation of the Hilbert adjoint operator, definitions of the other three kinds of adjoint operators, the extremum function theorem of the complete functional, unified Euler equations in variational methods, variational theories of functionals with vectors, modulus of vectors, arbitrary order tensors, Hamiltonian operators and Hamiltonian operator strings, reconciling the Euler equations and the natural boundary conditions, and the application range of variational methods. The book is also a valuable reference resource for teachers as well as science and technology professionals.
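For orientation, the simplest fixed-boundary problem the book starts from leads to the Euler equation mentioned above (a standard result, stated here for reference):

\[
J[y] = \int_{x_0}^{x_1} F\bigl(x, y(x), y'(x)\bigr)\,dx, \qquad \delta J = 0 \;\Longrightarrow\; \frac{\partial F}{\partial y} - \frac{d}{dx}\frac{\partial F}{\partial y'} = 0,
\]

with natural boundary conditions \(\partial F/\partial y' = 0\) arising at any endpoint where \(y\) is left free.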
Blast Mitigation: Experimental and Numerical Studies covers both experimental and numerical aspects of material and structural response to dynamic blast loads and its mitigation. The authors present the most up-to-date understanding from laboratory studies and computational analysis for researchers working in the field of blast loadings and their effects on material and structural failure; develop designs for lighter and highly efficient structural members for blast energy absorption; discuss the vulnerability of underground structures; present methods for dampening blast overpressures; discuss structural post-blast collapse; and address underwater explosion and implosion effects on submerged infrastructure, along with mitigation measures for this environment.
This book focuses on theoretical aspects of dynamical systems in the broadest sense. It highlights novel and relevant results on mathematical and numerical problems that can be found in the fields of applied mathematics, physics, mechanics, engineering and the life sciences. The book consists of contributed research chapters addressing a diverse range of problems. The issues discussed include (among others): numerical-analytical algorithms for nonlinear optimal control problems on a large time interval; gravity waves in a reservoir with an uneven bottom; value distribution and growth of solutions for certain Painlevé equations; optimal control of hybrid systems with sliding modes; a mathematical model of the two types of atrioventricular nodal reentrant tachycardia; non-conservative instability of cantilevered nanotubes using the Cell Discretization Method; dynamic analysis of a compliant tensegrity structure for use in a gripper application; and Jeffcott rotor bifurcation behavior using various models of hydrodynamic bearings.
You may like...
Mathematics For Engineering Students by Ramoshweu Solomon Lebelo, Radley Kebarapetse Mahlobo (Paperback, R397)
Dark Silicon and Future On-chip Systems… by Suyel Namasudra, Hamid Sarbazi-Azad (Hardcover, R4,186)
Stochastic Analysis of Mixed Fractional… by Yuliya Mishura, Mounir Zili (Hardcover)
Infinite Words, Volume 141 - Automata… by Dominique Perrin, Jean-Eric Pin (Hardcover, R4,319)