This monograph explains the theory of quantum waveguides, that is, dynamics of quantum particles confined to regions in the form of tubes, layers, networks, etc. The focus is on relations between the confinement geometry on the one hand and the spectral and scattering properties of the corresponding quantum Hamiltonians on the other. Perturbations of such operators, in particular, by external fields are also considered. The volume provides a unique summary of twenty-five years of research activity in this area and indicates ways in which the theory can develop further. The book is fairly self-contained. While it requires some broader mathematical physics background, all the basic concepts are properly explained and proofs of most theorems are given in detail, so there is no need for additional sources. Without a parallel in the literature, the monograph by Exner and Kovarik guides the reader through this new and exciting field.
This book is based on the author's work on the Double Chooz experiment from 2010 to 2013, the goal of which was to search for electron antineutrino disappearance near nuclear power plant facilities as a result of neutrino oscillation. Starting with a brief review of neutrino oscillation and the most important past experimental findings in this field, the author subsequently provides a full and detailed description of a neutrino detector, from simulation aspects to detection principles, as well as the data analysis procedure used to extract the oscillation parameters. The main results in this book are 1) an improvement in the uncertainty of the mixing angle θ13, obtained by combining two data sets from neutrino event selection: neutron capture on gadolinium and on hydrogen; and 2) the first measurement of the effective squared mass difference, obtained by combining the current reactor neutrino experimental data from Daya Bay, Double Chooz and RENO and taking advantage of their different reactor-to-detector distances. The author explains how these methods of combining data can be used to estimate these two values. Each method yields the best possible sensitivity to the oscillation parameters for reactor neutrinos, and both can be used as standard methods on the latest data releases from the current experiments.
This book is a collection of problems that are intended to aid students in graduate and undergraduate courses in Classical and Quantum Physics. It is also intended to be a study aid for students who are preparing for the PhD qualifying exam. Many of the included problems are of a type that could be on a qualifying exam. Others are meant to elucidate important concepts. Unlike other compilations of problems, the detailed solutions are often accompanied by discussions that reach beyond the specific problem. The solution of a problem is only the beginning of the learning process: it is by manipulating the solution and changing the parameters that a great deal of insight can be gleaned. The authors refer to this technique as "massaging the problem," and it is an approach that the authors feel increases the pedagogical value of any problem.
The book considers foundational thinking in quantum theory, focusing on the role of fundamental principles and principle thinking there, including thinking that leads to the invention of new principles, which is, the book contends, one of the ultimate achievements of theoretical thinking in physics and beyond. The focus on principles, prominent during the rise and in the immediate aftermath of quantum theory, has been uncommon in more recent discussions and debates concerning it. The book argues, however, that exploring the fundamental principles and principle thinking is exceptionally helpful in addressing the key issues at stake in quantum foundations and the seemingly interminable debates concerning them. Principle thinking led to major breakthroughs throughout the history of quantum theory, beginning with the old quantum theory and quantum mechanics, the first definitive quantum theory, which it remains within its proper (nonrelativistic) scope. It has, the book also argues, been equally important in quantum field theory, which has been the frontier of quantum theory for quite a while now, and, more recently, in quantum information theory, where principle thinking has been given new prominence. This approach allows the book to develop a new understanding of both the history and the philosophy of quantum theory, from Planck's quantum to the Higgs boson and beyond, and of the thinking of the key founding figures, such as Einstein, Bohr, Heisenberg, Schroedinger, and Dirac, as well as of some more recent theorists. The book also extensively considers the nature of quantum probability, and contains a new interpretation of quantum mechanics, "the statistical Copenhagen interpretation."
Overall, the book's argument is guided by what Heisenberg called "the spirit of Copenhagen," which is defined by three great divorces from the preceding foundational thinking in physics: reality from realism, probability from causality, and locality from relativity. The book defines the fundamental principles of quantum theory accordingly.
This text presents the two complementary aspects of thermal physics as an integrated theory of the properties of matter. Conceptual understanding is promoted by thorough development of basic concepts. In contrast to many texts, statistical mechanics, including a discussion of the required probability theory, is presented first. This provides a statistical foundation for the concept of entropy, which is central to thermal physics. A unique feature of the book is the development of entropy based on Boltzmann's 1877 definition; this avoids the contradictions or ad hoc corrections found in other texts. Detailed fundamentals provide a natural grounding for advanced topics, such as black-body radiation and quantum gases. An extensive set of problems (solutions are available to lecturers through the OUP website), many including explicit computations, advances the core content by probing essential concepts. The text is designed for a two-semester undergraduate course but can be adapted for one-semester courses emphasizing either aspect of thermal physics. It is also suitable for graduate study.
This thesis is based on the first data from the Large Hadron Collider (LHC) at CERN. Its theme can be described as the classical Rutherford scattering experiment adapted to the LHC: measurement of scattering angles to search for new physics and substructure. At the LHC, colliding quarks and gluons exit the proton collisions as collimated particle showers, or jets. The thesis presents studies of the scattering angles of these jets. It includes a phenomenological study at the LHC design energy of 14 TeV, where a model of so-called large extra dimensions is used as a benchmark process for the sensitivity to new physics. The experimental result is the first measurement, made in 2010, by ATLAS, operating at the LHC start-up energy of 7 TeV. The result is compatible with the Standard Model and demonstrates how well the physics and the apparatus are understood. The first data is a tiny fraction of what will be accumulated in the coming years, and this study has set the stage for performing these measurements with confidence as the LHC accumulates luminosity and increases its energy, thereby probing smaller length scales.
The invention of the semiconductor laser, along with silica glass fiber, has enabled an incredible revolution in the global communication infrastructure, of direct benefit to all. Development of devices and system concepts that exploit the same fundamental light-matter interaction continues. Researchers and technologists are pursuing a broad range of emerging applications, from automobile collision avoidance to secure quantum key distribution. This book sets out to summarize key aspects of semiconductor laser device physics and the principles of laser operation. It provides a convenient reference and the essential knowledge to be understood before exploring more sophisticated device concepts. The contents serve as a foundation for scientists and engineers, without the need to invest in specialized detailed study. Supplementary material in the form of MATLAB code is available for the numerically generated figures.
This book presents two analyses, the first of which involves the search for a new heavy charged gauge boson, a so-called W' boson. This new gauge boson is predicted by some theories that extend the Standard Model gauge group in order to solve some of its conceptual problems. Decays of the W' boson into final states with a charged lepton (l± = e±, μ±) and the corresponding (anti-)neutrino are considered. Data collected by the ATLAS experiment in 2015 at a center-of-mass energy of √s = 13 TeV is used for the analysis. In turn, the second analysis presents a measurement of the double-differential cross section of the process pp -> Z/γ* + X -> l+l− + X, including a γγ-induced contribution, at a center-of-mass energy of √s = 8 TeV. The measurement is performed in an invariant mass region of 116 GeV to 1500 GeV, as a function of invariant mass and absolute rapidity of the l+l− pair, and as a function of invariant mass and pseudorapidity separation of the l+l− pair. The data analyzed were recorded by the ATLAS experiment in 2012 and correspond to an integrated luminosity of 20.3/fb. The measured cross sections are expected to be sensitive to the PDFs at very high values of the Bjorken-x scaling variable, and to the photon structure of the proton.
This highly interdisciplinary thesis covers a wide range of topics relating to the interface of cold atoms, quantum simulation, quantum magnetism and disorder. With a self-contained presentation, it provides a broad overview of the rapidly evolving area of cold atoms and is of interest to both undergraduates and researchers working in the field. Starting with a general introduction to the physics of cold atoms and optical lattices, it extends the theory to systems with different multispecies atoms. It advances the theory of many-body quantum systems in excited bands (of optical lattices) through an extensive study of the properties of both the mean-field and strongly correlated regimes. Particular emphasis is given to the context of quantum simulation, where, as shown here, the orbital degree of freedom in excited bands allows the study of exotic models of magnetism not easily achievable with previously available alternative systems. In addition, it proposes a new model Hamiltonian that serves as a quantum simulator of various disordered systems in different symmetry classes and can easily be reproduced experimentally. This is of great interest, especially for the study of disorder in 2D quantum systems.
This Ph.D. thesis is a search for physics beyond the standard model (SM) of particle physics, which successfully describes the interactions and properties of all known elementary particles. However, no particle exists in the SM that can account for dark matter, which makes up about one quarter of the energy-mass content of the universe. Understanding the nature of dark matter is one goal of the CERN Large Hadron Collider (LHC). The extension of the SM with supersymmetry (SUSY) is considered a promising possibility for explaining dark matter. The thesis describes a search for SUSY using data collected by the CMS experiment at the LHC. It utilizes a final state consisting of a photon, a lepton, and a large momentum imbalance, probing a class of SUSY models that has not yet been studied extensively. The thesis stands out not only for its content, which is explained with clarity, but also because the author performed essentially all aspects of the analysis himself, from data skimming to limit calculations, which is extremely rare, especially nowadays in the large LHC collaborations.
Quantum physics started in the 1920s with wave mechanics and the wave-particle duality. However, the last 20 years have seen a second quantum revolution, centered around non-locality and quantum correlations between measurement outcomes. The associated key property, entanglement, is recognized today as the signature of quantumness. This second revolution opened the possibility of studying quantum correlations without any assumption on the internal functioning of the measurement apparatuses, the so-called Device-Independent Approach to Quantum Physics. This thesis explores this new approach using the powerful geometrical tool of polytopes. Emphasis is placed on the study of non-locality in the case of three or more parties, where it is shown that a whole new variety of phenomena appears compared to the bipartite case. Genuine multiparty entanglement is also studied for the first time within the device-independent framework. Finally, these tools are used to answer a long-standing open question: could quantum non-locality be explained by influences that propagate from one party to the others faster than light, but that remain hidden so that one cannot use them to communicate faster than light? This would provide a way around Einstein's notion of action at a distance that would be compatible with relativity. However, the answer is shown to be negative, as such influences could not remain hidden.
Quarks are the main constituents of protons and neutrons and hence are important building blocks of all the matter that surrounds us. However, quarks have the intriguing property that they never appear as isolated single particles but only in bound states. This phenomenon is called confinement and has been a central research topic of elementary particle physics for the last few decades. In order to find the mechanism that forbids the existence of free quarks, many approaches and ideas are being pursued, but by now it has become clear that they are not mutually exclusive but illuminate the problem from different perspectives. Two such confinement scenarios are investigated in this thesis. First, the importance of Abelian field components for the low-energy regime is corroborated, thus supporting the dual superconductor picture of confinement; second, the influence of the Gribov horizon on non-perturbative solutions is studied.
This work describes theoretical and experimental advances towards the realization of a hybrid quantum processor in which the collective degrees of freedom of an ensemble of spins in a crystal are used as a multi-qubit register for superconducting qubits. A memory protocol made of write, read and reset operations is first presented, followed by the demonstration of building blocks of its implementation with NV center spins in diamond. Qubit states are written by resonant absorption of a microwave photon in the spin ensemble and read out of the memory on-demand by applying Hahn echo refocusing techniques to the spins. The reset step is implemented in between two successive write-read sequences using optical repumping of the spins.
This thesis represents a decisive breakthrough in our understanding of the physics of universal quantum-mechanical three-body systems. The Efimov scenario is a prime example of how fundamental few-body physics features universally across seemingly disparate fields of modern quantum physics. Initially postulated for nuclear physics more than 40 years ago, the Efimov effect has now become a new research paradigm not only in ultracold atomic gases but also in molecular, biological and condensed matter systems. Despite much effort since its first observations, the scaling behavior, which is a hallmark property and often referred to as the "holy grail" of Efimov physics, remained hidden until recently. In this work, the author demonstrates this behavior for the first time for a heteronuclear mixture of ultracold Li and Cs atoms, and pioneers the experimental understanding of microscopic, non-universal properties in such systems. Based on the application of the Born-Oppenheimer approximation, well known from molecular physics textbooks, an exceptionally clear and intuitive picture of heteronuclear Efimov physics is revealed.
This book aims to provide advanced students and researchers with a text on a nonperturbative, thermodynamically grounded, and largely analytical approach to four-dimensional Quantum Gauge Theory. Terrestrial, astrophysical, and cosmological applications, mostly within the realm of low-temperature photon physics, are treated.
In this thesis, the main approach to the characterization of the set of classical probabilities, the correlation polytope approach, is reviewed for different scenarios, namely, hidden variable models discussed by Bell (local), Kochen and Specker (non-contextual), and Leggett and Garg (macrorealist). Computational difficulties associated with the method are described and a method to overcome them in several nontrivial cases is presented. For the quantum case, a general method to analyze quantum correlations in the sequential measurement scenario is provided, which allows computation of the maximal correlations. Such a method has a direct application for computation of maximal quantum violations of Leggett-Garg inequalities and it is relevant in the analysis of non-contextuality tests. Finally, possible applications of the results for quantum information tasks are discussed.
This marvelous book is aimed at strengthening the mathematical background and sharpening the mathematical tools of students who have not had rigorous training before taking the quantum mechanics course. The abstract construction of the quantum postulates in the framework of Hilbert space and Hermitian operators is realized in the q-representation to demonstrate the conventional approach to quantum theory. Symmetry properties are emphasized and extensively explored in this book, for continuous as well as discrete transformations. The space-time structure is discussed in depth, and the Dirac equation is formulated from symmetry considerations of the Lorentz group.
This textbook presents, in a concise and self-contained way, the advanced fundamental mathematical structures of quantum theory. It is based on lectures prepared for a six-month course for MSc students. The reader is introduced to the beautiful interconnection between logic, lattice theory, general probability theory, and general spectral theory, including the basic theory of von Neumann algebras and of the algebraic formulation, which arises naturally in the study of the mathematical machinery of quantum theories. Some general results concerning hidden-variable interpretations of QM, such as Gleason's and the Kochen-Specker theorems and the related notions of realism and non-contextuality, are carefully discussed, also in relation to the famous Bell (BCHSH) inequality concerning local causality. Written in a didactic style, this book includes many examples and solved exercises. The work is organized as follows. Chapter 1 reviews some elementary facts and properties of quantum systems. Chapters 2 and 3 present the main results of spectral analysis in complex Hilbert spaces. Chapter 4 introduces the point of view of the theory of orthomodular lattices. Quantum theory, from this perspective, turns out to be a probability measure theory on the non-Boolean lattice of elementary observables, and Gleason's theorem characterizes all these measures. Chapter 5 deals with some philosophical and interpretative aspects of quantum theory, such as hidden-variable formulations of QM. The Kochen-Specker theorem and its implications are analyzed, also in relation to the BCHSH inequality, entanglement, realism, locality, and non-contextuality. Chapter 6 focuses on the algebra of observables, also in the presence of superselection rules, introducing the notion of a von Neumann algebra. Chapter 7 offers the idea of (groups of) quantum symmetries, illustrated in particular in terms of the Wigner and Kadison theorems.
Chapter 8 deals with the elementary ideas and results of the so-called algebraic formulation of quantum theories, in terms of both *-algebras and C*-algebras. This book should appeal to a dual readership: on the one hand, mathematicians who wish to acquire the tools that unlock the physical aspects of quantum theories; on the other, physicists eager to solidify their understanding of the mathematical scaffolding of quantum theories.
Vortices comprising swirling motion of matter are observable in classical systems at all scales, ranging from atomic size to the scale of galaxies. In quantum mechanical systems, such vortices are robust entities whose behaviours are governed by the strict rules of topology. The physics of quantum vortices is pivotal to the basic science of quantum turbulence and high-temperature superconductors, and underpins emerging quantum technologies including topological quantum computation. This handbook aims to provide a dictionary-style portal to the fascinating quantum world of vortices.
"...The Multiversal book series is equally unique, providing book-length extensions of the lectures with enough additional depth for those who truly want to explore these fields, while also providing the kind of clarity that is appropriate for interested lay people to grasp the general principles involved." - Lawrence M. Krauss
Cosmic Update covers:
- A novel approach to uncovering the dark faces of the Standard Model of cosmology.
- The possibility that Dark Energy and Dark Matter are manifestations of the inhomogeneous geometry of our Universe.
- The history of cosmological model building and the general architecture of cosmological models.
- Illustrations of the Large Scale Structure of the Universe.
- A new perspective on the classical static Einstein Cosmos.
- Global properties of World Models, including their Topology.
- The Arrow of Time in a Universe with a Positive Cosmological Constant.
- The consequences of a fundamental Cosmological Constant for our Universe.
- Why the currently observed acceleration of the Universe may not be its final destiny.
- A demonstration that nature forbids the existence of a pure Cosmological Constant.
- Our current understanding of the long-term future of the Universe, on time scales that greatly exceed its current age.
- The long-term fate and eventual destruction of the astrophysical objects that populate the universe, including clusters, galaxies, stars, planets, and black holes.
The material is presented in layperson-friendly language, followed by additional technical sections that explain the basic equations and principles. This feature is very attractive to readers who want to learn more about the theories involved beyond the basic description. Multiversal Journeys is a trademark of Farzad Nekoogar and Multiversal Journeys, a 501(c)(3) nonprofit organization.
The book deals with quantum field theory, the language of the modern physics of elementary particles. Written on the basis of university lectures given by the author, the book provides treatments and technical details of quantum field theory that will be particularly useful for students. The book starts with the quantization of the most important kinds of free fields (the scalar, spin-1/2 and photon fields). This is followed by a detailed account of the symmetry properties of a field theory and a discussion of global and local symmetries and the spontaneous breaking of symmetries. Other topics discussed include perturbation theory, one-loop effects in quantum electrodynamics, and renormalization properties.
This book describes research in two different areas of state-of-the-art hadron collider physics, both of which are of central importance in the field of particle physics. The first part of the book focuses on the search for supersymmetric particles called gluinos. The book subsequently presents a set of precision measurements of "multi-jet" collision events, which involve large numbers of newly created particles, and are among the dominant processes at the Large Hadron Collider (LHC). Now that a Higgs boson has been discovered at the LHC, the existence (or non-existence) of supersymmetric particles is of the utmost interest and significance, both theoretically and experimentally. In addition, multi-jet collision events are an important background process for a wide range of analyses, including searches for supersymmetry.
In this thesis, the author explains the background of problems in quantum estimation, the necessary conditions required for estimation precision benchmarks that are applicable and meaningful for evaluating data in quantum information experiments, and provides examples of such benchmarks. The author develops mathematical methods in quantum estimation theory and analyzes the benchmarks in tests of Bell-type correlation and quantum tomography with those methods. Above all, a set of explicit formulae for evaluating the estimation precision in quantum tomography with finite data sets is derived, in contrast to the standard quantum estimation theory, which can deal only with infinite samples. This is the first result directly applicable to the evaluation of estimation errors in quantum tomography experiments, allowing experimentalists to guarantee estimation precision and verify quantitatively that their preparation is reliable.
This book presents a major step forward in experimentally understanding the behavior of muon neutrinos and antineutrinos. Apart from providing the world's first measurement of these interactions in a mostly unexplored energy region, the data presented advances the neutrino community's preparedness to search for an asymmetry between matter and anti-matter that may very well provide the physical mechanism for the existence of our universe. The details of these measurements are preceded by brief summaries of the history of the neutrino, the phenomenon of neutrino oscillations, and a description of their interactions. Also provided are details of the experimental setup for the measurements and the muon antineutrino cross-section measurement which motivates the need for dedicated in situ background constraints. The world's first measurement of the neutrino component of an antineutrino beam using a non-magnetized detector, as well as other crucial background constraints, are also presented in the book. By exploiting correlated systematic uncertainties, combined measurements of the muon neutrino and antineutrino cross sections described in the book maximize the precision of the extracted information from both results.
This book describes a promising approach to problems in the foundations of quantum mechanics, including the measurement problem. The dynamics of ensembles on configuration space is shown here to be a valuable tool for unifying the formalisms of classical and quantum mechanics, for deriving and extending the latter in various ways, and for addressing the quantum measurement problem. A description of physical systems by means of ensembles on configuration space can be introduced at a very fundamental level: the basic building blocks are a configuration space, probabilities, and Hamiltonian equations of motion for the probabilities. The formalism can describe both classical and quantum systems, and their thermodynamics, with the main difference being the choice of ensemble Hamiltonian. Furthermore, there is a natural way of introducing ensemble Hamiltonians that describe the evolution of hybrid systems; i.e., interacting systems that have distinct classical and quantum sectors, allowing for consistent descriptions of quantum systems interacting with classical measurement devices and quantum matter fields interacting gravitationally with a classical spacetime.