This thesis investigates ultracold molecules as a resource for novel quantum many-body physics, in particular by utilizing their rich internal structure and strong, long-range dipole-dipole interactions. In addition, numerical methods based on matrix product states are analyzed in detail, and general algorithms for investigating the static and dynamic properties of essentially arbitrary one-dimensional quantum many-body systems are put forth. Finally, this thesis covers open-source implementations of matrix product state algorithms, as well as educational material designed to aid in the use and understanding of such methods.
This thesis elucidates electron correlation effects in topological matter, whose electronic states exhibit nontrivial topological properties robust against small perturbations. In addition to a comprehensive introduction to topological matter, this thesis provides a new perspective on correlated topological matter. The book comprises three subjects, in which electron correlations in different forms are considered. The first focuses on Coulomb interactions for massless Dirac fermions. Using a perturbative approach, the author reveals emergent Lorentz invariance in the low-energy limit and discusses how to probe the Lorentz invariance experimentally. The second subject aims to show a principle for synthesizing topological insulators with common, light elements. The interplay between the spin-orbit interaction and electron correlation is considered, and Hund's rule and electron filling are consequently found to play a key role in producing the strong spin-orbit interaction important for topological insulators. The last subject is the classification of topological crystalline insulators in the presence of electron correlation. Unlike non-interacting topological insulators, such two- and three-dimensional correlated insulators with mirror symmetry are demonstrated to be characterized by the Z4 and Z8 groups, respectively, using the bosonization technique and a geometrical consideration.
This volume is the result of two international workshops: "Infinite Analysis 11: Frontier of Integrability", held at the University of Tokyo, Japan, from July 25 to 29, 2011, and "Symmetries, Integrable Systems and Representations", held at Universite Claude Bernard Lyon 1, France, from December 13 to 16, 2011. Included are research articles based on the talks presented at the workshops, the latest results obtained thereafter, and some review articles. The subjects discussed range across diverse areas such as algebraic geometry, combinatorics, differential equations, integrable systems, representation theory, solvable lattice models and special functions. Through these topics, the reader will find some recent developments in the field of mathematical physics and their interactions with several other domains.
This book gives a theoretical description of linear and nonlinear optical responses of matter with special emphasis on the microscopic and "nonlocal" nature of resonant response. The response field and induced polarization are determined self-consistently in terms of simultaneous linear or nonlinear polynomial equations. This scheme is a general one situated between QED and macroscopic response theory, but is most appropriate for determining the dependence of optical signals on the size, shape, and internal structure of a nanostructure sample. As a highlight of the scheme, the multi-resonant enhancement of the DFWM signal is described together with its experimental verification.
This textbook explains the experimental basics, effects and theory of nuclear physics. It supports learning and teaching with numerous worked examples, questions and problems with answers, and its many tables and diagrams help the reader better understand the explanations. Sketches of the historical development of nuclear physics give a better feel for the subject. The main topics of this book include the phenomena associated with the passage of charged particles and radiation through matter, nuclear resonance fluorescence and the Mössbauer effect, Gamow's theory of alpha decay, the Fermi theory of beta decay, electron capture, and gamma decay. The discussion of general properties of nuclei covers nuclear sizes and the nuclear force, nuclear spin, the magnetic dipole moment and the electric quadrupole moment. Nuclear instability against various modes of decay and the Yukawa theory are explained. Nuclear models such as the Fermi Gas Model, Shell Model, Liquid Drop Model, Collective Model and Optical Model are outlined to explain various experimental facts related to nuclear structure. Heavy-ion reactions, including nuclear fusion, are explained, and nuclear fission and fusion power production is treated elaborately.
The work presented in this thesis spans a wide range of experimental particle physics subjects, from level-1 trigger electronics to the final results of the search for the Higgs boson decaying to tau lepton pairs. The thesis describes an innovative reconstruction algorithm for tau decays and details how it was instrumental in providing a measurement of Z boson decays to tau lepton pairs. The reliability of the analysis is fully established by this measurement before the Higgs boson decay to tau lepton pairs is considered. The work described here continues to serve as a model for CMS analyses of Higgs boson decays to tau leptons.
In this second edition, the following recent papers have been added: "Gauss Codes, Quantum Groups and Ribbon Hopf Algebras", "Spin Networks, Topology and Discrete Physics", "Link Polynomials and a Graphical Calculus" and "Knots, Tangles and Electrical Networks". An appendix with a discussion of invariants of embedded graphs and Vassiliev invariants has also been included. This book is an introduction to knot and link invariants as generalized amplitudes (vacuum-vacuum amplitudes) for a quasi-physical process. The demands of knot theory, coupled with a quantum statistical framework, create a context that naturally and powerfully includes an extraordinary range of interrelated topics in topology and mathematical physics. The author takes a primarily combinatorial stance toward knot theory and its relations with these subjects. This has the advantage of providing very direct access to the algebra and to the combinatorial topology, as well as to the physical ideas. The book is divided into two parts: Part I is a systematic course in knots and physics starting from the ground up, and Part II is a set of lectures on various topics related to and sometimes based on Part I. Part II also explores side topics such as frictional properties of knots, relations with combinatorics, and knots in dynamical systems.
This volume provides a detailed discussion of the mathematical aspects and the physical applications of a new geometrical structure of space-time, based on a generalization ("deformation") of the usual Minkowski space, which is assumed to be endowed with a metric whose coefficients depend on the energy. This formalism is known as Deformed Special Relativity (DSR).
Moreover, the four-dimensional energy-dependent space-time is just a manifestation of a larger, five-dimensional space in which energy plays the role of a fifth (non-compactified) dimension. This new five-dimensional scheme (Deformed Relativity in Five Dimensions, DR5) represents a true generalization of the usual Kaluza-Klein (KK) formalism. The mathematical properties of such a generalized KK scheme are illustrated. They include the solutions of the five-dimensional Einstein equations in vacuum in most cases of physical relevance, the infinitesimal symmetries of the theory for the phenomenological metrics of the four interactions, and the study of the five-dimensional geodesics. The mathematical results concerning the geometry of the deformed five-dimensional spacetime (like its Killing symmetries) can be applied also to other multidimensional theories with infinite extra dimensions. Some experiments providing preliminary evidence for the hypothesized deformation of space-time for all the four fundamental interactions are discussed.
This book provides an introduction to the body of theory shared by several branches of modern optics--nonlinear optics, quantum electronics, laser physics, and quantum optics--with an emphasis on quantum and statistical aspects. It is intended for well-prepared undergraduate and graduate students in physics, applied physics, electrical engineering, and chemistry who seek a level of preparation mature enough to enable them to follow the specialized literature.
This thesis reports on the first studies of Standard Model photon production at the Large Hadron Collider (LHC) using the ATLAS detector. Standard Model photon production is a large background in the search for Higgs bosons decaying into photon pairs, and is thus critical to understand. The thesis explains the techniques used to reconstruct and identify photon candidates using the ATLAS detector, and describes a measurement of the production cross section for isolated prompt photons. The thesis also describes a search for the Higgs boson in which the analysis techniques used in the measurement are exploited to reduce and estimate non-prompt backgrounds in diphoton events.
This thesis describes a thorough analysis of the rare B meson decay into K*, based on data taken by the Belle Collaboration at the B-meson factory KEKB over 10 years. This reaction is very interesting because it in principle allows the observation of CP-violation effects. In the Standard Model, however, no CP violation is expected in this reaction, so an observation of CP asymmetries would immediately imply new physics. This thesis presents an amplitude analysis of this decay and the search for CP violation in detail, and discusses methods to solve related problems: the quantification of multivariate dependence and the improvement of the numerical evaluation speed of normalization integrals in amplitude analyses. In addition, it provides an overview of the theory, the experimental setup, the (blind) statistical data analysis and the estimation of systematic uncertainties.
But all the clocks in the city
Began to whirr and chime:
'O let not Time deceive you,
You cannot conquer Time.'
W. H. Auden

It is hard to think of a subject as rich, complex, and important as time. From the practical point of view it governs and organizes our lives (most of us are after all attached to a wrist watch), and it helps us wonderfully to find our way in unknown territory with the global positioning system (GPS). More generally it constitutes the heartbeat of modern technology. Time is the most precisely measured quantity, so the second defines the meter or the volt, and yet nobody knows for sure what it is; it has puzzled philosophers, artists, priests, and scientists for centuries as one of the enduring enigmas of all cultures. Indeed time is full of contrasts: taken for granted in daily life, it requires sophisticated experimental and theoretical treatments to be accurately "produced." We are trapped in its web, and it actually kills us all, but it also constitutes the stuff we need to progress and realize our objectives. There is nothing more boring and monotonous than the tick-tock of a clock, but how many fascinating challenges have physicists met to realize that monotony: quite a number of Nobel Prize winners have been directly motivated by them or have contributed significantly to time measurement.
This thesis reports the measurement of muon neutrino and antineutrino disappearance and electron neutrino and antineutrino appearance in a muon neutrino and antineutrino beam using the T2K experiment. It describes a result in neutrino physics that is a pioneering indication of charge-parity (CP) violation in neutrino oscillation, the first to be obtained from a single experiment. Neutrinos are some of the most abundant, but elusive, particles in the universe, and may provide a promising place to look for a potential solution to the puzzle of the matter/antimatter imbalance in the observable universe. It has been firmly established that neutrinos can change flavour (or 'oscillate'), as recognised by the 2015 Nobel Prize. The theory of neutrino oscillation allows neutrinos and antineutrinos to oscillate differently (CP violation), and may provide insights into why our universe is matter-dominated. Bayesian statistical methods, including the Markov chain Monte Carlo fitting technique, are used to simultaneously optimise several hundred systematic parameters describing detector, beam, and neutrino interaction uncertainties as well as the six oscillation parameters.
This is a collection of essays based on lectures that the author has given on various occasions on the foundations of quantum theory, symmetries and representation theory, and the quantum theory of the superworld created by physicists. The lectures are linked by a unifying theme: how the quantum world and the superworld appear under the lens of symmetry and supersymmetry. In the world of ultra-small times and distances, such as the Planck length and Planck time, physicists believe no measurements are possible, and so the structure of spacetime itself is an unknown that has to be understood first. There have been suggestions (the Volovich hypothesis) that world geometry at such energy regimes is non-Archimedean, and some of the lectures explore the consequences of such a hypothesis. Ultimately, symmetries and supersymmetries are described by representations of groups and supergroups. The author's interest in representation theory is a lifelong one that evolved slowly, and owes a great deal to conversations and discussions he had with George Mackey and Harish-Chandra. The book concludes with a retrospective look at these conversations.
Gravity, a Geometrical Course presents general relativity (GR) in a systematic and exhaustive way, covering three aspects that are homogenized into a single texture: i) the mathematical, geometrical foundations, exposed in a self-consistent contemporary formalism, ii) the main physical, astrophysical and cosmological applications, updated to the issues of contemporary research and observations, with glimpses of supergravity and superstring theory, and iii) the historical development of the scientific ideas underlying both the birth of general relativity and its subsequent evolution. The book, divided into two volumes, is a rich resource for graduate students and those who wish to gain a deep knowledge of the subject without an instructor. Volume One is dedicated to the development of the theory and basic physical applications. It guides the reader from the foundation of special relativity to the Einstein field equations, illustrating some basic applications in astrophysics. A detailed account of the historical and conceptual development of the theory is combined with the presentation of its mathematical foundations. Differentiable manifolds, fibre-bundles, differential forms, and the theory of connections are covered, with a sketchy introduction to homology and cohomology. (Pseudo-)Riemannian geometry is presented both in the metric and in the vielbein approach. Physical applications include the motions in a Schwarzschild field leading to the classical tests of GR (light-ray bending and periastron advance), the discussion of relativistic stellar equilibrium, white dwarfs, the Chandrasekhar mass limit and polytropes. An entire chapter is devoted to tests of GR and to the indirect evidence of gravitational wave emission. The formal structure of gravitational theory is at all stages compared with that of non-gravitational gauge theories, as a preparation for its modern extension, namely supergravity, discussed in the second volume.
Pietro Fre is Professor of Theoretical Physics at the University of Torino, Italy, and is currently serving as Scientific Counsellor of the Italian Embassy in Moscow. His scientific passion lies in supergravity and all allied topics, since the inception of the field in 1976. He was a professor at SISSA, and worked in the USA and at CERN. He has taught General Relativity for 15 years. He has previously written two scientific monographs, Supergravity and Superstrings and The N=2 Wonderland, and is also the author of a popular science book on cosmology and two novels, in Italian.
Quantum annealing employs quantum fluctuations in frustrated systems or networks to anneal the system down to its ground state, or more generally to its so-called minimum cost state. Often this procedure turns out to be more effective, in multivariable optimization problems, than its classical counterpart utilizing tunable thermal fluctuations. This volume is divided into three parts. Part I is an extensive tutorial introduction familiarizing the reader with the background material necessary to follow the core of the book. Part II gives a comprehensive account of the fundamentals and applications of the quantum annealing method, and Part III compares quantum annealing with other related optimization methods. This is the first book entirely devoted to quantum annealing and will be both an invaluable primer and guidebook for all advanced students and researchers in this important field.
This book provides an advanced introduction to extended theories of quantum field theory and algebraic topology, including Hamiltonian quantization associated with geometrical constraints, symplectic embedding, Hamilton-Jacobi quantization and Becchi-Rouet-Stora-Tyutin (BRST) symmetry, as well as de Rham cohomology. It offers a critical overview of the research in this area and unifies the existing literature, employing a consistent notation. Although the results presented apply in principle to all alternative quantization schemes, special emphasis is placed on the BRST quantization of constrained physical systems and its corresponding de Rham cohomology group structure. These were studied by theoretical physicists from the early 1960s and appeared in attempts to quantize rigorously some physical theories such as solitons and other models subject to geometrical constraints. In particular, phenomenological soliton theories such as the Skyrmion and chiral bag models have seen a revival following experimental data from the SAMPLE and HAPPEX Collaborations, and these are discussed. The book describes how these model predictions were shown to require rigorous treatments of geometrical constraints, because these constraints affect the predictions themselves. The application of the BRST symmetry to the de Rham cohomology contributes to a deep understanding of the Hilbert space of constrained physical theories. Aimed at graduate-level students in quantum field theory, the book will also serve as a useful reference for those working in the field. An extensive bibliography guides the reader towards the source literature on particular topics.
Mesoscopic physics deals with effects at submicron and nanoscales, where the conventional wisdom of macroscopic averaging is no longer applicable. A wide variety of new devices have recently evolved, all extremely promising for major novel directions in technology, including carbon nanotubes, ballistic quantum dots, and hybrid mesoscopic junctions made of different types of normal, superconducting and ferromagnetic materials. This, in turn, demands a profound understanding of fundamental physical phenomena on mesoscopic scales. As a result, the forefront of fundamental research in condensed matter has moved to areas where the interplay between electron-electron interactions and the quantum interference of phase-coherent electrons scattered by impurities and/or boundaries is the key to such understanding. An understanding of decoherence, as well as of other effects of the interactions, is crucial for developing future electronic, photonic and spintronic devices, including the element base for quantum computation.
This book introduces mathematicians, physicists, and philosophers to a new, coherent approach to the theory and interpretation of quantum physics, in which classical and quantum thinking live peacefully side by side and jointly fertilize the intuition. The formal, mathematical core of quantum physics is cleanly separated from the interpretation issues. The book demonstrates that the universe can be rationally and objectively understood from the smallest to the largest levels of modeling. The thermal interpretation featured in this book succeeds without any change in the theory. It involves one radical step, the reinterpretation of an assumption that was virtually never questioned before: the traditional eigenvalue link between theory and observation is replaced by a q-expectation link. Objective properties are given by q-expectations of products of quantum fields and what is computable from these. Averaging over macroscopic spacetime regions produces macroscopic quantities with negligible uncertainty, and leads to classical physics.
- Reflects the actual practice of quantum physics.
- Models the quantum-classical interface through coherent spaces.
- Interprets both quantum mechanics and quantum field theory.
- Eliminates probability and measurement from the foundations.
- Proposes a novel solution of the measurement problem.
Quantum information may sound like science fiction but is, in fact, an active and extremely promising area of research, with a big dream: to build a quantum computer capable of solving problems that a classical computer could not even begin to handle. Research in quantum information science is now at an advanced enough stage for this dream to be credible and well worth pursuing. It is, at the same time, too early to predict how quantum computers will be built, and which potential technologies will eventually strike gold in their ability to manipulate and process quantum information. One direction that has reaped many successes in quantum information processing relies on continuous variables. This area is bustling with theoretical and experimental achievements, from continuous-variable teleportation to in-principle demonstrations of universal computation and efficient error correction. Now the time has come to compile some of the major results into one volume. In this book the leading researchers of the field present up-to-date developments of continuous-variable quantum information. This book is organized to suit many reader levels, with introductions to every topic and in-depth discussions of theoretical and experimental results.
This book is about the strategic relevance of quantum technologies and debates the military-specific aspects of these technologies. Its chapters cohere around two themes. The first discusses the global pattern of ongoing civilian and military research on quantum computers, quantum cryptography, quantum communications and the quantum internet. The second explicitly identifies the relevance of these technologies in the military domain and the possible nature of quantum technology-based weapons. This thread further debates the quantum (arms) race at the global level in general, and in the context of the USA and China in particular. The book argues that the defence utility of these technologies is increasingly becoming obvious and is likely to change the nature of warfare in the future.
Quantum coherence is a phenomenon that plays a crucial role in various forms of matter. The thriving field of quantum information as well as unconventional approaches to use mesoscopic systems in future optoelectronic devices provide the exciting background for this set of lectures. The lectures originate from the well-known Schladming Winter Schools and are carefully edited so as to address a broad readership ranging from the beginning graduate student up to the senior scientist wanting to keep up with or to enter newly emerging fields of research.
This thesis describes in detail a search for weakly interacting massive particles as possible dark matter candidates, making use of so-called mono-jet events. It includes a detailed description of the run-1 system, important operational challenges, and the upgrade for run-2. The nature of dark matter, which accounts for roughly 25% of the energy-matter content of the universe, is one of the biggest open questions in fundamental science. The analysis is based on the full set of proton-proton collisions collected by the ATLAS experiment at the Large Hadron Collider at √s = 8 TeV. Special attention is given to the experimental challenges and analysis techniques, as well as the overall scientific context beyond particle physics. The results complement those of non-collider experiments and yield some of the strongest exclusion bounds on parameters of dark matter models by the end of the Large Hadron Collider run-1. Details of the upgrade of the ATLAS Central Trigger for run-2 are also included.
This thesis has two parts, each based on an application of the renormalization group (RG). The first part is an analysis of the d-dimensional Coulomb gas. The goal was to determine whether the Wilson RG could provide input into particle-in-cell simulations in plasma physics, which are the main family of simulation methods used in this field. The role of the RG was to identify the effect of coarse-graining on the coupling constants as a function of the cut-offs. The RG calculation reproduced established results, but in a more concise form, and showed the effect of the cut-offs on the Debye screening length.
This thesis reports on major steps towards the realization of scalable quantum networks. It addresses the experimental implementation of a deterministic interaction mechanism between flying optical photons and a single trapped atom. In particular, it demonstrates the nondestructive detection of an optical photon. To this end, single rubidium atoms are trapped in a three-dimensional optical lattice at the center of an optical cavity in the strong coupling regime. Full control over the atomic state - its position, its motion, and its electronic state - is achieved with laser beams applied along the resonator and from the side. When faint laser pulses are reflected from the resonator, the combined atom-photon state acquires a state-dependent phase shift. In a first series of experiments, this is employed to nondestructively detect optical photons by measuring the atomic state after the reflection process. Then, quantum bits are encoded in the polarization of the laser pulse and in the Zeeman state of the atom. The state-dependent phase shift mediates a deterministic universal quantum gate between the atom and one or two successively reflected photons, which is used to generate entangled atom-photon, atom-photon-photon, and photon-photon states out of separable input states.