1-11 of 11 Results for: Particles and Fields

Article

AdS3 Gravity and Holography  

Per Kraus

General relativity in three spacetime dimensions is a simplified model of gravity, possessing no local degrees of freedom, yet rich enough to admit black-hole solutions and other phenomena of interest. In the presence of a negative cosmological constant, the asymptotically anti–de Sitter (AdS) solutions admit a symmetry algebra consisting of two copies of the Virasoro algebra, with central charge inversely proportional to Newton’s constant. The study of this theory is greatly enriched by the AdS/CFT correspondence, which in this case implies a relationship to two-dimensional conformal field theory. General aspects of this theory can be understood by focusing on universal properties such as symmetries. The best understood examples of the AdS3/CFT2 correspondence arise from string theory constructions, in which case the gravity sector is accompanied by other propagating degrees of freedom. A question of recent interest is whether pure three-dimensional gravity can be made sense of as a quantum theory with a holographic dual. Attempting to answer this question requires making sense of the path integral over asymptotically AdS3 geometries.
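The central charge alluded to above is the Brown–Henneaux result. As a sketch (standard notation: ℓ is the AdS3 radius, G_N Newton’s constant), the asymptotic symmetry algebra consists of two Virasoro copies with

```latex
% Brown-Henneaux central charge for AdS3 of radius \ell
c = \bar{c} = \frac{3\ell}{2 G_N},
\qquad
[L_m, L_n] = (m-n)\, L_{m+n} + \frac{c}{12}\, m (m^2 - 1)\, \delta_{m+n,0}
```

The inverse proportionality to G_N means the semiclassical gravity regime corresponds to large central charge in the dual CFT.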

Article

Calabi-Yau Spaces in the String Landscape  

Yang-Hui He

Calabi-Yau spaces, or Kähler spaces admitting zero Ricci curvature, have played a pivotal role in theoretical physics and pure mathematics for the last half century. In physics, they constituted the first and natural solution to compactification of superstring theory to our 4-dimensional universe, primarily because one of their equivalent definitions is that they admit covariantly constant spinors. Since the mid-1980s, physicists and mathematicians have joined forces in creating explicit examples of Calabi-Yau spaces, compiling databases of formidable size, including the complete intersection (CICY) data set, the weighted hypersurfaces data set, the elliptic-fibration data set, the Kreuzer-Skarke toric hypersurface data set, generalized CICYs, etc., totaling at least on the order of 10^10 manifolds. These all contribute to the vast string landscape, the multitude of possible vacuum solutions to string compactification. More recently, this collaboration has been enriched by computer science and data science, the former in benchmarking the complexity of the algorithms in computing geometric quantities, and the latter in applying techniques such as machine learning in extracting unexpected information. These endeavours, inspired by the physics of the string landscape, have rendered the investigation of Calabi-Yau spaces one of the most exciting and interdisciplinary fields.
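The equivalent characterizations mentioned above can be summarized schematically. For a compact Kähler n-fold X, the following conditions coincide (the first equivalence is Yau’s theorem; the statement is a sketch, suppressing technical hypotheses):

```latex
% Equivalent characterizations of a Calabi-Yau n-fold X (compact Kahler)
c_1(X) = 0
\;\Longleftrightarrow\;
\exists \text{ K\"ahler metric with } R_{i\bar{j}} = 0
\;\Longleftrightarrow\;
\mathrm{Hol}(X) \subseteq SU(n)
\;\Longleftrightarrow\;
\exists\, \varepsilon \text{ with } \nabla_\mu \varepsilon = 0
```

The last condition, the existence of a covariantly constant spinor ε, is the one singled out in the abstract as the reason these spaces preserve supersymmetry under compactification.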

Article

Dark Matter  

Timothy Sumner

Dark matter is one of the most fundamental and perplexing issues of modern physics. Its presence is deduced from a straightforward application of Newton’s theory of gravity to astronomical systems whose dynamical motion should be simple to understand. The success of Newton’s theory in describing the behavior of the solar system was one of the greatest achievements of the 18th century. Its subsequent use to deduce the presence of a previously unknown planet, Neptune, discovered in 1846, was the first demonstration of how minor departures from its predictions indicated additional mass. The expectation in the early 20th century, as astronomical observations allowed more distant and larger celestial systems to be studied, was that galaxies and collections of galaxies should behave like larger solar systems, albeit more complicated. However, the reality was quite different. The discrepancy is not minor, as in the case that led to the discovery of Neptune; it is extreme. The stars at the edges of galaxies are not behaving at all like Pluto at the edge of the solar system. Instead of having a slower orbital speed, as expected and shown by Pluto, they have the same speed as those much further in. If Newton’s law is to be retained, there must be much more mass in the galaxy than can be seen, and it must be distributed out to large distances, beyond the visible extent of the galaxy. This unseen mass is called “dark matter,” and its presence was becoming widely accepted by the 1970s. Subsequently, many other types of astrophysical observations covering many other types of object were made that came to the same conclusions. The ultimate realization was that the universe itself requires dark matter to explain how it developed the structures within it observed today. The current consensus is that one-fourth of the universe is dark matter, whereas only one-twentieth is normal matter.
This leaves the majority in some other form, and therein lies another mystery—“dark energy.” The modern form of Newton’s laws is general relativity, due to Albert Einstein. This offers no help in solving the problem of dark matter because most of the systems involved are nonrelativistic and the solutions to the general theory of relativity (GR) reproduce Newtonian behavior. However, it would not be right to avoid mentioning the possibility of modifying Newton’s laws (and hence GR) in such a way as to change the nonrelativistic behavior to explain the way galaxies behave, but without changing the solar system dynamics. Although this is a minority view, it nonetheless survives within the scientific community. Understanding the nature of dark matter is one of the most intensely competitive research areas, and the solution will be of profound importance to astrophysics, cosmology, and fundamental physics. There is thus a huge “industry” of direct detection experiments predicated on the premise that there is a new particle species yet to be found that pervades the universe. There are also experiments searching for evidence of the particles via their decay or annihilation products, and, finally, there are intense searches for newly formed unknown particles in collider experiments.
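The rotation-curve argument sketched above can be illustrated numerically. The following is a minimal sketch (the masses and the halo normalization are hypothetical, chosen only for illustration): a central point mass gives the Keplerian falloff v ∝ 1/√r seen in the solar system, while an enclosed mass growing linearly with radius gives the flat rotation curves actually observed in galaxies.

```python
import math

# Gravitational constant in units of kpc * (km/s)^2 / M_sun (approximate value)
G = 4.30091e-6

def circular_velocity(m_enclosed, r):
    """Circular orbital speed (km/s) for mass m_enclosed (M_sun) within radius r (kpc)."""
    return math.sqrt(G * m_enclosed / r)

# Keplerian case: all mass concentrated at the center, as in the solar system.
M_central = 1.0e11  # hypothetical galaxy-scale point mass
v_kepler = [circular_velocity(M_central, r) for r in (5, 10, 20, 40)]
# Speed falls as 1/sqrt(r): quadrupling r halves v.

# Flat-curve case: enclosed mass grows linearly with radius, M(r) = k * r,
# as for an isothermal dark-matter halo.
k = 2.0e10  # hypothetical halo normalization, M_sun per kpc
v_flat = [circular_velocity(k * r, r) for r in (5, 10, 20, 40)]
# v = sqrt(G * k) at every radius: a flat rotation curve.
```

The contrast between the two lists is the whole observational puzzle: measured galactic rotation curves look like `v_flat`, not `v_kepler`, so either the enclosed mass keeps growing beyond the visible disk or the force law changes.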

Article

Electroweak Interactions and W, Z Boson Properties  

Maarten Boonekamp and Matthias Schott

With the huge success of quantum electrodynamics (QED) in describing electromagnetic interactions in nature, several attempts have been made to extend the concept of gauge theories to the other known fundamental interactions. It was realized in the late 1960s that electromagnetic and weak interactions can be described by a single unified gauge theory. In addition to the photon, the single mediator of the electromagnetic interaction, this theory predicted new, heavy particles responsible for the weak interaction, namely the W and the Z bosons. A scalar field, the Higgs field, was introduced to generate their mass. The discovery of the mediators of the weak interaction in 1983, at the European Organization for Nuclear Research (CERN), marked a breakthrough in fundamental physics and opened the door to more precise tests of the Standard Model. Subsequent measurements of the weak boson properties allowed the mass of the top quark and of the Higgs boson to be predicted before their discovery. Nowadays, these measurements are used to further probe the consistency of the Standard Model, and to place constraints on theories attempting to answer open questions in physics, such as the presence of dark matter in the universe or unification of the electroweak and strong interactions with gravity.
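The mass generation mentioned above is captured by the standard tree-level relations (a schematic sketch: g and g′ are the SU(2) and U(1) gauge couplings, v the Higgs vacuum expectation value, θ_W the weak mixing angle):

```latex
m_W = \tfrac{1}{2}\, g v,
\qquad
m_Z = \tfrac{1}{2}\, v \sqrt{g^2 + g'^2} = \frac{m_W}{\cos\theta_W},
\qquad
e = g \sin\theta_W
```

Because these relations receive calculable quantum corrections sensitive to heavy particles in loops, precise W and Z measurements could anticipate the top-quark and Higgs-boson masses, as the abstract notes.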

Article

Progress in Gamma Detection for Basic Nuclear Science and Applications  

J. Simpson and A. J. Boston

The atomic nucleus, consisting of protons and neutrons, is a unique strongly interacting quantum mechanical system that makes up 99.9% of all visible matter. From the inception of gamma-ray detectors to the early 21st century, advances in gamma detection have allowed researchers to broaden their understanding of the fundamental properties of all nuclei and their interactions. Key technical advances have enabled the development of state-of-the-art instruments that are expected to address a wide range of nuclear science at the extremes of the nuclear landscape, excitation energy, spin, stability, and mass. The realisation of efficient gamma detection systems has an impact on many applications, such as medical imaging, environmental radiation monitoring, and security. Even though the technical advances made so far are remarkable, further improvements are continually being implemented or planned.

Article

Quantum Quench and Universal Scaling  

Sumit R. Das

A quantum quench is a process in which a parameter of a many-body system or quantum field theory is changed in time, taking an initial stationary state into a complicated excited state. Traditionally, “quench” refers to a process where this time dependence is fast compared to all scales in the problem. However, in recent years the terminology has been generalized to include smooth changes that are slow compared to initial scales in the problem, but become fast compared to the physical scales at some later time, leading to a breakdown of adiabatic evolution. The quantum quench has recently been used as a theoretical tool to study many aspects of nonequilibrium physics, like thermalization and universal aspects of critical dynamics. Relatively recent experiments in cold atom systems have implemented such quench protocols, which explore dynamical passages through critical points, and study in detail the process of relaxation to a steady state. On the other hand, quenches which remain adiabatic have been explored as a useful technique in quantum computation.
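The universal scaling near a critical point has a well-known estimate, the Kibble–Zurek argument. As a sketch: for a linear ramp through a critical point over a timescale τ_Q, with correlation-length exponent ν and dynamical exponent z, adiabaticity breaks down when the system’s relaxation time exceeds the time remaining to the transition, freezing in a correlation length and a corresponding defect density

```latex
% Kibble-Zurek scaling for a ramp of duration ~ \tau_Q in d spatial dimensions
\hat{\xi} \sim \tau_Q^{\,\nu/(1+\nu z)},
\qquad
n_{\text{defects}} \sim \hat{\xi}^{\,-d} \sim \tau_Q^{\,-d\nu/(1+\nu z)}
```

Scaling laws of this type are among the universal aspects of critical dynamics that quench experiments in cold atom systems have probed.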

Article

Strange Metals and Black Holes: Insights From the Sachdev-Ye-Kitaev Model  

Subir Sachdev

Complex many-particle quantum entanglement is a central theme in two distinct major topics in physics: the strange metal state found in numerous correlated electron compounds and the quantum theory of black holes in Einstein gravity. The Sachdev-Ye-Kitaev model provides a solvable theory of entangled many-particle quantum states without quasiparticle excitations. This toy model has led to realistic universal models of strange metals and to new insights on the quantum states of black holes.
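As a sketch of the model named above: the Majorana version of the Sachdev-Ye-Kitaev Hamiltonian couples N Majorana fermions χ_i in groups of four, with independent Gaussian random couplings,

```latex
H = \sum_{1 \le i < j < k < l \le N} J_{ijkl}\, \chi_i \chi_j \chi_k \chi_l,
\qquad
\langle J_{ijkl} \rangle = 0,
\quad
\langle J_{ijkl}^2 \rangle = \frac{3!\, J^2}{N^3}
```

The all-to-all random interactions are what make the model solvable at large N while scrambling quantum information maximally fast, the property connecting it to both strange metals and black holes.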

Article

String Field Theory  

Carlo Maccaferri

More than 50 years after the Veneziano amplitude, the fundamental formulation of string theory (ST) remains elusive. On the one hand, there is the world-sheet formulation that is truly microscopic but which is only valid for infinitesimal string coupling and is highly background dependent. On the other hand, there are other nonperturbative background independent approaches such as supergravity or other quantum theories of (super) Yang–Mills type, which, however, necessarily miss some of the features of the extended nature of the string (although in some cases they can be holographically equivalent to a ST). In string field theory (SFT), it is possible to keep an exact microscopic world-sheet description together with a complete space–time framework that follows the rules of quantum field theory (QFT) and where nonperturbative contributions can be, at least in principle, coherently accounted for. String field theory is a formulation of ST as a QFT for an infinite number of fields (the various oscillation modes of the string) in spacetime. This formulation allows one to better treat some of the shortcomings of the usual on-shell formulation of ST while maintaining at the same time a full microscopic world-sheet approach. The construction of SFTs is such that ST world-sheet amplitudes are reproduced when these are well-defined. But SFT gives a more general construction of amplitudes that is well-defined even when the standard world-sheet approach gives rise to divergences. In this very general framework, all the elementary string interactions are defined so as to provide a solution to the quantum Batalin–Vilkovisky master equation, furnishing a perturbative microscopic definition of the target space path integral of ST. This construction is explicitly realized in terms of (quantum) homotopy algebras for both bosonic strings and superstrings, including Type II, Type I, and heterotic.
The construction offered by SFT allows one to define the one-particle irreducible (1PI) effective action of ST and thus to give a definition of string perturbation theory where it is possible to discuss quantum effects such as vacuum shifts due to tadpoles and mass renormalization. The explicit knowledge of microscopic ultraviolet SFTs allows one to construct the low-energy ST effective action as the Wilsonian action by integrating out the massive string states from the SFT path integral. This top-down construction is safe from infrared divergences and has been very useful for obtaining unambiguous results on nonperturbative contributions, such as D-instanton corrections to perturbative amplitudes and effective superpotentials. String field theories (especially open string field theories [OSFTs]) allow one to approach background independence in ST by recasting the plethora of different ST backgrounds in the form of classical solutions to the SFT equation of motion. This program has been fully realized in critical bosonic OSFT, where any D-brane system can be explicitly written as a classical solution of the OSFT on any other D-brane system.
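For orientation, the best-studied example of the OSFTs mentioned above is Witten’s cubic open bosonic string field theory, whose classical action and equation of motion are (schematically: Ψ is the string field, Q the BRST operator, * the star product, g the open string coupling)

```latex
S[\Psi] = -\frac{1}{g^2} \left[ \frac{1}{2} \langle \Psi, Q \Psi \rangle
+ \frac{1}{3} \langle \Psi, \Psi * \Psi \rangle \right],
\qquad
Q\Psi + \Psi * \Psi = 0
```

Classical solutions of this equation of motion are precisely what realize other D-brane backgrounds in the program described above.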

Article

Supersymmetric QFT in Six Dimensions  

Alessandro Tomasiello

Quantum field theory (QFT) in six dimensions is more challenging than its four-dimensional counterpart: most models tend to become ill-defined at high energies. A combination of supersymmetry and string theory has yielded many QFTs that evade this problem and are low-energy effective manifestations of conformal field theories (CFTs). Besides the usual vector, spinor and scalar fields, the new ingredients are self-dual tensor fields, analogs of the electromagnetic field with an additional spacetime index, sometimes with an additional non-Abelian structure. A recent wave of interest in this field has produced several classification results, notably of models that have a holographic dual in string theory and of models that can be realized in F-theory. Several precise quantitative checks of the overall picture are now available, and give confidence that a full classification of all six-dimensional CFTs may be at hand.
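The self-dual tensor fields mentioned above can be sketched as follows: a 2-form potential B, generalizing the electromagnetic potential by one spacetime index, has a 3-form field strength constrained to equal its own Hodge dual,

```latex
H = dB, \qquad H = \star H
```

This constraint is consistent in six-dimensional Lorentzian signature, where ⋆² = +1 on 3-forms, and it is what makes a conventional Lagrangian description of these theories so difficult.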

Article

The Conformal Bootstrap  

Miguel Fernandes Paulos

Conformal field theories (CFTs) have a wide range of experimental and theoretical applications. They describe classical and quantum critical phenomena, low (or high) energy limits of quantum field theories, and even quantum gravity via the Anti-de Sitter space/CFT correspondence (AdS/CFT). Most interesting CFTs are strongly interacting and difficult to analyze. The Conformal Bootstrap program is an approach that exploits only basic consistency conditions of CFTs, such as unitarity, locality, and symmetry, encoded into a set of bootstrap equations. The hope is that such conditions might be strong enough to uniquely determine the full set of consistent theories. This philosophy was first used successfully in the 1980s to analytically determine and classify large classes of critical phenomena in two spatial dimensions. Starting from 2008, major developments have allowed the exploration of CFTs in more general spacetime dimension. The key breakthrough was to realize that one could exploit methods from linear and semidefinite optimization theory to analyze the bootstrap equations and obtain strong, universal constraints on the space of CFTs. The Conformal Bootstrap has led to a number of important results in the study of CFTs. One of the main outcomes consists of general bounds on the data defining a CFT, such as critical exponents and operator–product expansion coefficients. This has been done for a number of contexts, such as different space-time dimensions, global symmetry groups, and various amounts of supersymmetry. More remarkably, this approach not only leads to general results on the space of theories but is also powerful enough to give extremely precise determinations of the properties of specific models, such as the critical exponents of the critical 3d Ising and O(2) models. Finally, the Conformal Bootstrap program also includes the formal study and non-perturbative definition of CFTs and their observables.
These include not only the study of Euclidean correlation functions but also a study of their properties in Lorentzian signature; the study of defects, interfaces, and boundary conditions; finite temperature; and connections to the AdS/CFT correspondence.
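The bootstrap equations and the optimization step described above can be sketched for identical scalars φ of dimension Δ_φ. Crossing symmetry of the four-point function gives a sum rule over the operators O in the φ × φ operator product expansion, with coefficients positive by unitarity,

```latex
\sum_{\mathcal{O}} a_{\mathcal{O}}\, F_{\Delta,\ell}(u,v) = 0,
\qquad
a_{\mathcal{O}} = \lambda_{\phi\phi\mathcal{O}}^2 \ge 0,
\qquad
F_{\Delta,\ell}(u,v) \equiv v^{\Delta_\phi} g_{\Delta,\ell}(u,v) - u^{\Delta_\phi} g_{\Delta,\ell}(v,u)
```

The optimization idea is to search for a linear functional α with α(F_identity) = 1 and α(F_{Δ,ℓ}) ≥ 0 for every operator in a putative spectrum; applying α to the sum rule then yields a contradiction, excluding that spectrum. Searching over α is a linear (or, with refinements, semidefinite) program, which is how the universal bounds on CFT data are obtained.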

Article

The Partonic Content of Nucleons and Nuclei  

Juan Rojo

Deepening our knowledge of the partonic content of nucleons and nuclei represents a central endeavor of modern high-energy and nuclear physics, with ramifications in related disciplines, such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, they address fundamental open issues in our understanding of the strong interaction, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central input to theoretical predictions in a wide range of experiments, from proton and heavy-ion collisions at the Large Hadron Collider to ultra-high-energy neutrino interactions at neutrino telescopes.
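The role of parton distributions in collider predictions is captured by the collinear factorization formula, sketched here schematically (f_i are the parton distribution functions, σ̂ the perturbatively calculable partonic cross-section, μ the factorization scale, P_ij the splitting functions governing scale evolution):

```latex
\sigma(pp \to X) = \sum_{i,j} \int_0^1 dx_1\, dx_2\,
f_i(x_1, \mu^2)\, f_j(x_2, \mu^2)\,
\hat{\sigma}_{ij \to X}(x_1 x_2 s, \mu^2),
\qquad
\mu^2 \frac{\partial f_i}{\partial \mu^2} = \sum_j \frac{\alpha_s}{2\pi}\, P_{ij} \otimes f_j
```

The nonperturbative inputs f_i must be extracted from data, which is why pinning them down precisely underlies predictions from the Large Hadron Collider to neutrino telescopes.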