1-9 of 9 Results

  • Keywords: entanglement

Article

Gravity and Quantum Entanglement  

Mukund Rangamani and Veronika Hubeny

The holographic entanglement entropy proposals give an explicit geometric encoding of spatially ordered quantum entanglement in continuum quantum field theory. These proposals have been developed in the context of the AdS/CFT correspondence, which posits a quantum duality between gravitational dynamics in anti-de Sitter (AdS) spacetimes and that of a conformal field theory (CFT) in one fewer dimension. The von Neumann entropy of a spatial region of the CFT is given by the area of a particular extremal surface in the dual geometry. This surprising connection between a fundamental quantum mechanical concept and a simple geometric construct has given deep insights into the nature of the holographic map and potentially holds an important clue to unraveling the mysteries of quantum gravity.
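
For reference, the entropy–area relation summarized in this abstract has a compact standard form. In the static (Ryu–Takayanagi) case of the proposal, with γ_A the minimal-area extremal surface in the bulk anchored on the boundary of region A and homologous to it, and G_N the bulk Newton constant, it reads:

```latex
% Holographic entanglement entropy (Ryu-Takayanagi form; schematic):
% S_A      : von Neumann entropy of the reduced state on boundary region A
% \gamma_A : bulk extremal surface anchored on \partial A, homologous to A
% G_N      : bulk Newton constant
S_A \;=\; \frac{\mathrm{Area}(\gamma_A)}{4\, G_N}
```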

Article

Quantum Quench and Universal Scaling  

Sumit R. Das

A quantum quench is a process in which a parameter of a many-body system or quantum field theory is changed in time, taking an initial stationary state into a complicated excited state. Traditionally, “quench” refers to a process where this time dependence is fast compared to all scales in the problem. In recent years, however, the terminology has been generalized to include smooth changes that are slow compared to the initial scales in the problem but become fast compared to the physical scales at some later time, leading to a breakdown of adiabatic evolution. Quantum quenches have recently been used as a theoretical tool to study many aspects of nonequilibrium physics, such as thermalization and universal aspects of critical dynamics. Relatively recent experiments in cold atom systems have implemented such quench protocols, exploring dynamical passages through critical points and studying in detail the process of relaxation to a steady state. On the other hand, quenches that remain adiabatic have been explored as a useful technique in quantum computation.
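
A canonical example of the universal critical dynamics mentioned here is Kibble–Zurek scaling, stated schematically below (a standard result, not quoted from the article): for a slow linear ramp through a critical point with quench time τ_Q, adiabaticity fails near the transition, freezing in a correlation length set by the equilibrium exponents ν and z:

```latex
% Kibble-Zurek scaling for a linear passage through a critical point.
% \tau_Q : quench time, \nu : correlation-length exponent,
% z      : dynamical exponent, d : spatial dimension.
\hat{\xi} \;\sim\; \tau_Q^{\,\nu/(1+z\nu)},
\qquad
n_{\text{defects}} \;\sim\; \hat{\xi}^{\,-d} \;\sim\; \tau_Q^{\,-d\nu/(1+z\nu)}
```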

Article

The Philosophical Significance of Decoherence  

Elise Crull

Quantum decoherence is a physical process resulting from the entanglement of a system with environmental degrees of freedom. The entanglement allows the environment to behave like a measuring device on the initial system, resulting in the dynamical suppression of interference terms in mutually commuting bases. Because decoherence processes are extremely fast and often practically irreversible, measurements performed on the system after system–environment interactions typically yield outcomes empirically indistinguishable from physical collapse of the wave function. That is: environmental decoherence of a system’s phase relations produces effective eigenstates of a system in certain bases (depending on the details of the interaction) through prodigious damping—but not destruction—of the system’s off-diagonal terms in those bases. Although decoherence by itself is neither an interpretation of quantum physics nor indeed even new physics, there is much debate concerning the implications of this process in both the philosophical and the scientific literature. This is especially true regarding fundamental questions arising from quantum theory about the roles of measurement, observation, the nature of entanglement, and the emergence of classicality. In particular, acknowledging the part decoherence plays in interpretations of quantum mechanics recasts that debate in a new light.
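
The damping of off-diagonal terms described above takes a simple standard form for a single qubit undergoing pure dephasing in its pointer basis (a schematic model with a generic decoherence time τ_D, not tied to any specific environment):

```latex
% Qubit state a|0> + b|1> monitored by an environment in the {|0>,|1>} basis:
% populations persist, coherences decay on the decoherence time \tau_D.
\rho(t) \;=\;
\begin{pmatrix}
|a|^2 & a b^{*}\, e^{-t/\tau_D} \\
a^{*} b\, e^{-t/\tau_D} & |b|^2
\end{pmatrix}
\;\longrightarrow\;
\begin{pmatrix}
|a|^2 & 0 \\
0 & |b|^2
\end{pmatrix}
\quad (t \gg \tau_D)
```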

Article

From the Interpretation of Quantum Mechanics to Quantum Technologies  

Olival Freire Junior

Quantum mechanics emerged laden with issues and doubts about its foundations and interpretation. However, nobody in the 1920s and 1930s dared to conjecture that research on such issues would open the doors to developments so huge as to require the term second quantum revolution to describe them. On the one hand, the new theory saw its scope of applications widen in various domains including atoms, molecules, light, the interaction between light and matter, relativistic effects, field quantization, nuclear physics, and solid state and particle physics. On the other hand, there were debates on alternative interpretations, the status of statistical predictions, the completeness of the theory, the underlying logic, mathematical structures, the understanding of measurements, and the transition from the quantum to the classical description. Until the early 1960s, there seemed to be a coexistence between these two orders of issues, without any interaction between them. From the late 1960s on, however, this landscape underwent dramatic changes. The main factor of change was Bell’s theorem, which implied a conflict between the predictions of quantum mechanics for certain spatially separated systems and the assumption of local realism. Experimental tests of this theorem led to the corroboration of quantum predictions and the understanding of quantum entanglement as a physical feature, a result that justified the 2022 Nobel Prize in Physics. Another theoretical breakthrough was the understanding and calculation of the interaction of a quantum system with its environment, leading to the transition from pure to mixed states, a feature now known as decoherence. Entanglement and decoherence both resulted from the dialogue between research on the foundations and quantum predictions. In addition, research on quantum optics and quantum gravity benefited from debates on the foundations. From the early 1980s on, another major change occurred, now in terms of experimental techniques, allowing physicists to manipulate single quantum systems and taking the thought experiments of the founders of quantum mechanics into the labs. Lastly, the insight that quantum systems may be used in computing opened the doors to the first quantum algorithms. Altogether, these developments have produced a new field of research, quantum information, which has quantum computers as its holy grail. The term second quantum revolution distinguishes these new achievements from the first spin-offs of quantum mechanics, for example, transistors, electron microscopes, magnetic resonance imaging, and lasers. Nowadays the applications of this second revolution have gone beyond computing to include sensors and metrology, for instance, and thus are better labeled as quantum technologies.
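
For orientation, the conflict identified by Bell’s theorem has a standard quantitative form, the CHSH inequality (a textbook statement added here for reference; the article itself is historical):

```latex
% CHSH correlator: E(a,b) is the outcome correlation for settings a, b.
% Local realism (LR) bounds |S| by 2; quantum mechanics (QM) reaches
% 2*sqrt(2) on entangled pairs (Tsirelson's bound).
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'),
\qquad
|S|_{\text{LR}} \le 2 \;<\; 2\sqrt{2} = |S|^{\max}_{\text{QM}}
```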

Article

Quantum Error Correction  

Todd A. Brun

Quantum error correction is a set of methods to protect quantum information—that is, quantum states—from unwanted environmental interactions (decoherence) and other forms of noise. The information is stored in a quantum error-correcting code, which is a subspace in a larger Hilbert space. This code is designed so that the most common errors move the state into an error space orthogonal to the original code space while preserving the information in the state. It is possible to determine whether an error has occurred by a suitable measurement and to apply a unitary correction that returns the state to the code space without measuring (and hence disturbing) the protected state itself. In general, codewords of a quantum code are entangled states. No code that stores information can protect against all possible errors; instead, codes are designed to correct a specific error set, which should be chosen to match the most likely types of noise. An error set is represented by a set of operators that can multiply the codeword state. Most work on quantum error correction has focused on systems of quantum bits, or qubits, which are two-level quantum systems. These can be physically realized by the states of a spin-1/2 particle, the polarization of a single photon, two distinguished levels of a trapped atom or ion, the current states of a microscopic superconducting loop, or many other physical systems. The most widely used codes are the stabilizer codes, which are closely related to classical linear codes. The code space is the joint +1 eigenspace of a set of commuting Pauli operators on n qubits, called stabilizer generators; the error syndrome is determined by measuring these operators, which allows errors to be diagnosed and corrected. A stabilizer code is characterized by three parameters [[n, k, d]], where n is the number of physical qubits, k is the number of encoded logical qubits, and d is the minimum distance of the code (the smallest number of simultaneous qubit errors that can transform one valid codeword into another). Every useful code has n > k; this physical redundancy is necessary to detect and correct errors without disturbing the logical state. Quantum error correction is used to protect information in quantum communication (where quantum states pass through noisy channels) and quantum computation (where quantum states are transformed through a sequence of imperfect computational steps in the presence of environmental decoherence to solve a computational problem). In quantum computation, error correction is just one component of fault-tolerant design. Other approaches to error mitigation in quantum systems include decoherence-free subspaces, noiseless subsystems, and dynamical decoupling.
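
As a minimal sketch of the stabilizer machinery described above (using the three-qubit bit-flip code, a standard textbook example not specific to this article), the numpy simulation below stores one logical qubit in three physical qubits, applies a bit-flip error, measures the two stabilizer generators to obtain a syndrome, and applies the matching unitary correction; the encoded amplitudes a and b are never read out:

```python
import numpy as np

# Three-qubit bit-flip code: |0>_L = |000>, |1>_L = |111>.
# Stabilizer generators Z0 Z1 and Z1 Z2; their +/-1 values form the syndrome.

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron3(a, b, c):
    """Tensor product of three single-qubit operators."""
    return np.kron(np.kron(a, b), c)

# Encode a|0> + b|1> as a|000> + b|111> (amplitudes chosen arbitrarily).
a, b = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b111] = a, b

# A single bit flip on the middle qubit: one element of the error set.
state = kron3(I2, X, I2) @ state

# Measure the stabilizer generators. The corrupted codeword is an exact
# eigenstate of both, so the expectation values are sharp +/-1 outcomes.
s1 = int(np.sign(state @ kron3(Z, Z, I2) @ state))   # Z0 Z1
s2 = int(np.sign(state @ kron3(I2, Z, Z) @ state))   # Z1 Z2

# Syndrome lookup: which single-qubit X error (if any) matches the outcome.
correction = {(+1, +1): None,
              (-1, +1): kron3(X, I2, I2),
              (-1, -1): kron3(I2, X, I2),
              (+1, -1): kron3(I2, I2, X)}[(s1, s2)]
if correction is not None:
    state = correction @ state

print((s1, s2))                    # (-1, -1): flags the middle qubit
print(state[0b000], state[0b111])  # 0.6, 0.8 -- amplitudes recovered intact
```

Larger stabilizer codes such as the [[7, 1, 3]] Steane code follow the same measure-syndrome-then-correct pattern, with more generators and protection against phase errors as well.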

Article

Circuit Model of Quantum Computation  

James Wootton

Quantum circuits are an abstract framework to represent quantum dynamics. They are used to formally describe and reason about processes within quantum information technology. They are primarily used in quantum computation, quantum communication, and quantum cryptography—for which they provide a machine code–level description of quantum algorithms and protocols. The quantum circuit model is an abstract representation of these technologies based on the use of quantum circuits, with which algorithms and protocols can be concretely developed and studied. Quantum circuits are typically based on the concept of qubits: two-level quantum systems that serve as a fundamental unit of quantum hardware. In their simplest form, circuits take a set of qubits initialized in a simple known state, apply a set of discrete single- and two-qubit evolutions known as “gates,” and then finally measure all qubits. Any quantum computation can be expressed in this form through a suitable choice of gates, in a quantum analogy of the Boolean circuit model of conventional digital computation. More complex versions of quantum circuits can include features such as qudits, which are higher-dimensional quantum systems, as well as the ability to reset and measure qubits or qudits throughout the circuit. However, even the simplest form of the model can be used to emulate such behavior, making it fully sufficient to describe quantum information technology. It is possible to use the quantum circuit model to emulate other models of quantum computing, such as the adiabatic and measurement-based models, which formalize quantum algorithms in a very different way. As well as being a theoretical model to reason about quantum information technology, quantum circuits can also provide a blueprint for quantum hardware development. Corresponding hardware is based on the concept of building physical systems that can be controlled in the way required for qubits or qudits, including applying gates on them in sequence and performing measurements.
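
A minimal sketch of the “initialize, apply gates, measure” pattern described above: a two-gate circuit (Hadamard, then CNOT) simulated on a statevector with plain numpy. The specific circuit and library choice are illustrative, not taken from the article:

```python
import numpy as np

# Two-qubit circuit: H on qubit 0, then CNOT with qubit 0 as control,
# preparing the Bell state (|00> + |11>) / sqrt(2).

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0b00] = 1.0                 # qubits initialized in |00>
state = np.kron(H, I2) @ state    # gate layer 1: H on qubit 0
state = CNOT @ state              # gate layer 2: entangling two-qubit gate

# Final measurement of all qubits: Born-rule outcome probabilities.
probs = np.abs(state) ** 2
print(probs)  # [0.5, 0.0, 0.0, 0.5] -> outcomes 00 and 11, each with prob 1/2
```

Larger circuits follow the same pattern: tensor single-qubit gates up to the full register size, multiply the layers in sequence, and sample measurement outcomes from the final distribution.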

Article

Measurement-Based Quantum Computation  

Tzu-Chieh Wei

Measurement-based quantum computation is a framework of quantum computation in which entanglement is used as a resource and local measurements on qubits drive the computation. It originates from the one-way quantum computer of Raussendorf and Briegel, who introduced the so-called cluster state as the underlying entangled resource state and showed that any quantum circuit could be executed by performing only local measurements on individual qubits. The randomness in the measurement outcomes can be dealt with by adapting future measurement axes so that the computation is deterministic. Subsequent works have expanded the discussion of measurement-based quantum computation to various subjects, including the quantification of entanglement for such a measurement-based scheme, the search for other resource states beyond cluster states, and computational phases of matter. In addition, the measurement-based framework provides useful connections to the emergence of time ordering, computational complexity and classical spin models, blind quantum computation, and so on, and has given an alternative, resource-efficient approach to implementing the original linear-optical quantum computation of Knill, Laflamme, and Milburn. Cluster states and a few other resource states have been created experimentally in various physical systems, and the measurement-based approach offers a potential alternative to the standard circuit approach to realizing a practical quantum computer.
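
A minimal sketch of the measurement-driven primitive underlying the one-way model (a standard textbook construction, illustrated here with numpy): attaching an input qubit to a |+⟩ qubit with a controlled-Z gate produces a two-qubit cluster, and measuring the input in the X basis leaves the other qubit in X^m H|ψ⟩, where the random outcome m is compensated exactly as the adaptive corrections mentioned above:

```python
import numpy as np

# One-way-model primitive: entangle input |psi> with |+> via CZ, measure
# the input in the X basis; outcome m leaves the other qubit in X^m H |psi>,
# so a gate (H) is enacted purely by entanglement plus local measurement.

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
CZ = np.diag([1.0, 1.0, 1.0, -1.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)

psi = np.array([0.6, 0.8])           # arbitrary single-qubit input state
state = CZ @ np.kron(psi, plus)      # two-qubit cluster with input attached

for m, basis in enumerate([plus, minus]):
    # X-basis measurement of qubit 0 with outcome m: contract qubit 0
    # against the measured basis vector, then renormalize qubit 1.
    post = basis @ state.reshape(2, 2)
    post = post / np.linalg.norm(post)
    # Undo the known byproduct X^m (the adaptive correction); H|psi> remains.
    corrected = np.linalg.matrix_power(X, m) @ post
    print(m, np.allclose(corrected, H @ psi))   # True for both outcomes
```

Chaining such steps across a larger cluster, with later measurement bases adapted to earlier outcomes, is what makes the scheme deterministic and universal.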

Article

Egyptology and African History  

Juan C. Moreno García

Egyptology has played a rather ambiguous role in the study of the African past. While the Nile Valley was the cradle of one of the oldest states as well as of crucial innovations like writing, monumental architecture, and complex administrative managerial techniques, among others, the burden of Eurocentric historiographical prejudices led these achievements to be considered a sort of anomaly. Ancient Egypt was thus interpreted through the lens of an alleged “exceptionalism”—a geographically African but, quite self-contradictorily, culturally non-African society. Such a view was rooted in a too-literal reading of pharaonic texts and images that celebrated the differences between Egypt and its neighbors. At the same time, Egypt was seen as a remote precedent of Western culture and societies—a venerable instigator of an uninterrupted process of progress supposedly culminating in Europe in the 19th century. Only as of the late 20th century has archaeology helped Egyptology overcome such a view, understand the African roots of the pharaonic civilization, and review the nature of its relations with its African neighbors. At the same time, intense archaeological exploration of the African regions that surrounded Egypt has revealed the critical role of Nubian and desert populations in creating original forms of political power and cultural achievement that owed little or nothing to pharaonic Egypt. The result is the emergence of more balanced historical interpretations that emphasize the complex interplay between all these actors in the social dynamics of the Bronze and Iron Ages in northeastern Africa.

Article

Writing and Managing Multimodal Field Notes  

Fernando Hernández-Hernández and Juana M. Sancho-Gil

Researchers from various disciplines collect and generate field notes as a strategy for describing and reflecting on (through texts, photos, drawings, diagrams, or recordings) the complexity they face when addressing entangled and many-faceted phenomena. Field notes are a common research strategy, used not only to capture and amass instantly what researchers listen to, observe, think, and feel, but also to make explicit their reflexivity process, based on their observations and experiences. Field notes are not only a method for generating evidence but also a reflection of the ontological, epistemological, methodological, and ethical positionality that guides the researcher’s gaze. Paradoxically, although field notes are something most researchers use, and are fundamental to their reports and publications, they are generally the hidden and idiosyncratic side of academic fieldwork. The preparation of field notes is an extremely intricate issue, as the very meaning, purposes, and roles of field notes rely heavily on the ethnographer’s onto-epistemological positioning. It is useful, then, to contextualize field notes within the tradition of ethnography, without ignoring the fact that they are used in a wide range of disciplines (including anthropology, sociology, architecture, geography, ethology, archaeology, and biology). It is also important to problematize the practice of taking, collecting, and generating field notes by taking into account the fact that the traditional vision of field notes as written (alphabetic) notes is being challenged by the availability of mobile applications that enable researchers to create and organize multimodal information. It is important to note the relevance of the so-called “headnotes,” as there are many impressions, scenes, and experiences that cannot be written down or are difficult or impossible to document. In addition, the discussion goes beyond interaction by introducing the notion of intra-action to overcome the metaphysics of individualism underlying conventional understandings of “interactions.” The growing multiplicity of languages, modes, and means of expression and communication must be examined alongside the strengths and limitations of multimodal field notes. Finally, the practice of keeping field notes requires a recognition of the reflexivity embedded in this process. Field diaries can be seen as the first step toward ethnographic reporting, and here reflexivity becomes a fundamental part of the analyses involved.