Multiverse Theories: Philosophical and Religious Perspectives
Summary and Keywords
The term multiverse is derived from multiple universes. A multiverse is a theoretical concept denoting a collection of universes that are causally disconnected and whatever may exist beyond or between the boundaries of these universes. In essence, it is the totality of physical reality, whatever form that may take. An equivalent term is megaverse. The physically distinct universes composing a multiverse are often referred to as alternative, alternate, quantum, parallel, or bubble universes.
The American philosopher William James coined the specific term multiverse in 1895, not in a cosmological context but in reference to his view of the natural world. In the 20th century the application of the term was broadened from James’s original intent to a range of areas including cosmology, religion, philosophy, and psychology. More recently, David Lewis (1941–2001) considered philosophical implications of a multiverse from his modal realism perspective. In fact, the concept of a cosmological multiverse and its philosophical and religious implications had been considered more than a millennium earlier throughout various societies and religions. The scientific implications have predominantly been analyzed since the early 20th century.
In efforts to answer fundamental questions about the origin and properties of our universe, many cosmologists have converged on a scientific concept of a multiverse of one form or another. Multiverses with vastly different properties have been developed. To organize the collection of multiverses in a consistent way, in 2003, MIT cosmologist Max Tegmark proposed a multiverse taxonomy. Tegmark argued that all multiverses can be fit into four classes, which he designated as Levels 1 through 4. A given higher level multiverse contains a set or sets of lower level multiverses. In 2007, string theorist Brian Greene of Columbia University refined Tegmark’s classification system. Each of Greene’s nine classes fits within one of Tegmark’s four Levels.
While theoretical multiverses take many forms, common to nearly all of them is the idea that a vast number of universes exists outside the limits of our observable region. Given that a multiverse beyond our universe is not currently (and perhaps never will be) empirically testable or detectable, the multiverse concept is very controversial. This is especially so within and among the science, philosophy, and religion communities. There is disagreement regarding the question of the existence of the multiverse and whether the multiverse is a proper subject of scientific inquiry. Some argue that a multiverse is a philosophical concept, rather than a scientific one. Alternatively, some scientists believe that most or all multiverse proposals present a deconstructionist science that avoids providing scientifically meaningful answers. In contrast, many theoretical physicists (especially cosmologists) and some philosophers affirm that a multiverse offers a more likely and more robust resolution to fundamental cosmological issues that a sole (even infinite) universe cannot answer.
Multiverse—The General Concept
The longer humanity has observed and contemplated the physical realm surrounding us, the larger we have understood the physical realm to be. In broad strokes, our vision of the world has passed through five paradigms: the Mythocentric, the Geocentric, the Heliocentric, the Galactocentric, and currently the Universe-centric. Each paradigm presented a larger vision of reality. The concept of a vast cosmos has a long and developmental history, mingled in different forms throughout each of these stages. The proposals for a multiverse developed as a result of our growing understanding of the universe, and our simultaneous perplexity of its features.
In the distant past, the concept of a vast cosmos and possible multiplicity of universes was predominantly a philosophical or theological concept. It was envisioned in many ancient religious texts, such as the Apannaka Jatakas of Buddhism, the Bhagavata Purana of Hinduism, and the Kabbalah of mystical Judaism.
One of the first Western thinkers to consider the concept of a cosmological multiverse was Roger Bacon (c. 1200 ce). From his underlying assumption of the symmetry of space, Bacon concluded that our universe must be spatially spherical. This assumption inspired him to consider whether more than one spherical universe could exist and, if so, how these universes would be oriented with regard to each other. Bacon concluded that any two or more universes without a universal common center point would leave some volume “unoccupied,” creating a vacuum. Believing that nature abhors a vacuum, Bacon inferred that spherical universes should all be aligned around a center point. But given a common center point, the largest universe would contain all of the smaller, with each smaller being a sub-volume of the next larger. Therefore, Bacon concluded that only one overall universe could exist.
By the 1300s, the Roman Catholic Church began to consider the possibility that other worlds like earth could exist elsewhere in our one universe. St. Albert the Great (c. 1260) pondered if “there exist many worlds, or is there but a single world? This is one of the most noble and exalted questions in the study of Nature.”1 In 1277, the bishop of Paris issued a decree with papal authority that officially condemned the idea that “the First Cause (God) cannot make many worlds.”2 Cardinal Nicolas of Cusa (c. 1440) likewise speculated about many worlds. Dominican friar Giordano Bruno (1548–1600) further argued that our sun was but one among an infinite number of stars circled by life-supporting planets, within an infinite universe.3
The specific term multiverse did not appear until a few years before the turn of the 20th century. In an 1895 address to the Young Men’s Christian Association of Harvard University, William James constructed the term to express his views on natural evil and the chaos and disorder within the natural world. James remarked that “visible nature is all plasticity and indifference, a moral multiverse, as one might call it, and not a moral universe.” In its modern incarnation(s), the multiverse is primarily a scientific construction. Since the latter half of the 20th century, multiverse has been most commonly identified in the cosmological sense. A multiverse has come to imply either a hypothesized set of possible universes or the complete set of universes that may in fact actually exist (which necessarily includes our own universe). The number of universes in a proposed multiverse can be finite or infinite.
For many cosmologists, a multiverse is now understood as an inevitable (or, in the least, highly probable) outcome of the physical processes that are generally believed to have produced our universe. A multiverse explains the apparent fine-tuning of our universe for life and especially sentient life. Numerous theoretical physicists believe a multiverse is an inevitable result of quantum mechanical effects.
As a concept that offers possible scientific answers to these issues, a multiverse of one form or another has gained in popularity within segments of the scientific community and the populace at large. Physics Nobel Laureate Frank Wilczek describes the increasing acceptance within the science community of the multiverse concept in his comments regarding the growth of attendance at international multiverse conferences between 2001 and 2005:
The gathering had a defensive air. It prominently featured a number of physicists who subsisted on the fringes, voices in the wilderness who had for many years promoted strange arguments about conspiracies among fundamental constants and alternative universes. Their concerns and approaches seemed totally alien to the vanguard of theoretical physics, which was busy successfully constructing a unique and mathematically perfect universe. Now the vanguard has marched off to join the prophets in the wilderness.4
The increased focus on the multiverse concept sparked growing attention and debate within the scientific, philosophical, and theological communities. Multiverse supporter Bernard Carr and (generally) multiverse opponent George Ellis acknowledged in a published debate that
there is no doubt that [multiverse] raises deep conceptual issues. The problem is that scientific progress has not only changed our view of the universe, it has also changed our view of the nature of science itself. . . . The multiverse has explanatory values but [the question is] whether it should be regarded as legitimate science.5
At present, and possibly for all time, a multiverse is not empirically detectable, except for our observable universe component.
Multiverse hypotheses remain controversial within the scientific community. Disagreement abounds regarding whether a multiverse actually exists, and, even if it does, many question whether a multiverse is a proper subject for scientific inquiry. Supporters of the multiverse hypothesis within the sciences include Stephen Barr, Bernard Carr, Brian Greene, Stephen Hawking, Don Page, Bert Schellekens, John Schwarz, Leonard Susskind, Max Tegmark, Alex Vilenkin, Steven Weinberg, and Frank Wilczek, to (alphabetically) name a few. Philosopher Klaas Kraay is also a multiverse proponent.
In contrast, critics and skeptics include Paul Davies, David Gross, Lee Smolin, Paul Steinhardt, and Neil Turok. These scientists generally assign the multiverse question to the philosophical realm rather than to the scientific realm. Former director of the Stanford Linear Accelerator Burt Richter and mathematician Peter Woit regard multiverse proposals as deconstructionist science. Philosopher William Lane Craig is a staunch multiverse adversary, while philosopher Rodney Holder is generally skeptical.
Physicists George Ellis and Robert Mann and philosophers Robin Collins and John Leslie take a more nuanced view. The concerns of Ellis and Mann focus on issues regarding Level 3 (Everett) and Level 4 multiverses in particular. Collins does not object to the multiverse proposals but does not believe that they solve the problem of the ultra-fine-tuning we observe in this universe. His claim is that our universe appears fine-tuned for more specialized beings than just generic observers. Rather, for Collins it appears to be specifically fine-tuned for embodied conscious agents like ourselves who can significantly interact with each other, including doing science to understand the universe.6 Leslie sees theism and the multiverse as alternative explanations for the fine-tuning. He prefers a form of neo-Platonic axiarchism, according to which the universe comes about through its “ethical requiredness.”7
To understand how scientific progress has both changed humankind’s view of the universe and its view of the nature of science, the concept of multiverse is reviewed in its historical context. The four prior paradigms and the present paradigm of the physical world are summarized. Issues and unanswered questions within the present Universe-centric paradigm that suggest a transformation to a multiverse paradigm are then summarized. Different types of multiverses and their implications are explored. Scientific, philosophical, and religious controversies of each type are considered.
As both supporters and critics have come to realize, the Multiverse-centric paradigm has deep philosophical and theological implications. A full paradigm shift from a single universe to a multiverse could profoundly alter theological perceptions of many religions, especially the nature of the deity’s interaction with and within creation. The specific form of a multiverse, if determinable, could have profound implications for the nature of a divine being.
Prelude to a Multiverse
Mythocentric, Prehistory to 300 bce
From prehistory to the Greek era, the Mideast world was viewed through the imaginative lenses of myth and symbolism. Thus, during this era humanity was in the “Mythocentric” paradigm. Throughout the Mideast, reality was generally perceived as a type of three-tier structure. Center stage was the surface of the earth. Below this was envisioned the underworld of the dead (e.g., hades, hell, or sheol). A primeval ocean was believed to exist below the world of the dead. Upon this ocean the earth was believed to float and into this ocean the earth’s support pillars were thought to descend. Various levels of the heavens were imagined to exist far above the earth’s surface: the firmament of the stars and the sun and moon; the watery ocean of the heavens kept separated above by a cover; and beyond that the heaven of heavens, the realm of the divine.
Around 300 bce the development of a philosophical science in the contemplative Greek world brought about the first significant paradigm shift: a transformation from the Mythocentric paradigm to the Geocentric paradigm. The new paradigm was a result of a quasi-scientific understanding of physical reality. In the cosmology of the Geocentric paradigm, both the sun and the other known planets were believed to orbit about a spherical earth. This picture was based on observation, philosophical logic, and the beginnings of science. That the earth was spherical, rather than flat, was widely accepted by educated Greeks. The earth’s radius and circumference were even computed with remarkable accuracy by Eratosthenes of Cyrene (c. 276 bce–194 bce), a Greek mathematician, geographer, poet, astronomer, and music theorist. Using the earth’s curvature, Eratosthenes determined the circumference of the earth to be either 39,690 kilometers (an error of only 0.96 percent) or 46,620 kilometers (an error of 16.3 percent). (The result depends on Eratosthenes’ length for a stade: the Egyptian stade equaled 157.5 meters, while the more common Attic stade equaled 185 meters.) Along with introducing the concept of the leap day, Eratosthenes also accurately calculated the tilt of the earth’s axis and likely the distance from the earth to the sun.
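Eratosthenes’ figures can be reproduced in a few lines. The starting value of 252,000 stades is the figure commonly attributed to him (an assumption here, since the passage quotes only the kilometer results), and the modern equatorial circumference is used as the reference:

```python
# Eratosthenes' circumference result under both stade conventions.
stades = 252_000                      # commonly cited figure (assumed; not stated above)
egyptian_km = stades * 157.5 / 1000   # Egyptian stade: 157.5 m -> 39,690 km
attic_km = stades * 185.0 / 1000      # Attic stade: 185 m -> 46,620 km

true_km = 40_075                      # modern equatorial circumference, km
err_egyptian = abs(egyptian_km - true_km) / true_km * 100  # ~0.96 percent
err_attic = abs(attic_km - true_km) / true_km * 100        # ~16.3 percent
```

The two percentage errors quoted in the text fall out directly from the two stade lengths.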
A minority of Greeks even supported the idea that the earth orbited the sun, rather than vice versa. One of the earliest dissenters of the Geocentric picture was Aristarchus of Samos (c. 310 bce–230 bce), as quoted by Archimedes in his book The Sand Reckoner. Aristarchus, an astronomer and mathematician, was one of the earliest to suggest a Heliocentric system. He hypothesized that the earth rotated about the sun and placed the other planets in their correct order of distances from the sun. Aristarchus also argued that the distant stars were fixed in their locations.
But Aristarchus and the other Greek heliocentrists were essentially ignored, partially for scientific reasons as then perceived: one piece of evidence supporting the Geocentric picture of Aristotle and Ptolemy was the non-detection of parallax in the stars. Parallax is a simple geometrical method by which an object’s distance can be measured. For example, if one looks at an object first with one eye closed and then with the other, the object appears to move (unless it is perfectly centered between the eyes) because of the distance between one’s eyes. The farther away an object is, the less parallax it exhibits. Our brains use this mechanism to judge how far away an object is. For the Greeks, the lack of parallax in all of the stars implied the earth was not moving. The alternative possibility, that the stars were so distant as to make their parallax unobservable, was generally rejected (except by some, such as Aristarchus). The Geocentric paradigm lasted for over one and a half millennia, until replaced by the Heliocentric paradigm in the 17th century.
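The parallax method was later made quantitative (none of this machinery was available to the Greeks): a star with a parallax angle of p arcseconds lies 1/p parsecs away, where one parsec is about 3.26 light years. A minimal sketch, using the measured parallax of Proxima Centauri as the illustration:

```python
# Distance from parallax: d [parsecs] = 1 / p [arcseconds].
LY_PER_PARSEC = 3.26

def parallax_distance_ly(p_arcsec):
    """Distance in light years for a star with parallax angle p_arcsec."""
    return (1.0 / p_arcsec) * LY_PER_PARSEC

# Proxima Centauri's parallax of ~0.768 arcseconds puts it about 4.2 ly away,
# illustrating why stellar parallaxes were far too small for naked-eye astronomy.
d_proxima = parallax_distance_ly(0.768)
```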
The inception of the Heliocentric paradigm in Europe overlapped the last two centuries of the Geocentric paradigm. Nicolaus Copernicus (1473–1543) is generally credited with initiating the paradigm transformation through his De revolutionibus orbium coelestium, published in 1543 shortly before his death. In this work, Copernicus proposed a Heliocentric theory in mathematical form. (Nonetheless, in his Commentariolus of 1514 Copernicus had already stated the seven assumptions of his Heliocentric proposal, though without any related mathematics.) Pope Clement VII, Pope Paul III (to whom De revolutionibus was dedicated), and many Catholic cardinals were intrigued with Copernicus’s Heliocentric concept. Copernicus was invited to present his ideas in Rome in 1536. According to many historians, at that time the Copernican system was well received by the Vatican.
Another significant contributor to the Heliocentric revolution was astronomer Johannes Kepler (1571–1630), student of the Danish astronomer Tycho Brahe (1546–1601). Kepler is best remembered for the laws of planetary motion he presented in his works Astronomia Nova, Harmonices Mundi, and Epitome of Copernican Astronomy.8 Based on Brahe’s observations of the orbit of Mars, Kepler determined his three laws of planetary motion:
i. that the orbit of a planet is elliptical, with the sun at one of the two foci,
ii. that a planet sweeps out equal areas during equal periods of time, and
iii. that the square of a planet’s orbital period is proportional to the cube of the semi-major axis of its orbit.
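The third law is easy to verify numerically. With orbital periods in years and semi-major axes in astronomical units, the ratio T²/a³ is the same (approximately 1) for every planet orbiting the sun; the orbital values below are standard modern figures, not taken from the text:

```python
# Kepler's third law: T^2 / a^3 is the same constant for all bodies orbiting
# the same star (with a in astronomical units and T in years, that constant is 1).
planets = {
    "Earth":   (1.000, 1.000),    # (a in AU, T in years)
    "Mars":    (1.524, 1.881),
    "Jupiter": (5.203, 11.862),
}
ratios = {name: T**2 / a**3 for name, (a, T) in planets.items()}
# each ratio comes out within a fraction of a percent of 1,
# confirming the law across very different orbits
```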
Overlapping Kepler was Galileo Galilei (1564–1642), the “father of modern science.” Using a telescope of his own construction, in 1610 Galileo was the first to discover moons of another planet, specifically the four largest moons of Jupiter. Kepler endorsed Galileo’s observations with a publication in which he also speculated about their scientific meaning. Galileo was also the first in Europe to identify the Milky Way as a multitude of stars packed closely enough to appear from earth as nebulous clouds. Galileo actually proposed the first version of relativity: that the laws of physics are the same in any system moving at constant speed and direction, regardless of the actual speed or direction. That there is no absolute motion or absolute rest became the framework for Newton’s laws of motion and is fundamental to Einstein’s special theory of relativity.
Because of his telescope observations, Galileo became a vocal advocate for Copernicus’s heliocentric model as more than mere hypothesis, as evidenced in Galileo’s Letter to the Grand Duchess Christina. However, opposition to heliocentrism as anything more than hypothesis was simultaneously growing within the leadership of the Catholic Church. Contrary to the earlier understandings of Popes Clement VII and Paul III, both the Roman Inquisition of 1615 and Pope Paul V concluded that the heliocentric hypothesis was false and contrary to scripture. The Inquisition forbade Galileo from advocating the reality of heliocentrism. All writings, including De revolutionibus, claiming the Copernican system as anything more than hypothesis or useful model were placed on the Index of banned books. Nevertheless, the paradigm shift toward heliocentrism spread swiftly in the years that followed, particularly as Kepler’s Epitome of Copernican Astronomy grew in influence. By the 1660s the paradigm shift was complete and the learned public was informed of the evidence for heliocentrism in books such as Conversations on the Plurality of Worlds by Bernard de Fontenelle of France.
At the urging of astronomer and physicist Edmond Halley (1656–1742), Isaac Newton (1642–1727) published his landmark Philosophiae Naturalis Principia Mathematica in 1687. Therein, Newton explained all of Kepler’s laws in terms of his law of gravity and three laws of motion. His laws of motion state that:
i. An object either remains at rest or in constant motion in a straight line, unless acted on by an external force;
ii. The vector sum of all forces on an object is equal to the acceleration of an object multiplied by its mass; and,
iii. For every force applied by object-A to object-B, there is an equal but opposite force applied by object-B to object-A.
Newton determined that the force of gravity between two objects is proportional to the product of the masses of the two objects divided by the square of the distance between the centers of mass of the objects. Through his universal law of gravity, which acts identically in space and on earth, and his three force laws, Newton placed the Heliocentric paradigm on firm physical and mathematical foundations.
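In symbols, Newton’s law of gravity is F = G m₁m₂ / r². As an illustrative check with modern measured values (none of which appear in the text), the gravitational pull between the earth and the moon comes out to roughly 2 × 10²⁰ newtons:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
m_earth = 5.972e24    # mass of the earth, kg
m_moon = 7.342e22     # mass of the moon, kg
r = 3.844e8           # mean earth-moon distance, m

F = G * m_earth * m_moon / r**2   # force in newtons, on the order of 2e20 N
```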
Nevertheless, in a relatively short time, the Heliocentric picture was supplanted by the much grander Galactocentric picture. In fact, the idea that the Milky Way is a vast collection of stars predated the Galactocentric era itself. The Persian astronomer Abu Rayhan al-Biruni (973–1048) proposed the Milky Way to be a collection of nebulous stars. The astronomers Avempace (d. 1138) and Ibn Qayyim Al-Jawziyya (1292–1350) believed the components to be actual stars that appeared to us as a blurred image due to refraction of their light in the earth’s atmosphere.
In On Learned Ignorance, Cardinal Nicholas of Cusa (1401–1464) asked whether there was any reason to assert that our sun (or any other point in space) was the center of the universe. Cusa believed our sun was randomly located among a vast collection of stars. The proof that our galaxy was an actual collection of stars followed in 1610. In that year Galileo demonstrated telescopically that the Milky Way was genuinely composed of finely scattered actual stars and not just formed of cloudy patches of gas.
In a treatise in 1755, Immanuel Kant theorized that the Milky Way, with our solar system included, was a rotating body of a vast number of stars held together by gravitational forces, much like a solar system, but on a far larger scale. Kant realized that the resulting disk of stars would be viewed from our perspective inside the disk as a band in the sky. Kant further proposed that some of the nebulae visible in the sky might be separate galaxies themselves. He referred to our galaxy and the extragalactic nebulae as island universes.
In the 18th century, the observations of astronomers such as Sir William Herschel (1738–1822) revealed that our sun did not occupy a unique location in the Milky Way galaxy. Herschel studied binary (and multiple) star systems, discovering over 800 systems between 1774 and 1794. In his 1803 paper, “Account of the Changes That Have Happened, during the Last Twenty-Five Years, in the Relative Situation of Double-Stars: With an Investigation of the Cause to Which They Are Owing,” Herschel presented the orbital evidence to prove his conjecture that almost all of the binary and multiple systems he observed were locked in gravitational orbits as pairs.
Herschel discovered over 2,400 nebulae and presented his nebulae data in three catalogs: Catalogue of One Thousand New Nebulae and Clusters of Stars in 1786, Catalogue of a Second Thousand New Nebulae and Clusters of Stars in 1789, and Catalogue of 500 New Nebulae, Nebulous Stars, Planetary Nebulae, and Clusters of Stars: With Remarks on the Construction of the Heavens in 1802. Throughout his lifetime, Herschel suggested these nebulae were galaxies like our own Milky Way.9 Nevertheless, prior to Edwin Hubble’s 1920s observations, it remained the general consensus of astronomers that nebulae were hot gaseous objects interspersed between the stars of our Milky Way galaxy.
Between 1840 and 1930, significant advancements in astronomy enabled distances within the Milky Way to be more precisely calculated. By the beginning of the 20th century, a consensus had formed that the Milky Way was a spiral galaxy approximately 100,000 light years (ly) in diameter and 1,000 ly in thickness. The galaxy’s star population was estimated to be between 100 billion and 400 billion. The consensus number is now around that upper limit. From the collection of astronomical evidence, the Galactocentric paradigm quickly replaced the Heliocentric paradigm. (Thus, Einstein’s relativity was constructed during the Galactocentric paradigm.) The remaining issue of debate was primarily the location and nature of the nebulae.
By the mid-19th century, the Galactocentric paradigm was taking firm hold. However, evidence indicating this picture was still incomplete was already appearing. The nagging question of the nebulae continued. In 1845, William Parsons (1800–1867) constructed a new telescope through which he could distinguish more structure in some nebulae, observing some as elliptical and others as spiral-shaped. He also identified individual point sources of light in some nebulae, lending support to Kant’s earlier conjecture that the nebulae were separate galaxies. Seven decades later, astronomer Heber Curtis (1872–1942) observed an individual nova within the Great Andromeda nebula. Following this, Curtis discovered additional novae in around a dozen other photographs. These novae were at least 10 magnitudes fainter than those more commonly observed within the Milky Way galaxy, which implied that they were at least 500,000 ly away. Curtis concluded that the nebulae were island universes distinct from our own.
In the famous 1920 “Great Astronomy Debate,” astronomers Harlow Shapley and Heber Curtis sparred over whether distant nebulae were relatively small and lay within our Milky Way galaxy or whether they were large, independent galaxies. Shapley argued in favor of the Milky Way forming the entirety of the universe. He had determined that if the Andromeda nebula was outside of the Milky Way and was an independent galaxy of similar size, then it would necessarily be around 10⁸ ly away, too vast a distance for most astronomers at the time to accept. Curtis, however, believed this huge distance to be correct. Pointing out that the rate of observed novae in Andromeda exceeded that in the Milky Way as a whole, Curtis asked why, if the Andromeda nebula was within our galaxy, there should be more novae in that particular region of the Milky Way. His “lack of symmetry” argument lent credence to Andromeda as an individual galaxy.
This debate was conclusively ended a few years later by Edwin Hubble (1889–1953). Using the Mt. Wilson observatory’s 100-inch telescope, Hubble observed that the outer parts of some spiral nebulae were indeed formed of individual stars. Using a type of star known as a Cepheid variable as a standard candle, he determined the Andromeda nebula to be over 900,000 ly away, far too distant to be part of the Milky Way. (For Cepheid variables, the pulse rate is directly related to the star’s actual brightness. Since the observed brightness of a star decreases with the square of its distance, measuring the pulse rate and the observed brightness together allows determination of the star’s actual distance.) Hubble’s observations eventually implied the existence of billions of other galaxies. These findings convinced the scientific community of a multiplicity of galaxies, firmly establishing the dawn of the Universe-centric paradigm.
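The inverse-square reasoning behind the standard-candle method can be sketched with hypothetical numbers: if a Cepheid appears 10,000 times fainter than an identical one at a known distance, it must lie 100 times farther away:

```python
import math

# Standard-candle logic: observed brightness falls as 1/d^2, so for two stars
# of equal actual brightness, distance scales as 1/sqrt(observed flux).
d_reference = 1_000.0    # ly, distance of a reference Cepheid (hypothetical value)
flux_ratio = 1e-4        # the target appears 10,000x fainter than the reference

d_target = d_reference / math.sqrt(flux_ratio)   # 100x farther: 100,000 ly
```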
In 1920, Georges Lemaître, a Belgian, began training to be a Roman Catholic priest. At the School of the House of Saint Rombaut, Lemaître’s professors recognized his mathematics and physics skills and so directed Lemaître to study the papers of Albert Einstein in addition to the standard theological writings. To learn the principles of general relativity, Lemaître studied the papers of Arthur Eddington. In 1922, Lemaître received a Belgian government scholarship for his thesis on The Physics of Einstein. The next year he was ordained a Catholic priest. However, rather than following the normal course of a priest, he used the Belgian scholarship to work personally with Eddington. Eddington assigned Lemaître the Ph.D. task of applying the laws of general relativity to a universe treated as a sphere containing a gas that is both isotropic (that is, with properties that do not depend on direction) and homogeneous (that is, with the gas evenly distributed). Lemaître discovered two solutions for such a universe: the first was consistent with Einstein’s 1917 model of a closed, stable, static universe with constant mass-energy density; the second was consistent with Willem de Sitter’s 1917 model of a universe whose large-scale behavior is dominated by a cosmological constant (that is, the energy density of space itself).
After this success, Lemaître traveled to the United States to complete a Ph.D. at MIT. During his time in the United States, he interacted with other physicists including Hubble and Vesto Slipher. Lemaître returned to Belgium in 1925 to accept a faculty appointment at the Catholic University of Leuven. In 1927 he returned to MIT to defend his dissertation, The Gravitational Field in a Fluid Sphere of Uniform Invariant Density according to the Theory of Relativity. Soon after, Lemaître wrote a paper that claimed that the universe’s mass is constant, but its radius is increasing, thereby causing galaxies to move apart.10 Lemaître developed the argument in two steps. He first showed that Einstein’s general theory of relativity produced solutions consistent with an expanding universe. He then built on and synthesized the observations of three astronomers: Henrietta Leavitt’s Cepheid variable measurements of distances to close stars, Hubble’s measurements of distances to vastly further away galaxies, and Slipher’s redshift measurements of many of these same Cepheid variables.11
The faster a source of light moves away from an observer, the more the wavelength of the light from the source appears to the observer to be stretched. Correspondingly, visible light appears shifted more toward the red. Thus, measurement of red-shift allowed determination of the recessional velocity of the Cepheid variables. By combining all of the data, Lemaître showed that the farther away a galaxy was from our Milky Way galaxy, the faster it was receding from us. Lemaître proposed that space itself is expanding in all directions, rather than space being static and all other galaxies moving away from ours. He showed that spatial expansion can be expressed in the form v = H r, where v is the recession velocity of another galaxy relative to the Milky Way resulting from spatial expansion, H is a constant, and r is the distance of the other galaxy from the Milky Way. Lemaître calculated H to be around 625 km s⁻¹ Mpc⁻¹. Unfortunately, Lemaître’s paper had little impact, especially in the United States, because it was published in a journal not widely circulated outside Belgium.
Two years later, in 1929, Hubble published a similar paper in the United States deriving the v = H r relationship, based on his own synthesis of his Cepheid variable distance measurements with Slipher’s red-shift measurements.12 Hubble’s paper received substantially more attention.13 H soon became known as the Hubble constant (eventually renamed the Hubble parameter because H was later found to evolve slightly over time, with H₀ denoting the present-time value). More precise recent examination of the Hubble parameter proved Lemaître’s and Hubble’s measurements to be one order of magnitude too high. Current calculations yield H₀ = 67.66 ± 0.42 km s⁻¹ Mpc⁻¹ from Planck Mission data, 73.45 ± 1.66 km s⁻¹ Mpc⁻¹ from Hubble telescope data, and 72.5 ± 2.2 km s⁻¹ Mpc⁻¹ based on the H0LiCOW collaboration’s observation of multiply imaged quasars. (The disparity between the Planck calculation and the other two is receiving notable attention.)
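The v = H r relation itself is a one-line computation. Taking the Planck value of H₀ quoted above and a hypothetical galaxy 100 megaparsecs away:

```python
# Hubble-Lemaitre law: recession velocity v = H0 * r.
H0 = 67.66      # km/s per Mpc (Planck Mission value quoted above)
r = 100.0       # Mpc, hypothetical galaxy distance

v = H0 * r      # recession velocity in km/s, about 6,766 km/s
```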
Lemaître took part in a 1931 meeting of the British Association considering the relation between the physical universe and spirituality. At this conference Lemaître suggested that the universe had expanded from an initial point, referring to this event as “the Cosmic Egg exploding at the moment of the creation.” The same year Lemaître republished in English his 1927 paper in the Monthly Notices of the Royal Astronomical Society.14 To pay tribute to both Hubble and Lemaître for their fundamental contributions to the development of modern cosmology, the International Astronomical Union issued in August 2018 a resolution to rename the “Hubble law” as the “Hubble-Lemaître law.”
Over the 30 years that followed, the idea of the Big Bang became widely accepted within the scientific community. Nonetheless, a few scientists, including English astronomer Fred Hoyle, held out in support of a steady-state, eternally expanding universe. Lemaître’s proposal became better known as the Big Bang after Hoyle applied the term derisively on BBC radio. (At that time, the idea of a universe with a finite beginning was unacceptable to many on philosophical grounds.)15
During the following decades, physicists and cosmologists investigated the implications of these two conflicting alternatives: (i) an expanding universe with a finite lifetime, versus (ii) an eternally expanding universe with no origin in the finite past. Specific predictions of each hypothesis were determined. For example, an eternally expanding universe with roughly constant mass density over the cosmic scale necessitated that mass be continually generated from the vacuum energy of space-time. Then, ever so slowly, more galaxies would continually form from the generated mass, filling in the additional space produced through continued expansion. Hoyle argued this would preserve the appearance of the universe as presently seen, with galaxies always in view of one another yet moving apart, even after an infinite time of expansion.
A unique prediction of an expanding Big Bang universe is that the temperature T of a universe decreases as its size scale a increases, with T ~ 1/a. In the 1940s, physicists George Gamow, Ralph Alpher, and Robert Herman investigated the physics of such a universe. They realized there would be a time before which the universe would be too hot for atoms to be stable. Prior to that era, electrons would be too energetic to be captured by nuclei. Only after the universe cooled below a particular temperature could electrons be captured and atoms become stable. Preceding this, the universe would have been filled with a dense plasma of positively charged nuclei and negatively charged electrons. These charged particles would continually exchange photons (light) over extremely small (sub-millimeter) distances. Thus, light released by one charged particle would travel only a very short distance before being captured by another. A fictional being existing in the universe at that stage would not be able to see even a millimeter away in a given direction.
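The T ~ 1/a scaling lends itself to a quick back-of-the-envelope calculation. This short Python sketch, using only round numbers quoted in this article, shows how much the universe has stretched since atoms formed:

```python
# T ~ 1/a: temperature falls inversely with the universe's size scale, so
# T_then * a_then = T_now * a_now (approximately).  Using ~3000 K for the
# era when atoms became stable and ~2.73 K for the CMB measured today:

T_then = 3000.0   # K, temperature below which electrons bind into atoms
T_now = 2.73      # K, present CMB temperature

# The universe has grown by the inverse ratio of the temperatures.
stretch_factor = T_then / T_now
print(round(stretch_factor))  # ~1099: space has stretched about 1100-fold
```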
By roughly 380,000 years, our observable universe grew to a size at which the temperature dropped below 3000 K. Electrons no longer possessed the kinetic energy to remain free of nuclei, resulting in the formation of atoms. The previously exchanged photons were now free to travel vast distances before being absorbed by another particle. As the universe expanded, the photons continually redshifted to lower frequencies. Gamow, Alpher, and Herman spent several years working through the details of the related physical processes to determine the predicted present frequency and temperature of the “freed” (or decoupled) photons. Assuming a 3-billion-year-old universe, Gamow first estimated in 1948 that the photons should presently have a temperature of 50 K. Later that year, Alpher and Herman more accurately predicted the current temperature should be around 5 K. By 1953, Gamow also lowered his estimate, to around 7 K for a much older universe. In 1956, Gamow further reduced his prediction to 6 K. (Then in 1957, unreported to the Western world, the Russian physicist Tigran Shmaonov measured a microwave background, constant in time and direction, with a temperature of about 4 K.)
While working in 1964 on cryogenic microwave receivers for radio astronomy, Arno Penzias and Robert Wilson at Bell Labs detected a microwave background with corresponding temperature of approximately 3 K. After building their most sensitive antenna and receiver system, Penzias and Wilson continued to detect radio noise for which they could not identify a source. The two scientists contacted physicist Robert Dicke at nearby Princeton University to discuss their problem. Dicke suggested they may have detected the predicted cosmic background radiation of the photons freed when atoms became stable at 380,000 years. Penzias, Wilson, and Dicke agreed to publish side-by-side papers in the Astrophysical Journal, Penzias and Wilson writing about their observations and Dicke writing about his interpretation of the microwaves as the cosmic microwave background (CMB). Penzias and Wilson received the Nobel Prize in 1978 for their discovery of the smoking gun of the Big Bang theory. An expanding universe beginning from a Big Bang became the only tenable possibility.
Because it was contrary to the evidence, Hoyle’s steady-state theory was then dismissed by the overwhelming majority of scientists. While it is sometimes claimed that Hoyle never came to accept the Big Bang, in 1965 he publicly acknowledged that the microwave background implies that “the universe must have been different in the past from what it is today,” especially that it was of higher density. He also admitted that the steady-state theory “will now have to be discarded, at any rate in the form it has become widely known.” However, Hoyle continued to express dislike for a singularity at the beginning and continued to speculate about oscillatory universes.16
The properties of the CMB have been investigated since 1964. In the 1990s, the Cosmic Background Explorer (COBE) satellite verified the primary theoretical predictions of the CMB: (i) that the CMB should provide a near-perfect black-body spectrum defined by a particular temperature, and (ii) that small variations should appear in the spectrum as an indicator of the energy density variations that led to structure formation in the universe, specifically individual galaxies, galaxy clusters, and the vast regions devoid of galaxies. Predictions for the anisotropy of space-time were on the order of 1 part in 10⁴ or 10⁵. COBE is sometimes acknowledged as “the starting point for cosmology as a precision science.”
From the full COBE CMB sky map, the CMB radiation was found to fit a black body distribution of frequencies at a corresponding temperature of 2.73 K. Up to the precision of COBE, temperature anisotropies were also found at the expected ratios. The lead scientists of the COBE project, John Mather of NASA Goddard Space Flight Center and George Smoot of the University of California at Berkeley, were awarded the Nobel Prize in Physics in 2006 for the groundbreaking discoveries of COBE.
The Wilkinson Microwave Anisotropy Probe (WMAP), launched by NASA in 2001, was the successor to COBE. A primary goal of WMAP was to determine with greater precision the CMB anisotropies. WMAP images of the CMB were much cleaner than COBE’s and, indeed, verified anisotropies with an amplitude of 1 part in 10⁵. WMAP data raised some significant questions about the formation of the universe based on a few possible slight variations between measurements and predictions on the multipole moment intensity plot. While the overall uncertainties in the WMAP data points were smaller and the points closer to the theory curve than the corresponding COBE measurements, a few data points did not follow the theoretical plot as well as was expected. The angular scale at which the amplitude of the variations peaks in the full CMB plot was also surprising. To some cosmologists, these sets of data anomalies suggested the possibility of additional, as yet unknown physics in the early universe. For example, J.-P. Luminet and colleagues proposed this may imply a periodic finite universe.17
The Planck observatory, launched by the European Space Agency (ESA) in 2009, is the current successor to WMAP. In 2013, its all-sky map of the CMB was released. This sky map provided a more refined understanding of CMB anisotropies. Planck measurements of the total intensity and polarization of the CMB complemented and significantly improved on WMAP. It strongly constrained hypotheses regarding the earliest stages of the universe and the process for structure formation. Ripples in the CMB can be traced back to 10⁻³⁰ s after creation.
The Planck observatory is continuing to enhance our knowledge of the universe’s composition and its evolution. It is providing an excellent confirmation of the Standard Model of cosmology at an unprecedented accuracy, setting a new benchmark for our knowledge of the contents of the universe. However, the high precision of the Planck measurements continues to reveal some CMB anomalies. Among the most surprising are some fluctuations in the CMB over large scales that do not match those predicted by the Standard Model of cosmology. (The fluctuations are about 10 percent weaker than the best fit of the Standard Model to the data.) This anomaly supports similar ones observed by WMAP, including an apparent asymmetry in the average temperature on opposite hemispheres of the sky. One resulting hypothesis is that, on a larger distance scale than we can observe, the universe may not be the same in all directions. In this scenario, light rays from the CMB may have taken a more complicated route through the universe than previously understood.
Since the advent of the Universe-centric paradigm, scientists have striven to understand when and how the universe came to be, at least as far back as the Planck time scale of 10⁻⁴³ seconds and at least down to the Planck length scale of 10⁻³³ cm. Over the course of the 20th century, astronomers determined our observable universe contains over a trillion galaxies. Light observed from the oldest of these galaxies was emitted approximately 13.3 billion years ago.
As Einstein showed mathematically, space and time are inseparable quantities. Both space and time were brought about in the Big Bang/inflation (BBI) era, with current measurements of fundamental cosmological quantities implying the BBI event was 13.798 ± 0.037 billion years ago (corresponding to time t → 0 seconds). The length scale of the initial spatial volume that expanded to the current observable universe is uncertain. This scale depends on the particular quantum gravity theory assumed. Some quantum gravity models predict a Planck length scale around the Planck time; others predict a sub-Planck length scale.
The inflationary era is generally thought to occur around t ≈ 10⁻³³ s (or earlier for quantum gravity), lasting to t ≈ 10⁻³² seconds. During this era, the universe expanded exponentially with time, growing by a scale-factor of at least e⁶⁰ ≈ 10²⁶ (also known as 60 e-folds). By the end of inflation, the present observable universe reached about the size of a baseball.
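The quoted growth factor is a one-line conversion between e-folds and powers of ten; a minimal Python check:

```python
import math

# N e-folds means growth by a factor e^N.  In powers of ten that is
# e^N = 10^(N / ln 10), so 60 e-folds is about 10^26.

def efolds_to_decades(n_efolds):
    """Powers of ten corresponding to n_efolds of exponential growth."""
    return n_efolds / math.log(10)

print(efolds_to_decades(60))  # ~26.06, i.e. e^60 is roughly 10^26
```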
Following that extremely brief, highly exponential expansion era, the expansion (per unit length) of the universe has been much slower. For roughly the first 50,000 years (i.e., when radiation was the dominant form of energy), the universe grew as t^(1/2), with t the age of the universe. Following that, matter as the dominant form of energy resulted in the universe growing as t^(2/3). Until 1997, the universe was believed to still be in a matter-dominated era. In that year, the team of Adam Riess and Brian Schmidt and the team of Saul Perlmutter discovered that for the past 4 billion years the universe, including the observable volume, has been expanding exponentially as a result of dark energy. (Note though that the present rate of exponential growth is orders of magnitude below that during the initial, very brief, inflation era.)
Einstein’s general relativity shows that accelerating expansion of the universe is a result of dark energy (aka, vacuum energy or cosmological constant), the “amount of energy packed into a given volume of empty space.” That definition might seem like a contradiction. (How can there be energy in “nothing”?) Actually, there never is “absolutely nothing.” Ever since the mid-1970s, the area of physics known as quantum field theory has predicted that particle and anti-particle pairs pop into and out of existence together everywhere, continually. Effects have been observed that uniquely result from virtual particle production and annihilation. Related experimental measurements are in excellent agreement with theoretical predictions (that is, except for the severe disagreement between the theoretical prediction and experimental measurements for dark energy density itself). Dark energy is denoted either by the Greek letter Λ when it appears on the left-hand side of the Einstein equations or as ρ_DE when it appears on the right-hand side.
The observable universe is the spherical volume of the universe that can presently be observed from the earth. Because of the finite speed of light, any object more distant from us than light could travel over the age of the universe cannot yet be seen. The cosmological horizon (also known as the particle horizon) is the maximum distance light could have traveled to us in the age of the universe, that is, the outer shell of the observable universe defining the boundary between the currently observable and the currently unobservable regions of the universe.
Due to the expansion of the universe, the distance to the cosmological horizon is not simply the age of the universe multiplied by the speed of light. Instead it is the conformal time η multiplied by the speed of light c. The conformal time is the amount of time it would take a photon to travel to us from the furthest observable distance, taking into account the effect of the expansion rate of the universe. Thus, the cosmological horizon recedes from us as time passes and the conformal time grows. Calculations of the present η produce a cosmological horizon of about 47 billion ly.
The term observable universe is often confused with the term Hubble volume. Likewise, cosmological horizon is often confused with Hubble horizon. The Hubble volume is the region surrounding an observer in which all other objects are receding from the observer at a speed below that of light. The Hubble horizon is the outer spherical surface of the Hubble volume, on which objects are receding at exactly the speed of light; outside the Hubble horizon, objects recede faster than light. The radius of the Hubble volume equals the speed of light divided by the present value of the Hubble parameter, rH₀ = c/H₀. For the present value H₀, the Hubble radius is about 14 billion ly, slightly larger than the speed of light times the age of the universe. Objects on our Hubble horizon have a recession speed of exactly c. In an accelerating universe, H(t) tends toward a constant, so the proper Hubble radius rH(t) = c/H(t) approaches a fixed size while recession speeds keep growing; consequently, objects that we can currently observe will cross our Hubble horizon and eventually disappear from view. As defined for a specific observer, a Hubble volume will always be smaller than the corresponding observable universe.
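The two distances discussed here, the Hubble radius c/H₀ and the roughly 47-billion-ly cosmological horizon, can both be recovered with a short numerical integration. The Python sketch below assumes a flat universe with Planck-like density parameters; the values of the matter, dark energy, and radiation fractions are approximations introduced only for illustration:

```python
import math

# Comoving (particle) horizon: D = c * integral_0^1 da / (a^2 H(a)),
# with H(a) = H0 * sqrt(omega_r/a^4 + omega_m/a^3 + omega_l) for a flat
# universe.  Parameter values below are approximate, for illustration only.

H0 = 67.66                        # km/s/Mpc (Planck value quoted earlier)
omega_m, omega_l, omega_r = 0.31, 0.69, 9e-5
C_KM_S = 299792.458               # speed of light, km/s
MPC_TO_GLY = 3.2616e-3            # 1 Mpc in billions of light-years

def E(a):
    """Dimensionless Hubble rate H(a)/H0."""
    return math.sqrt(omega_r / a**4 + omega_m / a**3 + omega_l)

# Midpoint-rule integration from the Big Bang (a -> 0) to today (a = 1).
n = 100_000
integral = sum(1.0 / (((i + 0.5) / n) ** 2 * E((i + 0.5) / n)) for i in range(n)) / n

hubble_radius = (C_KM_S / H0) * MPC_TO_GLY   # ~14.5 Gly
horizon = hubble_radius * integral           # ~46-47 Gly
print(round(hubble_radius, 1), round(horizon, 1))
```

The factor of roughly 3.2 between the two distances is entirely an effect of expansion: the horizon is much farther away than light could travel in 13.8 billion years through a static space.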
Quantum field theory predicts that the dark energy density Λ should either be exactly zero or of order 1, in units of MPl⁴, with MPl denoting the Planck mass ~ 10¹⁹ GeV. These values are expressed in natural units, wherein the speed of light c, the reduced Planck constant ħ, and Newton’s constant GN are all set to 1. (The Planck mass is on the same scale as the mass of a particle of dust in the air.)
In 1987 Steven Weinberg showed that after the brief, initial inflationary era, a dark energy density larger than 10⁻¹¹⁹ MPl⁴ would have expanded our universe far too quickly for galaxies to have formed. (In his calculations, Weinberg kept all other relevant fundamental parameters constant at their present values.)18 Therefore, the existence of galaxies in our observable universe implied a dark energy density smaller than 10⁻¹¹⁹ MPl⁴. As a result of this strong constraint, it was generally assumed for the next decade that our universe did not possess any dark energy.
However, in the years preceding 1997, more precise measurements were recorded of intensities and redshifts of light from supernovae explosions in galaxies over a much broader range of distances from the Milky Way. The data surprised everyone: the expansion of the universe was found to have been accelerating for the last four billion years. In fact, the dark energy density is below Weinberg’s upper limit by a factor of only 10, Λ ~ 10⁻¹²⁰ MPl⁴. The 2011 Nobel Prize in Physics was awarded for this discovery. Dark energy composes roughly 68 percent of the total mass-energy in the universe. The dark energy density is constant everywhere within and beyond our observable universe. As space grows, the density of dark energy in the larger space remains the same as the density in the earlier, smaller volume.
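The scale of the mismatch between the quantum-field-theory expectation and the measured value is easy to make concrete; a one-line Python check of the arithmetic:

```python
import math

# Naive quantum field theory estimate: dark energy density of order 1
# (in units of the Planck mass to the fourth power).  Observation:
# about 1e-120 in the same units.

predicted = 1.0       # order-1 QFT expectation
observed = 1e-120     # value inferred from the accelerating expansion

mismatch_decades = math.log10(predicted / observed)
print(mismatch_decades)  # ~120 orders of magnitude
```

This 120-order-of-magnitude gap is why the dark energy density is often called the worst theoretical prediction in the history of physics.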
The unexpectedly small, but nevertheless non-zero, value of dark energy remains a profound mystery. Several hypotheses have been proposed, but so far none are truly convincing. Some scientists and philosophers suggest that the value of the cosmological constant is an anthropic effect of human existence (or any intelligent life we might imagine): atomic-based sentient life could not observe a dark energy density above Weinberg’s limit, because it could not exist in such a universe. (While this may seem like a trivial tautology, it has profound implications for science and philosophy.)
Another major scientific puzzle is dark matter, the existence of which was first proposed by Fritz Zwicky (1898–1974) at Caltech. From his 1933 study of the Coma galaxy cluster, Zwicky discovered that the observable mass in a cluster of galaxies accounted for only a small fraction of the total mass necessary to produce the gravitational attraction required to keep the cluster together.
Then three decades later, Vera Rubin, using the McDonald Observatory and then the Palomar Observatory, performed a study of the rotation of dozens of galaxies. Beginning with Andromeda, Rubin examined the rotation of spiral galaxies out to their outer edges. (This observational detail was possible thanks to a state-of-the-art image tube spectrograph designed by Kent Ford Jr.19) The study revealed a consistent pattern of flat rotation curves, meaning that the outer components of a galaxy were rotating as fast as the inner components. This implied that these spiral galaxies were individually surrounded by a spheroidal dark matter halo. Rubin’s measurements indicated that without the stabilizing gravitational effect of dark matter, the outer regions of these spiral galaxies would be rotating too quickly to be held together by the gravity of the observable matter alone. Over recent decades, dark matter distributions have also been mapped in several galaxies by supercomputer analysis of telescopic observations of gravitational lensing.
Dark matter generally forms a spherical or ellipsoidal shape around observable matter in a galaxy, with dark matter density falling off as 1/r² from a galactic center. The distribution of dark matter in a galaxy indicates that dark matter does not self-interact in the same manner as observable matter does. One possibility is that dark matter interacts through forces vastly different from the electromagnetic, weak nuclear, or strong nuclear forces. Because dark matter interacts much more weakly with ordinary matter, one candidate class is weakly interacting massive particles (WIMPs). Another possibility is a theorized exotic particle labeled the “axion,” which has much lower mass than ordinary matter.
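A short toy calculation shows why a 1/r² halo explains the flat rotation curves described above. For a density profile ρ(r) = ρ₀(r₀/r)², the mass enclosed within radius r grows linearly with r, so the circular speed v = √(GM(r)/r) is independent of radius. The Python sketch below uses a hypothetical halo normalization, chosen only to give a typical spiral-galaxy speed of about 220 km/s:

```python
import math

# Toy halo model (illustrative, not a fit to data): rho(r) = rho0 * (r0/r)^2.
# Enclosed mass M(r) = 4*pi*rho0*r0^2 * r, so v(r) = sqrt(G*M(r)/r) is flat.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO0_R0SQ = 5.8e19     # kg/m: hypothetical normalization rho0*r0^2,
                       # picked to give a typical spiral speed ~220 km/s

def circular_speed(r_m):
    """Circular orbital speed (m/s) at radius r_m (meters) in the toy halo."""
    m_enclosed = 4.0 * math.pi * RHO0_R0SQ * r_m
    return math.sqrt(G * m_enclosed / r_m)

KPC = 3.086e19  # one kiloparsec in meters
# Same speed at 10 kpc and 30 kpc: the radius dependence cancels out.
print(circular_speed(10 * KPC) / 1000, circular_speed(30 * KPC) / 1000)  # km/s
```

Observable matter alone, concentrated toward the galactic center, would instead give speeds falling as 1/√r, which is not what Rubin measured.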
Dark matter is referred to as “cold” or “hot,” essentially depending on its velocity through the universe. “Cold” dark matter travels slowly, whereas “hot” dark matter travels fast. Cold dark matter was non-relativistic at “freeze-out,” the point at which the interaction rate of the particle drops below the Hubble expansion rate as space expands. At freeze-out, reactions of these particles effectively cease. The dark matter models most consistent with the properties of the observable universe contain predominantly cold dark matter, with perhaps a small amount of hot dark matter.
A growing realization within the scientific, philosophical, and theological communities is that humanity’s perception of reality remains incomplete and limited. Given the patterns of scientific developments and paradigm advancements that have produced our current understanding of the universe, this should not be surprising. Many questions about our universe and its properties remain unanswered. Several independent pieces of evidence and lines of argument suggest much more exists beyond the observable universe. The BBI that produced our universe is itself such evidence.
Inflation was first proposed by Alan Guth in 1981 to resolve several issues and inconsistencies that standard Big Bang theory left unresolved.20 For instance, without inflation the Big Bang theory cannot explain the origin of the large-scale structure of the universe. Quantum fluctuations on the order of 1 part in 10⁵ (as evidenced in variations in the CMB), magnified to cosmic size through inflation, became the seeds for the growth of structures (galaxies, galaxy clusters, galaxy superclusters, and sheets of galaxy superclusters). If the universe had not grown super-luminally during the Big Bang era, significant temperature and structure variations should have been detected at scales larger than the horizon scale at matter-photon decoupling around 380,000 years. Without inflation, temperatures at such angular scales should not be correlated. Inflation explains why the universe appears so isotropic in all directions on the large scale, while also providing the 1 part in 10⁵ variation in the CMB from a quantum effect.
BBI also explains the observed flatness of the universe: Any initial positive or negative curvature (“bending” of space-time) becomes extremely stretched out by inflation such that any post-inflation space-time appears flat within the scale of the observable universe. In flat spaces, the sum of the inner angles of a triangle is 180°. In spaces of positive curvature, triangles “bulge,” with the sum of the inner angles greater than 180° (as on the surface of a globe). In spaces of negative curvature, triangles “pucker,” with the sum of the inner angles being less than 180°. All three types of curvatures provide solutions to the Einstein equations.
Extrapolation of the electromagnetic, weak nuclear, and strong nuclear forces from CERN’s current 10⁵ GeV to the drastically higher (earlier) 10¹⁶ GeV and higher energy scales generically predicts the existence of a pre-inflation era magnetic monopole particle. However, magnetic monopoles have never been detected. Alan Guth showed that a sufficiently long early inflation era provides an answer to the non-detection of magnetic monopoles. In the post-inflation era, monopole density would decrease to far less than one per present Hubble volume. Thus, it becomes improbable that any will ever be found.
The theorized particle responsible for inflation is appropriately referred to as the inflaton. Although this hypothesized particle has not yet been identified, it possesses many of the properties of the recently discovered, but long predicted, Higgs particle, which is responsible for giving mass to all fundamental particles. One feature common to inflationary potentials that produce an inflationary epoch is time and temperature dependence. For cosmological consistency, the initial inflation era must come to an end after a sufficiently high exponential expansion lasting no more than about 10⁻³² s. Present observational constraints require the minimum number of e-folds to be at least 60. The specific number of e-folds induced is extremely dependent on the form of the assumed inflaton potential.21 Critically, whatever its form, the inflaton potential must possess its own internal “shut-off” switch to halt inflation. A shut-off feature severely limits possible inflation modes. At inflation’s end, the inflaton decays into ordinary particles and induces a reheating process.
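The sensitivity of the e-fold count to the potential can be illustrated with a toy slow-roll model. The sketch below assumes the simple quadratic potential V = m²φ²/2 in reduced Planck units; this specific potential is an illustrative assumption, not the actual, unknown inflaton potential. Standard slow-roll analysis for it gives N ≈ (φ_start² − φ_end²)/4, with inflation shutting off near φ_end ≈ √2:

```python
import math

# Toy slow-roll estimate for V = (1/2) m^2 phi^2 (reduced Planck units,
# M_Pl = 1).  This choice of potential is an illustrative assumption only.
# N = (phi_start^2 - phi_end^2) / 4; inflation ends near phi_end = sqrt(2),
# the potential's built-in "shut-off".

def efolds(phi_start, phi_end=math.sqrt(2.0)):
    """Number of slow-roll e-folds for the toy quadratic potential."""
    return (phi_start ** 2 - phi_end ** 2) / 4.0

# Starting the field ~15.6 Planck units up the slope yields the ~60
# e-folds that observations require; a smaller start gives far fewer.
print(efolds(15.6))  # ~60.3
print(efolds(10.0))  # ~24.5
```

The quadratic dependence on the starting field value is one concrete example of why the induced number of e-folds varies so strongly from one assumed potential to another.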
Although much of the physics behind the inflation era remains a mystery, several required properties are understood. One is that the conditions necessary to produce a single universe also provide for multiple universes. Not just one universe, but (perhaps uncountably) many would pop into existence from a space-time vacuum with (near) Planck temperatures (of order 10³² K). Under the widest range of known physical conditions that can generate a single inflating universe that induces its own “shut-off” stage, the probability is near 1 that a vast collection of universes would be generated “simultaneously.”
Thus, a multiverse is a generic prediction when at least one universe comes into existence through BBI. A large pot of water reaching boiling temperature provides a good analogy that helps explain why. The key question to ask is “How many bubbles should one expect to appear in a pot of water at a temperature of 100°C?” The answer is that a pot of boiling water produces an ongoing stream of bubbles until the water has evaporated. A constantly refilled pot maintained at 100°C would never stop boiling. Never would just one bubble appear. A potential energy configuration of a Planck-temperature space-time that is able to produce at least one inflating universe containing a “shut-off” mechanism is like a pot of boiling water: it won’t stop “nucleating” universes until it runs out of space-time soup.
Properties of a Multiverse
If a multiverse exists, how should we picture our universe within it? Paul Davies addresses this in Cosmic Jackpot.22 Davies defines the observed universe as all of space and its contents as far out as our scientific instruments can currently probe. Not much larger in extent is the observable universe, everything within the horizon of what is visible from our location within the 13.8 billion ly limit. The observed universe and the observable universe nearly coincide. Our pocket universe is the region of space as far out as it resembles the observable universe today, acknowledging there may be other pocket universes with different sets of physical laws than those within our pocket universe. Davies’s multiverse is the collection of all pocket universes (possibly infinite in number) plus the space-time foam gaps theorized to exist between them.23
A hypothetical multiverse may possess a meta-law, defined as an overarching set of fundamental physical laws. A meta-law may be expressed as mathematical equations with multiple solutions. Individual pocket universes may be subject to different sets of local physical laws, with each set being a particular solution to the governing multiverse meta-law. The pre-1995 version of string theory, with its six compact dimensions of space, provides such a multiverse. It allows an estimated 100 trillion universes, when a universe is distinguished simply by its particular set of physical laws that is a solution to the underlying string theory equations. The generalized post-1995 version of string theory (aka M-theory), with its seven compact dimensions of space, allows an estimated 10¹⁰⁰ to 10¹⁰⁰⁰ universes with distinct sets of physical laws, all of which are solutions to the M-theory equations. Additionally, two universes within a multiverse may have identical sets of physical law solutions to the meta-law, with
i. different physical constants; or
ii. identical physical constants, but different histories from the beginning; or
iii. identical physical constants, and identical histories up to some particular time, but differing histories thereafter.
The highest probability is for two randomly picked universes in the multiverse to have differing local physical laws. The probability of common aspects between two universes is expected to decrease significantly with each move from i to iii, as the required matching features, including the length of shared history, increase.
It is a speculative concept that our universe is but one of many in a vast multiverse. Even so, in efforts to answer fundamental questions about the origin of the universe and about its properties, cosmologists and other theoretical physicists have converged on the idea of a multiverse of one form or another. While a proposed multiverse can indeed take different forms, the idea that a vast number of universes exist somewhere beyond the limits of our observable universe is “universal” to all of them.24
Nevertheless, a multiverse beyond our observable universe is currently undetectable by us. Many believe this may always be so. On the other hand, supporting empirical evidence (i.e., evidence obtained by observation, detection, or a physical experiment) for a multiverse would be provided by detection of
i. unexpected patterns in the CMB and/or in primordial gravitational waves best explained by interaction between our universe and another universe before, during, or very soon after the BBI of our universe;
ii. hypothetical particles known as Kaluza-Klein modes with certain mass patterns indicating the existence of compact dimensions; and/or
iii. variation of the force of gravity from its standard 1/r2 Newtonian form at a minimum distance scale below those yet tested (another predicted outcome of compact dimensions).
Both opponents and advocates of the multiverse proposal are limited by the same observational constraints. Both sides have only the observable universe as evidence. The controversy within the scientific community includes disagreement over whether a multiverse exists, and whether the multiverse concept is a proper subject of scientific inquiry. Some argue the concept belongs more in the realm of philosophy than science. Nevertheless, independent of one’s view of that, a fundamental question remains: “Given the known properties of our observable universe, is the existence of a multiverse more probable than the nonexistence of anything beyond our observable universe?”
Multiverse proponents seek to determine the more plausible of two hypotheses:
i. that all that exists is precisely that which we can observe or detect, with observation and detectability limited by the speed of light and the accelerating expansion of space, or
ii. that there likely exists physical reality beyond our observational and detectable limits.
Proponents argue that to claim that only our observable universe exists requires either a positivist/empiricist stance that nothing exists except what has been physically proven to exist, or the belief that the origin of our universe cannot be described in current scientific language.
In String Theory and the Scientific Method, Richard Dawid advances the role of non-empirical theory assessment in support of a (string theory) multiverse.25 Non-empirical theory assessment is based on logical analysis rather than empirical evidence. Dawid asserts that the definition of the scientific process has been evolving in recent decades, particularly in cosmology and high energy particle physics: the growing disparity between the energy and distance scales accessible to experiment and those relevant to current theories necessitates new research methods that shift toward non-empirical forms of theory verification. For Dawid, non-empirical theory assessment involves three key arguments:
i. the no alternatives argument (NAA);
ii. the argument of unexpected explanatory coherence (UEA); and
iii. the meta-inductive agreement from the success of other theories in the research program (MIA).
According to NAA, the likelihood that a given theoretical principle is true increases when no other explanations have currently been found nor are any expected to be found in the future. Applied to the multiverse, NAA yields the statement that a multiverse is implied by all known successful inflationary potentials. UEA contends that support for a given theoretical principle increases when the principle surprisingly provides a more coherent theoretical picture after the principle’s implications are more fully understood. In this case, UEA offers evidence for a multiverse based on the multiverse paradigm’s successful generalization of the Copernican principle that humans do not have a privileged view of the universe, or of reality as a whole, from within our particular finite observable universe.
MIA reasons that a theoretical principle is more likely to be true when the prior, more limited, theories that led up to it have already been strongly supported by their verified predictions. Hence, MIA asserts that, as time is followed backward toward t = 0, the ongoing success of the current cosmological model and the Standard Model of particle physics supplies further credence to a logically linked preceding multiverse that produces our universe through BBI. Dawid maintains that the combination of NAA, UEA, and MIA strongly constrains viable scientific explanations for the origin of our universe, imposing limitations on scientific theory under determinism. Helmut Satz also supports the role of non-empirical theory assessment for modern cosmology.26
In contrast, Peter Woit offers rebuttals to Dawid. He challenges both the scientific value of Dawid’s non-empirical approach and the specific applicability of NAA, UEA, and MIA.27 Woit emphasizes Dawid’s footnote acknowledgement “that physicists on both sides of the divide are aware of the slightly precarious character of the ‘non-physical’ arguments deployed in the debate.”28 Specifically regarding NAA, Woit writes that “the way science progresses is that there are always unsuccessful ideas with no good alternatives, until the day someone comes up with a better idea.” Denying the warrant of UEA, Woit believes that “you could make an equally good case for string theory unification [and the multiverse] becoming a more and more dubious idea as it became better understood.” As for MIA, Woit regards the history of the Standard Model and the history of the string theory multiverse as “two radically different subjects.”
Philosophers point out that an element of faith is required either to defend or to reject the multiverse hypothesis. One example element of faith (here opposing a multiverse) would be a particular belief that a direct act of God, superseding or modifying the physical laws within our physical reality, is necessary for creation of a universe. For theologians, an associated question is whether it is more likely the will of God that exactly one universe exists or that a multitude exists. There are also significant theological distinctions between a multiverse containing a finite number of universes; a multiverse containing an infinite, but countable, number of universes; and a multiverse containing an infinite, uncountable number of universes. A countably infinite set of universes is analogous to the set of all integers, while an uncountably infinite set of universes is analogous to the set of all real numbers. The former is a single infinity, while the latter is an infinity of infinities.
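The distinction drawn in the last two sentences can be stated precisely in Cantor’s notation (a standard result of set theory, added here for clarity):

```latex
% Countable vs. uncountable cardinalities
|\mathbb{Z}| \;=\; \aleph_0,
\qquad
|\mathbb{R}| \;=\; 2^{\aleph_0} \;>\; \aleph_0 .
```

Cantor’s theorem, $|S| < |\mathcal{P}(S)|$ for every set $S$, then yields an unending tower of ever-larger cardinalities, which is the sense in which an uncountable ensemble of universes is “an infinity of infinities.”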
If the existence of a multiverse is granted, the next question is “What are its properties?” In his groundbreaking 2003 “Parallel Universes” papers, cosmologist Max Tegmark proposed a taxonomy for categorizing multiverses.29 In Tegmark’s approach, multiverses range from Level 1 to Level 4, with lower levels embedded in higher levels:
Our Level 1 multiverse is our total universe: the observable universe and all the spatially connected regions beyond it that possess our electromagnetic, weak nuclear, and strong nuclear forces. Other proposed Level 1 multiverses could exist independently, being causally disconnected from our universe and from each other.
A Level 2 multiverse is defined by a specific meta-law, an overarching set of mathematical constraints, which has more than one set of physical laws as unique solutions. Each Level 2 multiverse contains a set, be it finite or infinite, of Level 1 multiverses.
A Level 3 multiverse, often referred to as an Everett multiverse, contains universes that branch at each event where options occur or choices are made. Each branch corresponds to a unique universe in which a specific option is realized in nature or a choice is made. The Hugh Everett school of thought argues that everything allowed by the laws of physics does happen somewhere and/or at some time. (In addition to the original Everett scheme, some events may require variation of the physical constants at universe splitting.)30
Level 4 is Tegmark’s highest level. Each Level 4 multiverse contains entire sets of Level 1, 2, and 3 multiverses based on a specific mathematical structure, mathematical object, or logic system.
Each level of multiverses has its own specific controversial features, with each higher level being more contentious than the lower. A particular feature of a given level of multiverse might be interpreted by multiverse proponents as rationale supporting the multiverse conjecture, while viewed by opponents as rationale against the conjecture.
Level 1 Multiverse: Regions Beyond Our Cosmic Horizon
A Level 1 multiverse is vast, possibly (but not necessarily) infinite, in spatial extent, while possessing the same underlying physical laws throughout. A Level 1 multiverse is the simplest cosmological model to consistently contain our observable universe. Our existence within a Level 1 multiverse (of likely infinite volume) is supported by the space-time flatness indicated by CMB measurements and by other data.
Our Level 1 multiverse is assumed to contain matter at roughly the same distribution throughout. Because matter can combine in only so many different configurations, some cosmologists, including Tegmark, believe that it stands to reason that other portions of our Level 1 multiverse exist that have elements in common with our observable universe. The implication is that somewhere “out there are regions of space” in our Level 1 multiverse that exactly mimic our own local region. For example, there is a non-zero probability that somewhere else in our Level 1 multiverse another planet exists where everything is unfolding exactly as on Earth.
From probability estimates, Tegmark predicts that a physical volume identical to our Hubble volume should be about 10^(10^118) m away.31 Based on the cosmological principle (which assumes that our Hubble volume is not special or unique), Tegmark believes an infinite number of Hubble volumes physically identical to ours should exist. However, probability predictions within infinite volumes are tricky: the probability that larger volumes of space exist with no events in common with any in our Hubble volume is greater than the probability of matching Hubble volumes existing.
According to Tegmark, at about 10^(10^91) m away, a sphere of radius 100 ly should exist that is identical to the one centered here. All perceptions that we have during the next century should then be identical to those of our doppelgängers there. However, since there are differences in the content and events of space outside each 100 ly sphere, observations of each local universe as viewed from the respective doppelgänger earths will begin to differ after 100 years.
Tegmark also estimates that the average distance between you and a doppelgänger identical to you in all ways up to this time in your life would be around 10^(10^28) m. However, given all possible choices, that person is more likely to stop reading this article at a different page and/or at a different time than you choose to. From then on, the doppelgänger walks a different path.
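The flavor of these estimates can be reproduced with a back-of-envelope calculation. The sketch below uses round inputs for illustration only (roughly 10^118 protons per Hubble volume, each “slot” occupied or empty); the double-exponential answer is what matters, not the precise figures.

```python
import math

# Assumed input (illustrative, not an exact figure): a Hubble volume
# contains at most ~10^118 protons.
log10_proton_count = 118

# Treating each proton slot as occupied or empty gives ~2**(10^118)
# distinct configurations. Working in double logarithms:
#   log10(log10(2**(10^118))) = 118 + log10(log10(2)) ~ 117.5
log10_log10_configs = log10_proton_count + math.log10(math.log10(2))

print(round(log10_log10_configs, 1))   # ~117.5

# If configurations repeat on average once per Hubble volume, an
# identical Hubble volume is expected within roughly 10^(10^118)
# Hubble radii, i.e., at a distance of order 10^(10^118) m.
```

The point of the double logarithm is that at these scales the distinction between counting configurations and counting meters disappears: 10^(10^117.5) Hubble radii and 10^(10^118) m are indistinguishable at this level of precision.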
From the assumption of infinite space-time and a sufficiently uniform distribution of matter on large scales, Tegmark suggests that even the most unlikely events must take place somewhere, given that they are not in violation of the underlying physical laws. Although the implications may seem crazy and counter-intuitive, Tegmark argues that “this spatially infinite cosmological model is the simplest and most popular one on the market today.”32 It is part of the cosmological concordance model, which agrees with all current observational evidence and is used as the basis for most cosmology calculations and simulations.33
Level 2 Multiverse: Collection of Level 1 Multiverses
Given a specific set of fundamental laws of physics (e.g., M-theory), different regions of causally independent space-time can exhibit different effective local laws of physics. Each set of local laws is a different solution to the overarching fundamental laws, the meta-law, of a Level 2 multiverse.
Each Level 1 multiverse within a Level 2 multiverse can possess different physical constants, particles, and space-time dimensionality. For example, it is estimated that a superstring-based Level 2 multiverse would contain between 10^100 and 10^1000 distinct Level 1 multiverses, each defined by a unique set of fundamental forces. A particular Level 2 multiverse can also contain sets of Level 1 multiverses with matching local laws of physics, but different physical constants or histories.
A Level 2 multiverse proffers an explanation for the apparently fine-tuned parameters in a specific life-yielding universe such as our own. This is a particular strength of Level 2 multiverses that individual Level 1 multiverses lack. The exception is a Level 1 multiverse that is timewise cyclical, passing through an infinite series of Big Bang inflations. Over time, a cyclical Level 1 multiverse could have a timewise diversity of free parameters parallel to the diversity a Level 2 multiverse has spatially. Thus, whether a cyclic multiverse classifies as Level 1 or Level 2 is contentious, with many claiming that a cyclical universe classifies as a Level 2 multiverse.
In chaotic inflation theory, the multiverse as a whole has been expanding since its beginning and will continue expanding forever. But in some regions of space, the expansion rate may slow down and form distinct bubbles undergoing a change of phase, like gas pockets forming in a boiling pot of water. Each such bubble is a Level 1 multiverse with its own physical laws. In other words, a Level 2 multiverse is a collection of Level 1 multiverses. One aspect where the boiling water bubble analogy falls short is that each Level 1 multiverse bubble within a Level 2 multiverse is itself infinite in extent. Further, an infinite number of such bubbles would form over an infinite amount of time because the chain reaction would never end.
Each Level 1 multiverse would be out of contact with all others. (The exception might be via black hole/white hole tunnels as Lee Smolin and others have proposed.)34 Given that each Level 1 multiverse is a distinct space-time from all others, even traveling at super-luminal speeds a person could not move from one Level 1 multiverse to another. Further, the equations of chaotic inflation indicate that the “space-time foam” region beyond Level 1 multiverses experiences continual inflation, thereby eternally creating more volume than could ever be traversed.
If the expected exponential growth of the number of Level 1 multiverse bubbles has been continuing forever into the past, there would be an uncountable infinity of Level 1 multiverses at present. However, such an eternal past for the bubbles may not be possible, according to a theorem by Arvind Borde, Alan Guth, and Alex Vilenkin. In a series of papers beginning in 2003, Borde, Guth, and Vilenkin (BGV) argue that any process undergoing gravitational inflation began a finite amount of time in the past.35 The BGV theorem states that if a multiverse (including the single Level 1 case) is at least on average everywhere expanding, then the histories of most particles cannot be extended to the infinite past. If one follows the trajectory of some particles backward in time, one inevitably comes to a point where the universe (multiverse) is no longer expanding.
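In schematic form (a paraphrase of the theorem’s content, not the authors’ exact formulation), the BGV result says that a positive average expansion rate along a past-directed world line forbids an eternal past:

```latex
% Schematic statement of the Borde-Guth-Vilenkin (BGV) theorem
H_{\mathrm{avg}} \;\equiv\; \frac{1}{\tau}\int_{0}^{\tau} H \, dt \;>\; 0
\quad\Longrightarrow\quad
\text{the geodesic is past-incomplete.}
```

That is, any space-time that is on average expanding ($H_{\mathrm{avg}} > 0$) contains world lines that can be traced back only a finite proper time.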
While the BGV theorem does not absolutely prove that a multiverse had a beginning, it does imply that the expansion stage of the multiverse was induced by a phase transition that occurred a finite amount of time in the past. (A phase transition occurs when the properties of a given medium change as a result of a change in some external condition. A well-known example of a phase transition is water transforming between solid, liquid, and gaseous forms under a combination of changes in temperature, pressure, and volume.) The multiverse phase transition would have been in the far distant past, perhaps hundreds of trillions of years ago. Assuming the validity of the BGV theorem, Linde and Vanchurin estimate the number of Level 1 multiverses to be on the scale of 10^(10^10,000,000) per Level 2 multiverse.36
The BGV theorem can be evaded if a multiverse experiences an overall contraction at a prior time. An example is the proposed cyclic ekpyrotic multiverse, with eternally repeated cycles of contraction and expansion.37 The “initial boundary” of a present multiverse would correspond to the time between the prior contraction phase and the current expansion phase, when the multiverse was momentarily static. However, a large number of cosmologists believe that the net entropy may increase in each expansion stage of a cyclic multiverse (or of a cyclic universe). An increased entropy in a given cycle would result in the next contraction phase stopping at a larger minimum distance than did the prior contraction phase. This too would require a phase change very early in a multiverse, initiating the cyclic process with a low entropy value.
Roger Penrose does not agree with this general consensus. To explain why our universe began in such a low-entropy, highly ordered state, Penrose and V. G. Gurzadyan propose a model in which entropy is removed at the end of each cycle. In their conformal cyclic cosmology (CCC) model, Penrose and Gurzadyan posit that our universe is only one in a series they refer to as aeons.38 In the CCC model, our universe began in and will return to a state of low entropy as it approaches its “final days of expanding into eventual nothingness, leaving behind a cold, dark, featureless, abyss.” They suggest that the black holes in our universe spend their cosmic lifetimes “working to scrub entropy from the universe.” By destroying everything they touch and then leaking radiation, black holes are credited with acting as entropy “vacuum cleaners.” Penrose and Gurzadyan predict that as the universe nears the end of its current expansion, the remaining black holes will evaporate or gobble each other up, setting things back into a state of order. The universe would begin to revert back into a “pre–Big Bang state,” as a highly ordered system ready to trigger the next BBI. The remote future of one aeon thus becomes the BBI of the next aeon.39 Penrose and Gurzadyan claim that circular patterns in the CMB would provide evidence supporting the CCC-type model, but no such patterns have been identified.
The consensus of the physics community is that the laws of our universe at its present low temperature are a manifestation of symmetry breaking that occurred when the universe was much hotter early in its history. The type of symmetry breaking experienced in a Level 1 multiverse is associated with the inflaton and other particles, called moduli, that determine the characteristics of a universe. As the Level 1 multiverse cools, the moduli fall into a specific set of (meta-stable) potential energy minima (also known as vacuum states). The vacuum states determine the general features of that multiverse. For example, in a string/M multiverse, the values of the moduli determine the number of the spatial directions that grow large and the geometry of the spatial directions that remain compact. The approximately twenty physical parameters in our universe, which include the masses and mass ratios of the fundamental particles and the dark energy density, are also determined by such minima. The set of minima of the moduli varies between Level 1 multiverses in a Level 2 multiverse. Thus, a Level 2 multiverse can contain Level 1 domains where not only the initial conditions differ, but the fundamental properties differ as well.
Level 1 multiverses within a string/M Level 2 multiverse would have dark energy density varying from −M_Pl^4 to +M_Pl^4. Those with a negative dark energy density, defined as anti-de Sitter space-times, undergo accelerated contraction following initial inflation. Typical anti-de Sitter multiverses have very short lifetimes before re-compaction. Those with a positive dark energy density, referred to as de Sitter space-times, undergo accelerating expansion following inflation. Those with zero cosmological constant, referred to as Minkowski space-times, do not undergo any acceleration following inflation.
The possible values of dark energy density in string/M multiverses are discrete, but in very small, nearly continuous units referred to as a discretum. The unit gap between allowed values can be as small as, or smaller than, what is required to produce the measured dark energy density of our observable universe. The larger a positive dark energy density is in a universe, the faster that universe may nucleate new universes with slightly lower dark energy densities. Through universe nucleation, a string/M-multiverse implies physical creation that is likely never-ending.
The closer the dark energy density of a universe is to zero, the more stable that universe is and the slower it would nucleate. Thus, our universe, with its infinitesimal 10^−120 M_Pl^4 dark energy density, is more stable against nucleation than are universes with larger dark energy densities. Tegmark provides numerous additional examples of fine tuning within our universe that suggest an ensemble of Level 1 multiverses within a Level 2 multiverse.40 A Level 2 multiverse is clearly more diverse than an individual Level 1 multiverse contained within it. In contrast to an individual Level 1 multiverse, a Level 2 multiverse can provide a scientific explanation for apparent fine-tuning in our observable universe.
Relatedly (or perhaps equivalently), the embedding of our Level 1 multiverse within a Level 2 multiverse provides a scientific rationale for the anthropic principle. The concept of an anthropic principle was expressed by biologist Alfred Russel Wallace. In 1904, he stated that “such (a) vast and complex universe as that which we know exists around us, may have been absolutely required . . . in order to produce a world that should be precisely adapted in every detail for the orderly development of life culminating in man.”41 In the course of the following decades, the evidence of an evolving and expanding universe produced ongoing speculation among leading physicists, such as Hermann Weyl, P. A. M. Dirac, Arthur Eddington, and Robert Dicke, regarding the role of the physical constants in the appearance of life.42
The anthropic principle was more formally proposed by Brandon Carter, Christopher Collins and Stephen Hawking, and Bernard Carr and Martin Rees.43 At a Princeton conference in 1970 and at an IAU symposium at Cracow in 1973, Carter presented a systematic investigation of relationships between physical constants and astronomical and physical phenomena. Carter made an anthropic link to the values of the physical constants in the context of an ensemble of universes. The publication of his talks was contemporaneous with a paper by Collins and Hawking that focused on certain classes of universes that would be suitable for the development of life. The physics-based approach of these cosmologists raised wider scientific interest in the anthropic principle as an explanatory proposal in the multiverse context. This led to the development of the strong anthropic principle (SAP) and the weak anthropic principle (WAP), both of which can be expressed in either statement form or predictive form.44 Carter’s predictive version of the WAP is
We must be prepared to take account of the fact that our location in the universe is necessarily privileged to the extent of being compatible with our existence as observers.45
Carter’s corresponding SAP declared that
The universe (and hence the fundamental parameters on which it depends) must be such as to admit the creation of observers within it at some stage.46

In that predictive version, Carter assumed the existence of observers and from there made deductions. He then concluded that it is of course philosophically possible—as a last resort, when no stronger physical argument is available—to promote a prediction based on the strong anthropic principle to the status of an explanation by thinking in terms of a ‘world ensemble.’47
Thus, Carter argued that our existence as observers demanded a selection effect taking place in the context of a multiverse.48 Biofriendly universes are selected over the non-observer-generating universes, thereby explaining why the nature of physics in such a universe is friendly to life.49 Carr and Rees further explored an anthropic principle from that standpoint. They showed that the basic features of the universe are essentially determined by a few microphysical constants and gravity. Many interrelations between different scales that seem surprising were shown to be straightforward consequences of simple physical arguments. Nevertheless, Carr and Rees identified several aspects of our universe, some of which seem to be prerequisites for the evolution of any form of life, that depend rather delicately on apparent “coincidences” among the physical constants. John Barrow and Frank Tipler raised public interest in the anthropic principle through their encyclopedic examination of it.50
Robin Collins and John Leslie claim these arguments have a limit.51 They argue we appear not to be just generic observers; rather, that we seem to be very special observers. They believe that the universe is more fine-tuned than necessary for our existence or for any other observer. They disagree with a claim that such hyper fine-tuning can be removed by an invocation of a Level 2 multiverse. Leslie judges this to be logical support for theism.
Level 3 Multiverse
A Level 3 multiverse is composed of Level 2 and Level 1 multiverses wherein each Level 1 multiverse continually branches. Branching occurs when differing outcomes are possible for a given situation, including when choices are made by a living being. Another appellation for a Level 3 multiverse is “Everett” multiverse, in honor of mathematician and physicist Hugh Everett III (1930–1982).
Everett proposed a many worlds interpretation (MWI) of quantum mechanics, wherein all possible independent outcomes actually occur. The Hugh Everett school of thought goes further to argue that through the branching process everything allowed by the laws of physics does happen somewhere and/or sometime. Some such events may require early universe branching to provide variation of the physical constants.52
The MWI is one of several mainstream interpretations of quantum mechanics. The MWI states that unique quantum outcomes, technically referred to as collapse of a wave function, cause the branching of one universe into several, with each different possible outcome observed only in its own branch. For example, a universe in which a coin is tossed splits into two alternative universes. In one branch everyone observes the coin landing heads-up. In the other branch everyone observes the coin landing tails-up. More generally, N different independent possible results cause a single multiverse to branch into N different multiverses. Each branch becomes distinct and unreachable by all others, assuming that backward time travel in a given Level 1 multiverse is not possible.
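The branch counting described above is purely combinatorial, and can be illustrated with a few lines of code (an illustration of the counting only, not of any quantum dynamics; the function name is ours):

```python
from itertools import product

# In the MWI picture sketched above, each independent event with N
# possible outcomes multiplies the number of branches by N.
def enumerate_branches(outcomes_per_event, num_events):
    """Enumerate all distinct outcome histories after num_events events."""
    return list(product(range(outcomes_per_event), repeat=num_events))

coin_branches = enumerate_branches(2, 3)   # three coin tosses
die_branches = enumerate_branches(6, 2)    # two die throws
print(len(coin_branches))                  # 2**3 = 8 branches
print(len(die_branches))                   # 6**2 = 36 branches
```

Each tuple in the returned list corresponds to one branch, i.e., one complete history of outcomes observed in that branch.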
Theoretical support for a Level 3 multiverse might be provided by quantum gravity and black holes. An underlying assumption for the existence of Level 3 multiverses is physical unitarity, that is, that physical interactions keep probabilities well defined. Another way of describing unitarity is in terms of information. If a system is unitary, no information is ever lost. If a system is not unitary, information will be lost. Collapse of a wave function into just one possible outcome does not conserve probability. In contrast, simultaneous branching into all possible outcomes does conserve probability.
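The contrast between unitary evolution and collapse can be made concrete with a toy two-state system (a sketch of the probability-conservation point above, not a physical model; the function names are ours):

```python
import math

def total_probability(state):
    """Sum of squared amplitude magnitudes (should be 1 for a valid state)."""
    return sum(abs(amp) ** 2 for amp in state)

def hadamard(state):
    """A unitary rotation of the two amplitudes: norm is preserved."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def collapse_to_first(state):
    """Non-unitary projection: discard the second outcome's amplitude."""
    return (state[0], 0.0)

psi = (0.6, 0.8)                                  # 0.36 + 0.64 = 1
print(total_probability(hadamard(psi)))           # ~1.0 : conserved
print(total_probability(collapse_to_first(psi)))  # ~0.36 : probability lost
```

Branching into all outcomes corresponds to keeping the full unitary state; collapse corresponds to the projection, which conserves probability only if the state is renormalized by hand afterward.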
Black holes are non-unitary in a single Level 1 multiverse if they remove information from the rest of the Level 1 multiverse through the permanent capture of matter and energy. However, if the information captured by black holes is eventually returned in some form or other to the rest of the universe, then black holes are unitary. Since the turn of the 21st century, developments in string/M-theory have led to a growing agreement that black holes are most likely unitary. These theoretical advancements convinced even Stephen Hawking, the first to propose black hole information loss, to conclude that black hole “elementary quantum gravity interactions do not lose information.”53 More recently though, Hawking narrowed his new view a bit with the caveat that although “information can be recovered in principle, . . . it is lost for all practical purposes.”54 However, in his final paper, co-authored with Malcolm Perry and Andrew Strominger, Hawking fully re-affirmed the view that no information is lost because complete information about the quantum state of a black hole may be “stored in a holographic plate at the future boundary of the horizon.”55
Surprisingly, a Level 3 branching multiverse does not contain any more possibilities in a Hubble volume than does a Level 1 multiverse in a vast collection of Hubble volumes. In effect, all the different “multiverses” created by “splits” in a Level 3 multiverse can be found in collections of Hubble volumes in a Level 1 multiverse. Tegmark recognizes that “the only difference between Level I and Level 3 is where your doppelgängers reside. In Level I they live elsewhere in good old three-dimensional space. In Level 3 they live on another quantum branch in infinite-dimensional Hilbert space.”56
All Level 1 multiverses embedded with different physical constants in a Level 2 multiverse can be found in a Level 3 multiverse as “universes” created by “splits” at the moment of spontaneous symmetry breaking. The distribution of outcomes in a given Hubble volume in a Level 3 multiverse is identical to the distribution obtained by sampling different Hubble volumes within a Level 1 multiverse. As a result of unitarity, if physical constants and space-time dimensionality vary in Level 2, then they will vary equivalently between the quantum branches at Level 3.
There is one significant difference between these multiverse levels. As stated, in a Level 3 multiverse there is no possibility of traveling from one branch to another unless reverse time travel is permitted. Alternate realities in a Level 3 multiverse might overlap only for a short amount of time and then decohere. They are then permanently separated when the system is observed. (In the widely held Copenhagen interpretation of quantum mechanics, decoherence is interpreted as “collapse of the wave function” and only one outcome is claimed.) In the case of a thrown die, the six branching realities remain merged from the throw of the die to identification of the top face by an observer. If a die were thrown into a dark isolated vacuum and the outcome unknown, the six-branched universe would remain in an overlapping mixed state until the die interacted with an external object or field. But once the throw of the die becomes known, the six unique universes become permanently separated. After that, no one could travel from one branch to another. So, a person could never meet his or her doppelgängers in different branches of a Level 3 multiverse. In contrast, it is at least theoretically possible for a person to travel between regions in a Level 1 multiverse to meet a doppelgänger (e.g., by traveling through a wormhole).
Level 4 Multiverse
A Level 4 multiverse is the multiverse of multiverses. The collection of Level 4 multiverses is defined as the complete set of multiverses formed of all consistent mathematical structures. It is claimed by Tegmark to be the “ultimate ensemble” of multiverses. The assumption behind a Level 4 multiverse is an isomorphism between mathematical existence and physical existence. The validity of this assumption was challenged by Davies.57 This isomorphism rests on two claims: (i) that the physical world is a mathematical structure and (ii) that all mathematical structures exist “out there.” This level considers all universes that can be described by different mathematical structures as equally real. Tegmark defines a mathematical structure to have physical existence if any self-aware substructure (SAS) within the mathematical structure perceives itself as living in a physically real world. Tegmark suggests that all mathematical structures exist physically as well. If this proposal is correct, all properties of multiverses could in principle be derived by an infinitely intelligent mathematician.58
Level 4 is a catch-all for mathematical structures that we can conceive of but cannot observe as physical realities in our universe. The different Level 4 multiverses are each governed by different master equations. Unlike Level 2 multiverses, they are not just different manifestations of the same fundamental rules (e.g., of string/M-theory) but entirely different sets of rules. That is, we imagine Level 4 to contain a Level 2 string multiverse, a Level 2 loop quantum gravity multiverse, a Level 2 Horava-Lifshitz gravity multiverse, and so forth, as just a few of the infinitely many possible mathematical structures.
The assertion that the physical world is a mathematical structure is connected with Eugene Wigner’s observation that “the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious (for) there is no rational explanation for it.”59 Such an observation naturally occurs when the physical universe is a representation of a mathematical structure. For example, a “theory of everything” can naturally be expressed as a mathematical structure.
The Level 4 multiverse is claimed by some to answer John Wheeler’s long-standing question of “Why these particular equations, not others?” The proffered resolution is that all conceivable multiverse structures exist at Level 4. Any pair of Level 4 multiverses would be completely distinct from each other because their respective mathematical structures would be mutually inconsistent. From a strictly scientific and probabilistic approach, some argue that a Level 4 may be required for our universe to exist, because all mathematical structures may need to exist to guarantee that any particular mathematical structure exists. Level 4 is generally viewed as the highest multiverse level, thereby bringing closure to the multiverse hierarchy. (The exception would be a multiverse not based on a mathematical structure.)
An example of a Level 4 multiverse according to Greene is a computer-simulated universe. Some physicists, philosophers, and computer scientists suggest that technology will advance to the stage when computers could simulate each and every detail of a universe. The computers could then create a simulated multiverse whose reality is (nearly) as complex as our own. Objects within the program may falsely believe themselves to be real, independent beings even though they are actually computer functions. This simulation concept was first proposed by Nick Bostrom.60 Such simulations would likely not be perfectly consistent and might be identifiable from within by occasional unexplainable “glitches” in their fundamental laws of physics or inconsistencies in their history.61
Another of Greene’s examples is a holographic universe.62 According to the holographic principle, a universe physically equivalent to ours could exist on a distant boundary surface of one less spatial dimension in which everything about our universe is precisely mirrored. That is, our universe of 3+1 space-time dimensions could be equivalently represented by a 2+1 universe. The term “holographic universe” originates with the ability of two-dimensional holograms produced by laser interference effects to fully contain all of the information needed to portray three-dimensional images. For holographic partner universes, the higher dimensional universe contains gravity while the equivalent lower dimensional universe does not. Instead, the role of gravity is played by a force more like electromagnetism. The physical equivalence between such universes, first theorized in Juan Maldacena’s “The Large N Limit of Superconformal Field Theories and Supergravity,” was demonstrated for a special case in Yoshifumi Hyakutake’s “Quantum Near-Horizon Geometry” and Masanori Hanada’s “Holographic Description of Quantum Black Hole.”63
Level 4 is the most controversial of the multiverse categories. Jürgen Schmidhuber argues that the set of all mathematical structures is not well-defined and should be limited to universes describable by computer programs.64 Even strong Level 3 multiverse proponents like Don Page question the reality of a Level 4 multiverse. Page asks whether the broadest definition of a Level 4 multiverse is too general to be plausible. He doubts that all logical possibilities could actually exist or that all mathematical structures could have physical reality.65 Further, Robert Mann argues that the Level 4 multiverse does not explain why our universe has the order that it does.66 Page and Mann expect that a universe randomly picked from all logical possibilities or mathematical structures would surely be far more chaotic than we find our universe to be. Page seeks a more logical version of Level 4 that is significantly more explanatory in nature and that arises naturally out of ultra-elegant laws of nature. Tegmark has proposed limiting Level 4 multiverses to only those based on mathematical structures of computable functions.67 Schmidhuber supports the more restricted ensemble of only quickly computable universes.
Multiverse Scientific, Philosophical, and Religious Controversies
Good Scientific Theory?
For most of the last century, a good scientific hypothesis or theory has been defined as one that is testable, meaning that it is possible in principle to disprove: it should be falsifiable. Because multiverses are (predominantly) undetectable beyond our observable universe, multiverse proposals are viewed by opponents as untestable and unfalsifiable, blurring the line between science and philosophy. Thus, opponents of the multiverse concept frequently claim that it is not good science. This is a fundamental reason the multiverse is a controversial topic within the scientific community.
Support for the existence of a multiverse of some type is found predominantly among theoretical physicists and cosmologists; opposition is found predominantly among experimentalists. Burton Richter has written that
what passes for the most advanced theory in particle physics these days is not really science. . . . [The multiverse proposal] looks to be more theological speculation, the development of models with no testable consequences, than it is the development of practical knowledge, the development of models with testable and falsifiable consequences (Karl Popper's definition of science).68
Some theorists have also sided with such experimentalists. Cosmologist Paul Steinhardt calls the multiverse a dangerous idea that he is unwilling to contemplate. In a 2003 New York Times opinion piece, “A Brief History of the Multiverse,” Paul Davies also expresses the view that multiverse theories are nonscientific:
How is the existence of the other universes to be tested? To be sure, all cosmologists accept that there are some regions of the universe that lie beyond the reach of our telescopes, but somewhere on the slippery slope between that and the idea that there are an infinite number of universes, credibility reaches a limit. As one slips down that slope, more and more must be accepted on faith, and less and less is open to scientific verification. Extreme multiverse explanations are therefore reminiscent of theological discussions. Indeed, invoking an infinity of unseen universes to explain the unusual features of the one we do see is just as ad hoc as invoking an unseen Creator. The multiverse theory may be dressed up in scientific language, but in essence it requires the same leap of faith.69
Tegmark and Page expressly disagree with these claims. Instead, they emphasize that unobservable aspects of a theory do not inherently make a theory untestable.70 For Tegmark the key question is not whether parallel universes in a multiverse exist, but rather how many levels a multiverse has. Tegmark and Page have separately shown that each of the four levels of multiverses have different sets of supporting evidence. Proofs have been developed that each level and type of multiverse can be tested and falsified statistically if the multiverse proposal can
i. predict what its ensemble of universes is, and
ii. specify a probability distribution, referred to as the measure, over its collection of universes.
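The statistical logic of (i) and (ii) can be sketched in a toy example (this is an illustration only, not any published test): suppose, purely hypothetically, that a proposal's measure for some dimensionless constant is a unit Gaussian. The proposal is then disfavored if our observed value lies deep in the tails of that assumed measure.

```python
import math

def two_sided_p_value(observed, mean, sigma):
    """P(|X - mean| >= |observed - mean|) for X drawn from the assumed
    Gaussian measure -- a crude statistical test of the proposal."""
    z = abs(observed - mean) / sigma
    # Two-sided tail probability of a standard normal, via erfc.
    return math.erfc(z / math.sqrt(2.0))

# Hypothetical measure a multiverse proposal might predict for some
# dimensionless constant (all numbers here are illustrative assumptions).
mean, sigma = 0.0, 1.0

print(two_sided_p_value(0.5, mean, sigma))  # typical value: proposal survives
print(two_sided_p_value(5.0, mean, sigma))  # deep in the tails: proposal disfavored
```

The hard part, as the surrounding discussion notes, is not this arithmetic but justifying the measure itself.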
Tegmark acknowledges that construction of a measure for Level 2 and 3 multiverses is much more difficult than for a Level 1 multiverse. Further, how to develop a measure for a Level 4 multiverse is completely unresolved and is likely not possible to do.
Nobel laureate Frank Wilczek values the understanding that physical reality is vastly larger than our limited perception of it—that the perceived part is merely a glimpse of the whole. Wilczek reminds us that this idea exists at many levels and has a long history. It is not difficult to design thought experiments demonstrating that scientists have often formed an inadequate perception of the extent of physical reality. Wilczek stresses that the physical laws that describe the observable universe are most naturally formulated in a larger framework that includes space-time beyond it. According to Wilczek, positivist views are themselves not scientific. Instead he argues that they should be identified as a “philosophical moral exhortation” and must therefore bow to more scientific arguments for or against a multiverse.71
With regard to the multiverse debate, Wilczek suggests the use of the terms universality and multiversity. By universality, he means that “the same fundamental laws apply at all times and all places”; by multiversity, that “different laws apply at different times and places.” For Wilczek the key question is: “Are there aspects of observable reality that can be explained by multiversity, but not otherwise?” His answer is a resounding “Yes!”
Wilczek combines both philosophical and scientific arguments in support of the likelihood of a multiverse. His first piece of evidence is the appearance of fine-tuning. Wilczek remarks that
the happy coincidences between life’s requirements and nature’s choices of parameter values might be just a series of flukes, but one could be forgiven for beginning to suspect that something deeper is at work.
His second evidence is “the phase transition paradigm”—the concept that “empty space” or “vacuum” can exist in different phases, as typically associated with different amounts of symmetry. His third evidence is the BBI, as already discussed. Fourth is the “outrageously small, but non-zero, value of the dark energy density.” Fifth is the apparent abundance of universes on the string landscape.72 (This abundance argument may need to be tempered if, as Cumrun Vafa believes, a large share of the stringy de Sitter landscape is unstable.)73 Last is the abundance of free parameters in the Standard Model of Particles and in the Standard Model of Cosmology that prove notoriously resistant to theoretical understanding.74
In his consideration of the multiverse, George Ellis offers a middle ground between the proponents and opponents. Ellis is concerned not only with the science, but also with the scientific philosophy by which multiverse theories are generally substantiated.
He, like most cosmologists, accepts Tegmark’s concept of “domains” of our universe beyond the observable volume. Nevertheless, for Ellis these domains are so far distant that it’s extremely unlikely any evidence of an early interaction of our observable universe with these (or any other part of the multiverse) will ever be found.75
Ellis understands that for many theorists, the lack of empirical testability or falsifiability of a multiverse is not a major concern:
Many physicists who talk about the multiverse, especially advocates of the string landscape, do not care much about parallel universes per se. For them, objections to the multiverse as a concept are unimportant. Their theories live or die based on internal consistency and, one hopes, eventual (laboratory) testing.
Although Ellis believes there is little hope that physical tests for existence of a multiverse will ever be possible, he grants that the theories on which the speculation is based are not without scientific merit. Ellis concludes that multiverse theory is a productive research program and urges that
the contemplation of the multiverse is an excellent opportunity to reflect on the nature of science and on the ultimate nature of existence: why we are here. In looking at this concept, we need an open mind, though not too open. It is a delicate path to tread. Parallel universes may or may not exist; the case is unproved. We are going to have to live with that uncertainty. Nothing is wrong with scientifically based philosophical speculation, which is what multiverse proposals are. But we should name it for what it is.76
Occam’s Razor versus Kolmogorov Complexity
The 14th-century English philosopher William of Occam famously advocated the principle of parsimony (simplicity) for judging which of two competing explanations is more likely to be true. Occam’s razor, as the principle became known, is frequently applied to multiverse proposals, perhaps as often by proponents as by critics. Critics argue that to postulate a practically infinite number of unobservable universes, just to explain our own, seems contrary to Occam’s razor. In the view of Francis Collins, director of the National Institutes of Health, the concept of a multiverse “certainly fails Occam’s Razor” by multiplying explanatory entities beyond necessity. Several other multiverse opponents similarly suggest that higher level multiverses are “wasteful.”
However, counter-arguments have been raised in defense of multiverse proposals. The response to the “wasteful theory” claim often takes the form of questions like
When we feel that God or nature is wasteful, what precisely are we disturbed about? It’s definitely not “waste of space” since the standard flat universe model with its infinite volume does not receive similar criticism. Further it’s certainly not “mass” or “atoms” either, for the same reason. Once one has “wasted” an infinite amount of something, what is the concern about “wasting more”?77
Historian of science Ted Davis believes that failing Occam’s razor wouldn’t necessarily make multiverse proposals bad science. For Davis, “Occam’s razor is a metaphysical claim about science, not a scientific claim in itself.” Even so, Davis acknowledges that “many scientists and philosophers would rather not touch its sharp edge, so [violating Occam’s razor] counts for something.”78
An associated criticism is the apparent reduction in simplicity resulting from the existence of an infinite number of universes in a multiverse. But do multiverses actually increase or decrease simplicity? In terms of Kolmogorov complexity, proponents argue that a multiverse can be simpler to describe than a single universe contained within it.79
In algorithmic information theory, the Kolmogorov complexity of an object is a measure of the computational resources needed to specify the object. Consider the difference in Kolmogorov complexity between two 16-character strings, the first being “abababababababab” and the second being “abrt34q6dgndfgff.” The information in the first string can be described by the 10-character phrase “ab 8 times,” whereas the information in the second string cannot be expressed in fewer than 16 characters. Therefore, the first string has a lower Kolmogorov complexity than the second. By this reasoning, many proponents believe that a multiverse does satisfy Occam’s razor, because an entire ensemble is often much simpler than one of its members.
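True Kolmogorov complexity is uncomputable, but the length of a compressed encoding gives a practical upper bound. A minimal Python sketch, using zlib compression as a stand-in description length, reproduces the comparison above:

```python
import zlib

def description_length(s: str) -> int:
    """Upper bound on Kolmogorov complexity: the length in bytes of a
    zlib-compressed encoding of the string.  True Kolmogorov complexity
    is uncomputable; compression only bounds it from above."""
    return len(zlib.compress(s.encode("ascii"), 9))

regular = "ab" * 8            # "abababababababab" -- highly patterned
random_like = "abrt34q6dgndfgff"  # no exploitable pattern

# The patterned string admits a shorter description than the random-looking one.
assert description_length(regular) < description_length(random_like)
```

The design choice here mirrors the proponents' argument: the rule "ab 8 times" is shorter than the string it generates, just as the rule generating an ensemble of universes may be shorter than the specification of any single member.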
Each higher-level multiverse has a lower Kolmogorov complexity than its embedded lower-level ones: Advancing from our own observable universe to a vast Level 1 multiverse eliminates the need to specify initial conditions. Upgrading a Level 1 multiverse to a Level 2 multiverse eliminates the need to specify physical constants and forces. Going from a Level 2 multiverse to a Level 3 multiverse eliminates the need for branch decision-making. Taking the final step from a Level 3 to a Level 4 multiverse eliminates the need to specify anything, so long as one assumes the structure is mathematical. Nonetheless, the Level 4 caveat may be profound—it may necessitate knowledge of every possible mathematical structure to verify that nothing is missing.
According to Tegmark,
a common feature of all four multiverse levels is that the simplest and arguably most elegant theory involves parallel universes by default. To deny the existence of those universes, one needs to complicate the theory by adding experimentally unsupported processes and ad hoc postulates: finite space, wave function collapse and ontological asymmetry. Our judgment therefore comes down to which we find more wasteful and inelegant: many worlds or many words. Perhaps we will gradually get used to the weird ways of our cosmos and find its strangeness to be part of its charm.80
Theory of Anything?
Opponents claim that multiverse theories have gained currency mostly because too much has been invested in theories that have failed, for example, inflation or string theory. They tend to see multiverse proposals as an attempt to redefine the values of science, an attempt to which they object even more strongly. For example, in response to the 2014 Annual Question on the website Edge, Paul Steinhardt refers to the multiverse as
a pervasive idea in fundamental physics and cosmology that should be retired: the notion that we live in a multiverse in which the laws of physics and the properties of the cosmos vary randomly from one patch of space to another. According to this view, the laws and properties within our observable universe cannot be explained or predicted because they are set by chance. Different regions of space too distant to ever be observed have different laws and properties, according to this picture. Over the entire multiverse, there are infinitely many distinct patches. Among these patches, in the words of Alan Guth, anything that can happen will happen—and it will happen infinitely many times. Hence, I refer to this concept as a Theory of Anything. Any observation or combination of observations is consistent with a Theory of Anything. No observation or combination of observations can disprove it. Proponents seem to revel in the fact that the Theory cannot be falsified. The rest of the scientific community should be up in arms since an unfalsifiable idea lies beyond the bounds of normal science. Yet, except for a few voices, there has been surprising complacency and, in some cases, grudging acceptance of a Theory of Anything as a logical possibility. The scientific journals are full of papers treating the Theory of Anything seriously. What is going on? . . . a Theory of Anything is useless because it does not rule out any possibility and worthless because it submits to no do-or-die tests. Many papers discuss potential observable consequences, but these are only possibilities, not certainties, so the Theory is never really put at risk.81
Many multiverse supporters reject these criticisms. They argue, in particular, that a string multiverse is falsifiable: that the hypothesized string landscape’s expectation values of properties unrelated to the appearance of life can be compared to the actual values of those properties in our observable universe.82 For example, a comparison can be made of the expectation value of the top quark mass and the experimental value.
Ellis acknowledges that counter-arguments against claims of no falsifiable predictions can be made by applying the weak anthropic principle.83 If, within an anthropically allowed range, there are infinitely more Level 1 multiverses in the theoretical Level 2 string landscape possessing one sign of a parameter than there are Level 1 multiverses with opposite sign, then a firm prediction arises under very weak assumptions regarding the probability measure. This prediction can be compared to the parameter’s sign in our universe.
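The sign-prediction argument can be illustrated with a toy Monte Carlo (the landscape distribution and anthropically allowed range below are illustrative assumptions, not results from string theory): if the allowed region is dominated by one sign of the parameter, a firm prediction of that sign follows under very weak assumptions about the measure.

```python
import random

random.seed(0)

def fraction_positive(draw, allowed, n=100_000):
    """Among sampled universes whose parameter lands in the anthropically
    allowed range, return the fraction with a positive parameter value."""
    positives = total = 0
    for _ in range(n):
        x = draw()
        if allowed(x):
            total += 1
            positives += x > 0
    return positives / total

# Toy landscape (an assumption for illustration): the parameter is drawn
# uniformly from [-2, 2], but observers are possible only for -0.1 < x < 1.0.
# Positive values then dominate the anthropically allowed region.
frac = fraction_positive(lambda: random.uniform(-2.0, 2.0),
                         lambda x: -0.1 < x < 1.0)
print(frac)  # close to 1.0/1.1, i.e., a strong prediction of positive sign
```

Comparing such a predicted sign against the measured sign in our universe is the kind of weak-anthropic test Ellis describes.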
Self-Identity and Free Will
The issue of doppelgängers in a multiverse has been a concern of many. The concept can seem troublesome, both philosophically and theologically. In particular, it raises a profound philosophical and theological question regarding free will. Robert Mann and George Ellis are especially concerned about the related implications of a Level 3 multiverse.84
The existence of numerous doppelgängers of an individual, who all have identical past lives and memories, could prevent the individual from being able to compute his or her own future, even if the individual had complete knowledge of the entire multiverse. There is no way for a person to determine by observation which of the doppelgängers is himself or herself until the person makes a choice at an upcoming event. Looking from “outside” (beyond) the multiverse, the subset that could be the person shrinks to only those who made the same decision as he or she did. More and more doppelgängers become distinguishable as they make decisions that differ from the individual’s.
One mitigating consideration might come from viewing genetically identical siblings (i.e., twins, triplets, etc.) as a form of doppelgänger within the same Hubble volume. For these doppelgängers, histories diverge in the womb or at birth. One might survey n-tuplets about whether they believe there is an upper limit to n beyond which none of them would any longer have free will or be responsible for their own actions and decisions.
Independent of the amount of history an individual shares with doppelgängers, responsibility for actions can remain with the individual. An identical physical description does not necessarily remove individual free will or responsibility—aspects of an individual that go beyond the physical realm and into the moral realm. A philosophical question is whether two physically identical doppelgängers might also be emotionally and psychologically identical. If they are not, then the pair should perhaps be considered more like twins than doppelgängers, with each morally responsible for individual choices and actions.
For a given set of doppelgängers, decisions might be statistical in aggregate yet fundamentally based on individual free will. The concept is perhaps analogous to the accurate prediction of outcomes for large groups based on statistical surveys of a much smaller, representative group. Individuals may make decisions based on free will, but the decisions of larger groups generally follow the statistics of smaller representative groups.
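The survey analogy can be made concrete with a small simulation (a toy illustration only, not a claim about actual doppelgängers): a modest random sample recovers the decision statistics of a very large population, even though no individual decision is thereby determined.

```python
import random

random.seed(42)

# One million independent "free-will" decisions, each a yes with probability 0.3.
# Only the aggregate statistics, not any individual choice, are predictable.
population = [random.random() < 0.3 for _ in range(1_000_000)]

# A small representative survey recovers the group-level rate.
sample = random.sample(population, 1_000)
sample_rate = sum(sample) / len(sample)
population_rate = sum(population) / len(population)

print(sample_rate, population_rate)  # the two rates agree closely
```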
Notably, these arguments do not remove the doppelgänger concerns of many, especially with regard to Level 3 (Everett) versions. Self-identity and free will in a multiverse are viewed not just as philosophical concerns, but even more as theological issues.
The multiverse concept provokes significant theological debate. The ongoing transformation from a universe paradigm to a multiverse paradigm increasingly challenges some views of humanity’s uniqueness. A multiverse also raises questions regarding theological understandings of the manner and means by which a divine creator interacts with its creation.
Relatedly, some have feared that multiverse proposals will join evolution as another battleground in culture wars. For example, Christoph Cardinal Schönborn, the archbishop of Vienna, believes that scientists created the idea of a multiverse “to avoid the overwhelming evidence for purpose and design found in modern science.”85 Others, such as philosopher William Lane Craig, claim that multiverse theory is motivated by a refusal to accept evidence of God’s handiwork in the cosmos, referring to the multiverse idea as an act of “desperation on the part of atheist scientists.”86 Canadian journalist Denyse O’Leary, an ally of the Intelligent Design movement, asserts that “religious or anti-religious motives dominate the discussion among scientists developing multiverse models.”87
For these multiverse critics, the cosmology of only a single universe is understood as a source of theological promise, providing them evidence that the universe is specifically designed for life. They believe that the probability is astonishingly small that in a single universe a random set of physical parameters could ever permit the appearance of life in any form we know.
Alternately, an unimaginably large (perhaps infinite) number of universes with distinct physical laws makes significantly higher the probability that universes with our life-yielding properties exist. Thus, Craig interprets the multiverse concept as challenging (at least at the universe level) the fine-tuning argument for the existence of God. However, leading cosmologists of theistic belief, like Don Page, argue that modern multiverse theory did not arise to oppose theism but developed from the scientific pursuit of key questions in particle physics, string theory, and cosmic inflation. In “Does God So Love the Multiverse?” Page also offers an explanation why those who oppose evolution may also oppose a multiverse: just as evolution removed one particular design argument for the existence of God, the multiverse removes one particular fine-tuning argument for the existence of God. Nevertheless, many multiverse proponents of theistic faith, including Page, perceive fine-tuning as an issue similar to evolution, but elevated to a much larger scale. The question transforms from “Why does our universe appear so fine-tuned for life?” into “Why do the underlying mathematical equations of the multiverse seem so fine-tuned to allow for life within some universes?”88
Multiverse-based arguments against theistic belief have been raised by Carlos Calle, Lawrence Krauss, Stephen Hawking, and Leonard Mlodinow.89 They all judge that the existence of a multiverse removes the need for a god. They argue that the foundation of any multiverse is quantum gravity, which (as discussed) likely induces inflation of a vast number of vacuum bubbles. It is then virtually certain that life in general, and sentient life in particular, will exist somewhere in some such bubbles, without the need for a divine creator. In this picture, the existence of quantum gravity is the unexplainable fundamental concept underlying everything else and is envisioned to replace the role of a divine creator.
Calle proposes a parallel between the concepts of multiverse and evolution by drawing an analogy between his writings and those of Richard Dawkins, who believes that evolution removes the need for God within the biological realm. Calle writes that
biology has a designer, a watchmaker, but a blind watchmaker, a mindless watchmaker without a purpose. Biology’s watchmaker is natural selection. Richard Dawkin’s [sic] book, The Blind Watchmaker explains it clearly and authoritatively. . . . Although biology deals with what may be the most complex system in the universe, physics concerns itself with the ultimate questions of existence. . . . Biology can be explained through natural selection. The universe can be explained with the laws of nature, its watchmaker.90
In contrast, Robin Collins promotes the multiverse concept as consistent with, and supportive of, a theistic perspective.91 However, per the prior discussion, Collins does not conclude that a multiverse provides an adequate explanation for the hyper-fine tuning we observe, nor for the special observer class to which he assigns us. Page likewise believes that a multiverse, especially a Level 3 (Everett) multiverse, does not disprove the existence of a deity, but promotes theistic belief.92 Just as evolution, for theists, can demonstrate a grand design for life, Page believes that a multiverse
can reveal an even more grand design of the universe, since the physical process that generates the multiverse would have to have suitable basic laws and initial conditions to produce life at all (no matter what the constants of physics are, since often they seem to be fine-tuned for several different reasons). The laws and initial conditions [of the multiverse] would apparently have to be even more special to produce not just life, but life like ours observing the order we actually see around us.93
Since the latter half of the 20th century, the cosmic multiverse has earned growing recognition, not only as a philosophical or theological concept, but also as a scientific concept. The multiverse is now a realm of cutting-edge science, simultaneously prompting profound philosophical and theological questions. Over the last few decades, the implications of a multiverse in general, and of each form of possible realization, have become a mainstream topic and the focus of many conferences and workshops.
The coming of age of the multiverse is the result of a long chain of paradigm advancements in humankind’s perception of reality. With each advancing step, our understanding of the physical environment around us has grown larger, grander, more complex, yet simultaneously more unified and ordered. Over a period of several thousand years, we have moved from local Mythocentric paradigms, to Geocentric, Heliocentric, and to the present Universe-centric paradigm—and the paradigm shift beyond Universe-centric is in process.
Unanswered puzzles about our universe and its properties have led many scientists to join philosophers and theologians to ponder and propose the existence of a vast collection of additional universes. For many, a multiverse of one form or another proffers resolutions to many underlying science questions. For others, a multiverse raises more questions than answers. Nonetheless, acceptance of the existence of physical reality beyond our observable universe continues to grow. That space-time continues beyond the limit of our observable universe is a mainstay of the Standard Model of cosmology. In that sense, the transformation from a Universe-centric to a Multiverse-centric paradigm is indeed far along. And yet, whether humanity will ever determine, at least theoretically, what exists beyond the veils of our finite observable volume remains an underlying as-yet unanswered question.
That the transformation to a multiverse paradigm has moved beyond the communities of scientists, philosophers, and theologians is evidenced by such bestselling books as Brian Greene’s The Elegant Universe, The Fabric of the Cosmos, and The Hidden Reality; Brian Clegg’s Before the Big Bang: The Prehistory of the Universe; and Mary-Jane Rubenstein’s Worlds without End: The Many Lives of the Multiverse. The multiverse has even reached the Broadway stage via Nick Payne’s Constellations. That playwrights are contemplating the implications of parallel universes for the relationship between a beekeeper and a cosmologist is indeed indicative of the rise of a multiverse paradigm!
Review of the Literature
Scientific, philosophical, and theological implications of a multiverse have been explored at numerous conferences and workshops, including: Anthropic Arguments in Fundamental Physics and Cosmology at Cambridge University in 2001, coordinated by Bernard Carr, Robert Crittenden, Martin Rees, and Neil Turok and partly funded by the John Templeton Foundation (JTF); Universe or Multiverse in 2003 and Multiverse and String Theory: Towards Ultimate Explanations in Cosmology in 2005, both held at Stanford University and chaired by Paul Davies and Andrei Linde, sponsored by JTF; Expectations of a Final Theory at Cambridge University in 2005, coordinated by Bernard Carr, Robert Crittenden, Martin Rees, and Neil Turok and partly funded by JTF; String Theory and the Multiverse: Philosophical and Theological Implications symposium at Wheaton College in 2008, coordinated by Robert Bishop and Gerald Cleaver; God and the Multiverse: A Workshop held at Ryerson University in 2013, coordinated by Klaas Kraay; Collision of Universes conference held on Grand Cayman Island in 2009, coordinated by Anthony Aguirre; Multiverse: One Universe or Many at the World Science Foundation in 2013, moderated by John Hockenberry; and Why Trust a Theory? Reconsidering Scientific Methodology in Light of Modern Physics held at the Ludwig Maximilian University (LMU) of Munich in 2015, coordinated by Richard Dawid.
The December 7–9, 2015, Munich workshop was organized for physicists and philosophers to jointly address the concerns Ellis and Silk presented in their Nature article, “Scientific Method: Defend the Integrity of Physics.”94 In the article, they strongly criticized string and multiverse theorists for not requiring experimental verification of their proposals when “faced with difficulties in applying fundamental theories to the observed Universe,” but using instead arguments of elegance. Ellis and Silk focused on observations made by Richard Dawid of LMU that
string theorists had started to follow the principles of Bayesian statistics, which estimates the likelihood of a certain prediction being true on the basis of prior knowledge, and later revises that estimate as more knowledge is acquired.95
According to Dawid,
physicists have begun to use purely theoretical factors, such as the internal consistency of a theory or the absence of credible alternatives, to update estimates, instead of basing those revisions on actual data.96
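The Bayesian machinery Dawid refers to can be sketched in a toy update (all numbers below are illustrative assumptions): a single piece of non-empirical "evidence," such as the absence of credible alternatives, raises the estimated probability that a theory is true without any new data.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability that the theory is true after one piece of
    evidence, via Bayes' theorem."""
    num = prior * p_evidence_if_true
    return num / (num + (1.0 - prior) * p_evidence_if_false)

# Illustrative numbers only.  Evidence: "no credible alternative has been
# found."  Suppose this observation is twice as likely if the theory is
# true (0.8) than if it is false (0.4); a 0.2 prior then rises to 1/3.
posterior = bayes_update(0.2, 0.8, 0.4)
print(posterior)
```

The controversy is precisely over whether such theoretical factors should be allowed to drive the update, not over the arithmetic itself.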
Both string theory/multiverse proponents and opponents attended the conference. At the end of the workshop it was reported that
the feuding physicists did not seem any closer to agreement. Dawid—who co-organized the event with Silk, Ellis and others—says that he does not expect people to change their positions in a fundamental way. But he hopes that exposure to other lines of reasoning might “result in slight rapprochement.” Ellis suggests that a more immersive format, such as a two-week summer school, might be more successful at producing a consensus.97
Whatever the future of the multiverse concept, Dawid believes that multiverse research reveals to us
something about science in general. The rising importance of assessments of scientific underdetermination carries a message for the whole philosophy of science. Assessments of scientific under-determinism have always played a crucial role within the scientific process but were largely neglected by the philosophy of science. The case study . . . now guides us towards acknowledging their relevance within the scientific process and therefore leads towards a more accurate and coherent understanding of science. Science cannot be fully grasped in terms of the coherence between physical theory and empirical data. Rather, it has to be seen as an enterprise that binds together observations and theoretical assessments at different levels of analysis in order to obtain its results. . . .
Finally, it must be emphasized once again that the current situation in fundamental physics is too volatile for allowing stable judgement on the actual range and power of the strategies discussed. Future scientific developments may be expected to lead to a more robust understanding of these matters. The main message to be extracted from the physical status quo at this point cannot be that we have a fully-fledged and fully reliable method of non-empirical theory confirmation. . . . Rather the novel message is that strategies of non-empirical theory confirmation and strategies of establishing final theory claims have become part of scientific reasoning. Conclusions which can be reached by following these strategies correspondingly turn into scientifically legitimate claims which can fail or succeed as such. This, on its own, is a rather far-reaching conclusion. If the named strategies indeed stabilize their scientific success, they have the capacity of providing an altogether new understanding of the way science relates to the world.98
The primary concepts reviewed herein are discussed in the collection of articles in Carr’s Universe or Multiverse.99 Additional primary sources include Ellis’s “Does the Multiverse Really Exist?” and his Nature article with Silk, Mann’s “Inconstant Multiverse” and “The Puzzle of Existence,” and Page’s “Does God So Love the Multiverse?” and “A Theological Argument for an Everett Multiverse.”100
Links to Digital Materials
Multiverse: One Universe or Many? is a debate among Andreas Albrecht, Alan Guth, Andrei Linde, and Neil Turok, moderated by John Hockenberry at the World Science Festival in 2013.
A helpful web page collects several significant papers on philosophical and theological aspects of a multiverse.
Additional web discussions include:
Aghapour, A. Does Multiverse Theory Bring Theology into Science?
Craig, W. L. Multiverse and Design Argument.
Craig, W. L. Has the Multiverse Replaced God?
Gorjanc, B. Fine-Tuning for Life in the Multiverse.
Ouellette, J. Multiverse Collisions May Dot the Sky.
Schwarz, P. The Official String Theory Web Site.
Tegmark, M. The Universes of Max Tegmark.
Woit, P. Not Even Wrong.
An especially useful reference to scientific, philosophical, and theological implications and controversies of the multiverse is Carr, Bernard, ed. Universe or Multiverse. Cambridge, U.K.: Cambridge University Press, 2007. This collection contains papers presented at the Cambridge 2001 and Stanford 2003 workshops. The content of this volume is mostly written at an academic research level.
Several additional books present multiverse concepts at a more general level. References for the historical development of the multiverse concept include:
Barrow, John. The Book of Nothing: Vacuum, Voids, and the Latest Ideas about the Origins of the Universe. New York: Vintage Books, 2000.
Carr, Bernard. “Metacosmology and the Limits of Science.” In Mathematical Structures of the Universe, edited by Michal Eckstein, Michal Heller, and Sebastian Szybka, 407–432. Cracow: Copernicus Centre Press, 2014.
Clegg, Brian. Before the Big Bang: The Prehistory of the Universe. New York: St. Martin’s Griffin, 2009.
Ferris, Timothy. The Whole Shebang: A State-of-the-Universe(s) Report. New York: Touchstone, 1998.
Gribbin, John. In Search of the Multiverse. Hoboken, NJ: John Wiley and Sons, 2009.
Guth, Alan. The Inflationary Universe: The Quest for a New Theory of Cosmic Origins. Cambridge, MA: Perseus Books, 1997.
Randall, Lisa. Knocking on Heaven’s Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World. London: Bodley Head, 2011.
Rubenstein, Mary-Jane. Worlds without End: The Many Lives of the Multiverse. New York: Columbia University Press, 2014.
Vilenkin, Alex. Many Worlds in One: The Search for Other Universes. New York: Hill and Wang, 2006.
Cyclic universe realizations of a Level 1 multiverse are presented in:
Bojowald, Martin. Once before Time: A Whole Story of the Universe. New York: Alfred A. Knopf, 2010.
Unger, Roberto, and Lee Smolin. The Singular Universe and the Reality of Time. Cambridge, U.K.: Cambridge University Press, 2015.
The concepts and properties of a Level 3 (Everett) multiverse are reviewed in:
Wallace, David. The Emergent Multiverse: Quantum Theory according to the Everett Interpretation. Oxford: Oxford University Press, 2012.
A range of theological interpretations of a multiverse are explored in:
Drees, Willem. Beyond the Big Bang: Quantum Cosmologies and God. La Salle, IL: Open Court, 1990.
Russell, Robert. Cosmology: From Alpha to Omega. Minneapolis: Fortress Press, 2008.
Russell, Robert, Nancey Murphy, and C. J. Isham, eds. Quantum Cosmology and the Laws of Nature: Scientific Perspectives on Divine Action. Vatican City State: Vatican Observatory, 1999.
Stenger, Victor. God and the Multiverse: Humanity’s Expanding View of the Cosmos. Amherst, NY: Prometheus Press, 2014.
Susskind, Leonard. “The World as a Hologram.” Journal of Mathematical Physics 36 (1995): 6377–6396.
(1.) St. Albertus Magnus, De Caelo et Mundo, Lib. I, Tract III, Cap. I, (Opera, Lugduni, 1651, II, 40), quoted in Grant McColley, “The Seventeenth-Century Doctrine of a Plurality of Worlds,” Annals of Science 1, no. 4 (1936): 385.
(2.) Michael Crowe, “A History of the Extraterrestrial Life Debate,” Zygon 32, no. 2 (1997): 149, quoting Steven Dick, Plurality of Worlds: The Origins of the Extraterrestrial Life Debate from Democritus to Kant (Cambridge, U.K.: Cambridge University Press, 1982).
(3.) Giordano Bruno, “Teofilo,” in Cause, Principle, and Unity, Fifth Dialogue (1588), ed. and trans. Jack Lindsay (London: William Clowes and Sons, 1962).
(4.) Bernard Carr and George Ellis, “Universe or Multiverse?” Astronomy and Geophysics 49 (2008): 2.29.
(5.) Carr, “Universe or Multiverse?”
(6.) Robin Collins, “Modern Cosmology and Anthropic Fine-Tuning: Three Approaches,” in Georges Lemaître: Life, Science and Legacy, ed. Rodney Holder and Simon Mitton (Berlin: Heidelberg Springer, 2012), 173–191.
(7.) John Leslie, Infinite Minds, (Oxford: Clarendon Press, 2001).
(8.) Brian Clegg, Before the Big Bang (New York: St. Martin’s Griffin, 2008).
(9.) Clegg, Before the Big Bang.
(10.) Georges Lemaître, “Un univers homogène de masse constante et de rayon croissant rendant compte de la vitesse radiale des nébuleuses extra-galactiques,” Annales de la Société Scientifique de Bruxelles 47A (1927): 41.
(11.) Vesto M. Slipher, “Radial Velocity Observations of Spiral Nebulae,” The Observatory 40 (1917): 304–306.
(12.) Edwin Hubble, “A Relation between Distance and Radial Velocity among Extra-Galactic Nebulae,” Proceedings of the National Academy of Sciences 15, no. 3 (1929): 168–173.
(13.) Tammy Plotner, “The Expanding Universe—Credit to Hubble or Lemaitre?,” Universe Today; and “G. Lemaitre,” Famous Scientists: The Art of Genius.
(14.) Georges Lemaître, “A Homogeneous Universe of Constant Mass and Growing Radius Accounting for the Radial Velocity of Extragalactic Nebulae,” Monthly Notices of the Royal Astronomical Society 91, no. 5 (1931): 483–490.
(15.) Plotner, “The Expanding Universe.”
(16.) Fred Hoyle, “Recent Developments in Cosmology,” Nature 208, no. 5006 (October 9, 1965): 111–114.
(17.) J.-P. Luminet, J. Weeks, A. Riazuelo, R. Lehoucq, and J.-P. Uzan, “Dodecahedral Space Topology as an Explanation for Weak Wide-Angle Temperature Correlations in the Cosmic Microwave Background,” Nature 425, no. 6958 (2003): 593; and Jean-Pierre Luminet, “The Status of Cosmic Topology after Planck Data,” Universe 2 (2016): 1.
(18.) Steven Weinberg, “Anthropic Bound on the Cosmological Constant,” Physical Review Letters 59 (1987): 2607.
(19.) Vera Rubin and W. Kent Ford Jr., “Rotation of the Andromeda Nebula from a Spectroscopic Survey of Emission Regions,” Astrophysical Journal 159 (1970): 379.
(20.) Alan Guth, “Inflationary Universe: A Possible Solution to the Horizon and Flatness Problems,” Physical Review D23 (1981): 347.
(21.) Grant Remmen and Sean Carroll, “How Many e-Folds Should We Expect from High-Scale Inflation?” Physical Review D90 (2014): 063517.
(22.) Paul Davies, Cosmic Jackpot: Why Our Universe Is Just Right for Life (Boston: Houghton Mifflin, 2007). Also published as The Goldilocks Enigma: Why Our Universe Is Just Right for Life (New York: Penguin Press, 2008).
(23.) David Bailey, “What Is the Multiverse and What Is Its Significance?”
(24.) Bailey, “What Is the Multiverse.”
(25.) Richard Dawid, String Theory and the Scientific Method (Cambridge, U.K.: Cambridge University Press, 2013).
(26.) Helmut Satz, Before Time Began (Oxford: Oxford University Press, 2017).
(27.) Peter Woit, “String Theory and the Scientific Method,” Not Even Wrong, May 14, 2013 post; and Peter Woit, “The Dangerous Irrelevance of String Theory,” Not Even Wrong, June 2017 post.
(28.) Dawid, String Theory and the Scientific Method.
(29.) Max Tegmark, “Parallel Universes,” in Science and Ultimate Reality: From Quantum to Cosmos, ed. J. D. Barrow, P. C. W. Davies, and C. L. Harper (Cambridge, U.K.: Cambridge University Press, 2003); and Max Tegmark, “Parallel Universes: Not Just a Staple of Science Fiction, Other Universes Are a Direct Implication of Cosmological Observations,” Scientific American 288 (2003): 40.
(30.) Bernard Carr and George Ellis, “Universe or Multiverse,” Astronomy and Geophysics 49, no. 2 (2008): 2.29; Bernard Carr, “Defending the Multiverse,” Astronomy and Geophysics 49, no. 2 (2008): 2.36; and George Ellis, “Opposing the Multiverse,” Astronomy and Geophysics 49, no. 2 (2008): 2.29.
(31.) Max Tegmark, “The Multiverse Hierarchy,” in Universe or Multiverse, ed. Bernard Carr (Cambridge, U.K.: Cambridge University Press, 2007).
(32.) Tegmark, “The Multiverse Hierarchy.”
(33.) Jaume Garriga and Alexander Vilenkin, “Many Worlds in One,” Physical Review D64 (2001): 043511.
(34.) Lee Smolin, “Scientific Alternatives to the Anthropic Principle,” in Carr, Universe or Multiverse.
(35.) Arvind Borde, Alan Guth, and Alex Vilenkin, “Inflationary Space-Times Are Incomplete in Past Directions,” Physical Review Letters 90 (2003): 151301; Alan Guth, “Time since the Beginning,” in Astrophysical Ages and Time Scales 245 (2001): 3–17, Astronomical Society of the Pacific Conference Series; Alan Guth, “Inflation,” to be published in Carnegie Observatories Astrophysics Series, vol. 2: Measuring and Modeling the Universe; Jaume Garriga, Alan Guth, and Alex Vilenkin, “Eternal Inflation, Bubble Collisions, and the Persistence of Memory,” Physical Review D76 (2007): 123512; and Alan Guth, “Eternal Inflation and Its Implications,” Journal of Physics A40 (2007): 6811–6826.
(36.) Andrei Linde and Vitaly Vanchurin, “How Many Universes Are in the Multiverse?” Physical Review D81 (2010): 083525.
(38.) Roger Penrose, Cycles of Time: An Extraordinary View of the Universe (London: Bodley Head, 2010).
(39.) Jaime Trosper, “Sir Roger Penrose: An Alternate Theory of the Big Bang?,” Futurism (April 14, 2014).
(40.) Tegmark, “Parallel Universes”; and Tegmark, “Parallel Universes: Not Just a Staple of Science Fiction.”
(41.) George Ellis, “Editorial Note to: Brandon Carter, Large Number Coincidences and the Anthropic Principle in Cosmology,” General Relativity and Gravitation 43 (2011): 3213–3223.
(42.) P. A. M. Dirac, “The Cosmological Constants,” Nature 139 (1937): 323; Arthur Eddington, Fundamental Theory (Cambridge, U.K.: Cambridge University Press, 1948); and Robert Dicke, “Dirac’s Cosmology and Mach’s Principle,” Nature 192 (1961): 440–441.
(43.) Brandon Carter, “Large Number Coincidences and the Anthropic Principle in Cosmology,” in IAU Symposium 63: Confrontation of Cosmological Theories with Observational Data, ed. M. S. Longair (Dordrecht, The Netherlands: Reidel, 1974), 291–298; Christopher Collins and Stephen Hawking, “Why Is the Universe Isotropic?” Astrophysical Journal 180 (1973): 317–334; and Bernard Carr and Martin Rees, “The Anthropic Principle and the Structure of the Physical World,” Nature 278 (1979): 605–612.
(44.) Ellis, “Editorial Note to: Brandon Carter.”
(45.) Carter, “Large Number Coincidences.”
(46.) Carter, “Large Number Coincidences.”
(47.) Carter, “Large Number Coincidences.”
(48.) Tegmark, “The Multiverse Hierarchy.”
(49.) Ellis, “Editorial Note to: Brandon Carter.”
(50.) John Barrow and Frank Tipler, The Anthropic Cosmological Principle (New York: Oxford University Press, 1986).
(51.) Robin Collins, “The Multiverse Hypothesis: A Theistic Perspective,” in Carr, Universe or Multiverse; John Leslie, Universes (London: Routledge, 1989); and John Leslie, Infinite Minds (Oxford: Clarendon Press, 2001).
(52.) Carr and Ellis, “Universe or Multiverse.”
(53.) Stephen Hawking, “Information Loss in Black Holes,” Physical Review D72 (2005): 084013.
(55.) Stephen Hawking, Malcolm Perry, and Andrew Strominger, “Soft Hair on Black Holes,” Physical Review Letters 116 (2016): 231301.
(56.) Tegmark, “Parallel Universes”; and Tegmark, “Parallel Universes: Not Just a Staple of Science Fiction.”
(57.) Davies, Cosmic Jackpot.
(58.) Max Tegmark, “The Mathematical Universe,” Foundations of Physics 38 (2008): 101.
(59.) Eugene Wigner, “The Unreasonable Effectiveness of Mathematics in the Natural Sciences,” Communications on Pure and Applied Mathematics 13 (1960): 1–14.
(60.) Nick Bostrom, “Are We Living in a Computer Simulation?” Philosophical Quarterly 53, no. 211 (2003): 243–255.
(61.) John Barrow, The Infinite Book: A Short Guide to the Boundless, Timeless and Endless (London: Jonathan Cape, 2005).
(63.) Juan Maldacena, “The Large N Limit of Superconformal Field Theories and Supergravity,” Advances in Theoretical and Mathematical Physics 2, no. 2 (1998): 231; Yoshifumi Hyakutake, “Quantum Near-Horizon Geometry of a Black 0-Brane,” Progress of Theoretical and Experimental Physics 2014, no. 3 (2014): 33B04-0; and Masanori Hanada, Yoshifumi Hyakutake, Goro Ishiki, and Jun Nishimura, “Holographic Description of Quantum Black Hole on a Computer,” Science 344, no. 6186 (2014): 882–885.
(64.) Jürgen Schmidhuber, “Algorithmic Theories of Everything,” International Journal of Foundations of Computer Science 13 (2002): 587–612.
(65.) Don Page, “Predictions and Tests of Multiverse Theories,” in Carr, Universe or Multiverse; and Don Page, “Does God So Love the Multiverse?,” in Science and Religion in Dialogue, ed. Melville Y. Stewart (Oxford: Wiley-Blackwell, 2010), 380–395.
(66.) Robert Mann, “Inconstant Multiverse,” Perspectives on Science and Christian Faith 57 (2005): 302.
(67.) Tegmark, “The Multiverse Hierarchy.”
(68.) Burton Richter, “Theory in Particle Physics: Theological Speculation versus Practical Knowledge,” Physics Today 59 (2006): 8.
(69.) Paul Davies, “A Brief History of the Multiverse,” New York Times, April 12, 2003.
(70.) Tegmark, “Parallel Universes”; and Tegmark, “Parallel Universes: Not Just a Staple of Science Fiction.”
(71.) Frank Wilczek, “Multiversality,” Classical and Quantum Gravity 30 (2013): 193001.
(72.) Leonard Susskind, The Cosmic Landscape: String Theory and the Illusion of Intelligent Design (New York: Little, Brown, 2006).
(73.) Georges Obied, Hirosi Ooguri, Lev Spodyneiko, and Cumrun Vafa, “de Sitter Space and the Swampland,” 2018; and Prateek Agrawal, Georges Obied, Paul Steinhardt, and Cumrun Vafa, “On the Cosmological Implications of the String Swampland,” Physics Letters B784 (2018): 271–276.
(74.) Wilczek, “Multiversality.”
(75.) George Ellis, “Multiverses, Science, and Ultimate Causation,” in Georges Lemaître: Life, Science and Legacy, ed. Rodney Holder and Simon Mitton (Berlin: Springer Heidelberg, 2012), 125–144.
(76.) George Ellis, “Multiverses: Descriptions, Uniqueness and Testing,” in Carr, Universe or Multiverse; and George Ellis, “Does the Multiverse Really Exist?,” Scientific American, August 2011.
(78.) Davis, “On Creating the Cosmos.”
(79.) Andrei Kolmogorov, “On Tables of Random Numbers,” Sankhyā: The Indian Journal of Statistics, Series A 25 (1963): 369–375; Page, “Predictions and Tests of Multiverse Theories”; and Page, “Does God So Love the Multiverse?”
(80.) Tegmark, “Parallel Universes”; and Tegmark, “Parallel Universes: Not Just a Staple of Science Fiction.”
(81.) Paul Steinhardt, “What Scientific Idea Is Ready for Retirement? Theories of Anything,” 2014.
(82.) Page, “Predictions and Tests of Multiverse Theories”; and Albert Schellekens, “Life at the Interface of Particle Physics and String Theory,” Reviews of Modern Physics 85, no. 4 (2013): 1491–1540.
(83.) George Ellis and Lee Smolin, “The Weak Anthropic Principle and the Landscape of String Theory,” 2009.
(84.) Mann, “Inconstant Multiverse”; Ellis, “Opposing the Multiverse”; and Robert Mann, “The Puzzle of Existence,” Perspectives on Science and Christian Faith 61 (2009): 139.
(85.) Christoph Schönborn, “Finding Design in Nature,” op-ed contribution to The New York Times, July 7, 2005.
(88.) Page, “Does God So Love the Multiverse?”
(89.) Carlos Calle, The Universe: Order without Design (Amherst, NY: Prometheus Books, 2009); Lawrence Krauss, A Universe from Nothing (New York: Free Press, 2012); and Stephen Hawking and Leonard Mlodinow, The Grand Design (New York: Bantam Books, 2012).
(90.) Calle, The Universe: Order without Design.
(91.) Collins, “The Multiverse Hypothesis”; and Trevor Persaud, “Christ of the Klingons,” Christianity Today, December 2010.
(93.) Page, “Does God So Love the Multiverse?”
(94.) George Ellis and Joe Silk, “Scientific Method: Defend the Integrity of Physics,” Nature 516 (2014): 321–323.
(95.) Dawid, String Theory and the Scientific Method.
(97.) Castelvecchi, “Feuding Physicists Turn to Philosophy.”
(98.) Dawid, String Theory and the Scientific Method.
(99.) Bernard Carr, ed., Universe or Multiverse (Cambridge, U.K.: Cambridge University Press, 2007).
(100.) Ellis, “Does the Multiverse Really Exist?”; Ellis and Silk, “Scientific Method: Defend the Integrity of Physics”; Mann, “Inconstant Multiverse”; Mann, “The Puzzle of Existence”; Page, “Does God So Love the Multiverse?”; and Page, “A Theological Argument for an Everett Multiverse.”