
Can you exceed the speed of light by manipulating space-time in some way?


Other than in science fiction, there is absolutely no known way to exceed the speed of light and thereby transmit matter or information.

We need hard evidence that nature permits such things to happen, and this evidence is completely lacking. Physicists have been accelerating electrons to within a few millimeters per second of the speed of light for decades and have never seen any departure from what is strictly permitted by special relativity. No ‘gap’ or ‘quirk’ has ever been experimentally discovered that would let even the lightweight electron exceed the speed of light.
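To get a feel for how extreme that regime is, here is a rough sketch (the shortfall of a few millimeters per second below c is an illustrative round number, not a quoted accelerator figure) of the Lorentz factor and kinetic energy such an electron would have:

```python
# Sketch: how extreme a "few mm/s shy of c" electron is.
# dv is an illustrative round number, not a specific accelerator value.
import math

c = 2.99792458e8          # speed of light, m/s
dv = 3.7e-3               # assumed shortfall from c, m/s

v = c - dv
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Kinetic energy of an electron at this speed (rest energy 0.511 MeV)
ke_mev = (gamma - 1.0) * 0.511
print(f"gamma ~ {gamma:.3e}, kinetic energy ~ {ke_mev:.3e} MeV")
```

Even this close to c, the electron’s energy stays finite and its behavior matches special relativity exactly; reaching c itself would require infinite energy.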

Another problem is that we do not actually know what the structure of spacetime is. All we know is that it is defined by the worldline geometry of particles. It is not something that preexists matter and energy, so other than pathologies such as black holes, we do not know what it means to actually ‘manipulate’ spacetime. Spacetime is manipulated by altering the worldlines of particles, not the other way around.

What facts disprove the Big Bang theory?

The current, seemingly comprehensive theory of the origin and evolution of our universe is called inflationary Big Bang cosmology. This theory explains how our universe emerged from a singularity of high temperature and density and then expanded and cooled. During a brief, rapid ‘inflation’ period, space dilated at a faster-than-light pace; as the universe continued to expand and cool, the familiar elementary particles and forces emerged and produced the cosmic background radiation and a variety of complex particles. By about three minutes after the Big Bang, the ratio of hydrogen to helium had been cosmologically established, and so had the number of cosmic background photons and neutrinos. In addition, during the inflation period, irregularities in the cosmic density of matter were established, as well as a background of gravitational radiation.

The spectacular results of the WMAP satellite, along with the results from COBE and a number of other high-precision studies of the cosmic background radiation, have now established that the universe is 13.7 billion years old and that its space geometry is ‘flat’ to within measurement uncertainty. Also, according to the most recent WMAP ‘pie graph’ above, the amount of gravitating matter in ordinary stars and gas is only 4.6%. The rest of the mass-energy of the observable universe is in the form of Dark Matter (24%) and Dark Energy (71.4%). It is this Dark Energy that is causing our universe to expand at an accelerating pace, which has been observationally detected using distant supernovae.

There have been over the years several potential rivals to Big Bang cosmology, but with the exception of Steady State theory, none have attracted more than a handful of interested supporters. The reason is that they failed to predict, or offer explanations for, some basic observations that are widely agreed upon to be key tests of any cosmological theory, even by the rivals to Big Bang theory!

DeSitter Cosmology ca 1917 – The universe is presumed to be entirely empty of matter, but expands exponentially in time because of the presence of a non-zero cosmological constant. This is refuted by nearly every observational fact now available, including the actual rate of expansion, which follows a linear Hubble law rather than an exponential expansion law in time. Also, the density of matter in the universe is not zero, because we are here, and so are a lot of other stars and galaxies!

Einstein Static Cosmology ca 1917 – The universe does not expand and is static in time. The cosmological constant is precisely tuned to balance the attractive tendency of matter. Like DeSitter cosmology, it fails to agree with modern observations, because the universe is expanding linearly with time. It is also unstable to the slightest perturbation in the value of this constant.

Lemaitre Cosmology ca 1924 – The universe started with a ‘big bang’ with no cosmological constant. The initial state was a giant radioactive atom containing all the matter in the universe near absolute zero. This theory agrees with the observed expansion, but fails to explain the existence of the cosmic background radiation and the universal abundances of hydrogen, helium and deuterium because it requires the decay of a single massive ‘super atom’ at the instant of the Big Bang.

Steady State Cosmology ca 1950 – Developed by Fred Hoyle and Thomas Gold, it proposes that the universe has been expanding for eternity and that new galaxies are created, atom by atom, in intergalactic space ‘out of empty space’. This theory had its heyday in the 1950’s and 1960’s, but was never able to explain convincingly where the cosmic background radiation came from, why it is so isotropic, and why its temperature is fixed at 2.7 K. It also provided no clues as to why there ought to be a universal abundance ratio for hydrogen, helium and deuterium.

Cold Big Bang Cosmology ca 1965 – Developed by David Layzer at Harvard, it proposes that the Big Bang occurred, but that the initial state was at absolute zero and consisted of a pure solid of hydrogen. This fragmented into galaxy-sized clouds as the universe expanded. It provided no explanation for where the cosmic background field came from and why it is isotropic and at a temperature of 2.7 K.

Hagedorn Cosmology ca 1968 – Physicist Rolf Hagedorn proposed that all of the details of Big Bang theory are probably true, except that the early history of the universe had a limiting temperature of about 1 trillion degrees, because the structure of matter consists of an infinite ladder of ‘fundamental particles’ out of which electrons, protons and neutrons are constructed. This has been refuted by the discovery of quarks, which places any limiting temperature for the early history of the universe well above 1000 trillion degrees.

Brans-Dicke Cosmology ca 1961 – Einstein’s equation for gravity in general relativity is modified to include a ‘scalar’ field. This field causes the value of the constant of gravity to change slowly over billions of years, which also leads to a modification of the early history of the universe. Experimental searches show that the constant of gravity has not changed to within experimental error during the last 2 – 3 billion years. A changing constant would have significantly altered the evolution of the Earth-Moon system and severely modified the evolution of the Sun. Neither of these effects has been observed.

Old Inflationary, Big Bang Cosmology ca 1980 – Developed as a ‘toy’ model by Alan Guth in 1980. The inflationary era, which ended 10^-34 seconds after the Big Bang, caused the nucleation of innumerable ‘bubbles’ of true vacuum which merged together to form a patina of matter and radiation in a very lumpy configuration. The cosmic background radiation, however, shows that the universe has been smooth to at least 1 part in 10,000 since about 300,000 years after the Big Bang. There is no evidence for such a turbulent and lumpy transition era.

Oscillatory Big Bang Cosmology ca 1930 – This is a possible modification to Big Bang cosmology that differs only in that the current expansion will be replaced by a collapse phase, then another expansion phase, and so on. There is no evidence that there was ever a prior expansion-collapse phase. The universe also does not seem to have enough matter to make it a ‘closed’ universe destined to recollapse in the future, an important requirement for any future oscillation cycle.

Big Bang Cosmology with added Neutrino Families ca 1970 – Big Bang cosmology is largely correct, except that to solve the ‘missing’ or ‘dark’ matter problem, new families of neutrinos have to be added to the universe. This would change the cosmological abundance ratios of helium and deuterium relative to hydrogen so that the currently observed values would no longer be possible. There is also no experimental evidence that more than 3 types of neutrinos exist, and these are already consistent with the measured cosmological abundances.

Chronometric Cosmology ca 1970 – Developed by I. Segal at MIT, it proposes that space-time has a different mathematical structure than the one that forms the basis for Big Bang cosmology. So far as we can tell, the major disagreement is in the rate of the expansion of the universe, which comes out as a quadratic law between distance and expansion speed rather than the linear Hubble Law. This proposal seems to be inconsistent with what has been observed for distant galaxies during the last 3-4 decades. There may be other disagreements with Big Bang cosmology, but Chronometric cosmology has not been explored deeply enough to make testable predictions in these other areas.

Alfven Cosmology ca 1960 – Developed by physicist Hannes Alfvén, it proposes that the universe contains equal parts of matter and anti-matter. No explanation is offered for many of the other observational facts in cosmology. If there were equal parts of matter and anti-matter, there ought to be regions in the universe where the two are in contact, producing X-rays or gamma rays from the annihilation process. No such large-scale background has ever been detected that can be attributed to proton or electron annihilation.

Plasma Cosmology ca 1970 – The matter in the universe, on the largest scales, is not neutral, but has a very weak net charge which is virtually undetectable. This causes electromagnetic forces to dominate over gravitational forces in the universe, so that the phenomena we observe are not the products of gravitation alone. This is an intriguing theory, but other than denying their importance, it cannot easily explain the origin of the cosmic background radiation, its isotropy and temperature, or the abundances of helium and deuterium.

The basic observations that are agreed to be cosmological tests for any theory are:

1…. The universe is expanding. – This is a large-scale observation which spans the entire observable universe, so it must be ‘cosmological’.

2…. There exists a cosmic background radiation field detectable at microwave frequencies. – Why is it seen only in the microwave region, and why does it cover every direction of the sky?

3…. The cosmic microwave background field is measurably isotropic to better than a few parts in 100,000, after compensation is made for the relativistic Doppler effect caused by Earth/Sun/Milky Way motion. – This is a large-scale property of this phenomenon that has nothing to do with the Milky Way or other galaxies, so it must be a cosmological feature.

4…. The cosmic microwave background radiation field is precisely that of a black body. – Many other kinds of radiation are known, but NONE have exactly a black body spectrum. Only the cosmic background radiation is a perfect black body to the limits of our ability to measure its spectrum.

5…. The cosmic microwave background radiation field has a temperature of 2.7 K. – Why 2.7 K? Why not 5.019723 K? Only Big Bang cosmology predicts a relic radiation at a temperature near 3 degrees and not some other value.

6…. There does exist a universal abundance ratio of helium to hydrogen consistent with the current expansion rate and cosmic background temperature. – Whether we look at the composition of stars, planets or even gas clouds in distant galaxies, we always seem to come upon a ‘universal’ constant ratio of helium to hydrogen and deuterium to hydrogen. There must be an explanation for this that has nothing to do with just our solar system or Milky Way.

7…. The cosmological abundance of deuterium relative to hydrogen and helium is consistent with the levels expected given the current expansion rate and density. – If the universe expanded faster, then there would be less time for heavier elements such as helium and deuterium to form.

8…. There are only three families of neutrinos. – Although we have not confirmed this to be true in the vicinity of distant galaxies, we do see the same kinds of elements and physics occurring out there, especially supernovae whose physics depend very sensitively upon the numbers of distinct types of neutrinos, and the constancy of the underlying ‘weak interaction’ physics.

9…. The night sky is not as bright as the surface of the Sun. – A simple but profound observation which can only be resolved by the correct distribution of stars in the universe, their ages, and the expansion of the universe.

10… The cosmic background radiation field is slightly lumpy at a level of one part in 100,000 to 1,000,000. – Why is this? And why by this amount?

11… There are no objects that have ages indisputably greater than the expansion age of the universe. – The nearby universe does not seem to have stars older than 20 billion years, even though their properties would be easily recognizable as a simple extension of the physics and evolution of the oldest stars we do see.

12… There are about 10,000,000,000 photons in the cosmic background radiation field for every proton and neutron of matter. – This is an important ‘thermodynamic’ number which tells us how the universe has evolved up to the present time. Why is its entropy so huge?

13… The degree of galaxy clustering observed is consistent with an expanding universe with a finite age less than 20 billion years. – A direct observation which again tells us that gravity has not had a long time to act to build up large complex structures in today’s universe.

14… There are no elements heavier than lithium which have a universal abundance ratio. – What process created these heavier elements?

15… The universe was once opaque to its own radiation. – This must follow from the black body shape of the cosmic background radiation.

16… The universe is now dominated exclusively by matter and not a mixture of matter and anti-matter. – Only a few contenders to Big Bang cosmology make any attempt at explaining this direct observation.
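Several numbers in this list can be cross-checked with back-of-the-envelope black-body physics. This sketch uses the standard Planck-spectrum formulas; the baryon density is an assumed round figure, so the final ratio is only an order-of-magnitude check on item 12:

```python
# Sketch: rough checks on items 2, 5 and 12 above.
# Standard black-body formulas; n_baryons is an assumed round number.
import math

h = 6.62607015e-34    # Planck constant, J s
k = 1.380649e-23      # Boltzmann constant, J/K
c = 2.99792458e8      # speed of light, m/s
T = 2.725             # CMB temperature, K

# Wien's displacement law: peak frequency of a 2.7 K black body
nu_peak = 2.821 * k * T / h                 # lands in the microwave band
print(f"peak frequency ~ {nu_peak:.2e} Hz")

# Photon number density of a black body: n = 16*pi*zeta(3)*(kT/hc)^3
zeta3 = 1.2020569
n_photons = 16 * math.pi * zeta3 * (k * T / (h * c)) ** 3   # photons per m^3
print(f"photon density ~ {n_photons:.2e} per m^3")

# Assumed average baryon density, ~0.25 per m^3 (illustrative round figure)
n_baryons = 0.25
print(f"photons per baryon ~ {n_photons / n_baryons:.1e}")
```

The peak frequency falls squarely in the microwave band (item 2), and the photon count per baryon comes out within an order of magnitude of the ~10^10 quoted in item 12.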

So there you have it. This is not a game of billiards where the cue ball (data) is carefully lined up so that Big Bang theory comes out looking inevitable. Any of these other theories have been repeatedly invited to take their best shot too, and the results are always the same. The proponents have to intervene to even get their theories to pony up a simple prediction for any of these cosmological data.

The biggest prediction of Big Bang cosmology lies in its very foundations. It is based on the inerrancy of general relativity and how this theory accounts for gravity under extreme conditions. Its basic predictions have been tested many times, and new exotic phenomena such as the Lense-Thirring effect and gravitational waves have been predicted by the theory and subsequently confirmed. It seems to be a flawless explanation for how gravity works, but if it is accurate, then we need lots of Dark Matter and Dark Energy in the universe in addition to the 5% of stars and gas that we can see. That is the big problem.

Dark matter is found not only on the cosmic scale but in regions as small as galaxies. In fact, it was discovered in galaxies long before WMAP made its first studies. Our own Milky Way seems to have six times more gravitating stuff orbiting its center than all the luminous matter and gas clouds we can detect. In fact, any large system of matter we have studied has this Dark Matter problem. Some physicists have interpreted this as an actual breakdown in General Relativity itself, but its proponents cannot find an extension or replacement for general relativity that makes Dark Matter go away. Meanwhile, physicists have not detected any of the particle candidates for Dark Matter at the Large Hadron Collider or other labs around the world.

So Dark Matter can be added to Big Bang cosmology, but we don’t yet know what kind of physical material it is, or whether there might be something subtly wrong with General Relativity itself.

Where in the universe did the Big Bang happen?

The Big Bang did not happen inside the 3-d space of our universe; at least, that’s what our best understanding of physics has been telling us over the last 100 years! That means it did not happen ‘over there’, a billion light years beyond Antares.

The only guide we have for answering this question is Big Bang cosmology and Einstein’s general theory of relativity. These make very specific predictions for what happened to space and time during the Big Bang.

The figure above (Credit: Martin Kornmesser, Luis Calcada, NASA, ESA/Hubble) shows the expansion of the universe predicted by general relativity. Note that the latitude and longitude positions of the ‘star’ galaxies remain the same, but the distance between them dilates as the radius of the sphere increases. This is a representation of how our universe is expanding. In this geometric analogy, the surface of the sphere represents the 3-d volume of our universe at a specific time since the Big Bang: the volume of space is increasing, but space isn’t being added in from somewhere else! This is a sphere in 4 dimensions with a 3-d surface, in geometric analogy to a 3-d basketball with a 2-d surface. Also, as you shrink the sphere’s radius, the volume of 3-d space decreases steadily until the radius approaches zero. You can do this with a mathematical ‘balloon’ but not a real one. At zero radius we also have the condition where the 3-d volume of space vanishes.

Now suppose that this spherical surface was filled with atoms. As you shrink the volume of space, the density of this matter increases steadily. When you get close to the time where the radius is zero, the average density of matter in the 3-d space of this sphere has grown enormously. When the radius becomes zero, the density becomes infinite and we have what physicists call a singularity.
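The balloon analogy can be sketched numerically. In this toy calculation (arbitrary units, with a 2-d spherical surface standing in for 3-d space), the angular ‘coordinates’ of two galaxies never change, yet their physical separation scales with the radius, and a fixed amount of matter becomes denser without limit as the radius shrinks toward zero:

```python
# Toy model of the balloon analogy: comoving (angular) coordinates stay
# fixed while physical separations scale with the radius.
import math

theta = 0.5    # angular separation of two 'galaxies', radians (never changes)
mass = 1.0     # total matter spread over the sphere, arbitrary units

for radius in [4.0, 2.0, 1.0, 0.5, 0.01]:
    separation = radius * theta                  # physical distance scales with radius
    density = mass / (4 * math.pi * radius**2)   # surface density ~ 1/R^2 in the 2-d analogy
    print(f"R={radius:5.2f}  separation={separation:6.3f}  density={density:10.3f}")

# As R -> 0 the density diverges: the 2-d analogue of the Big Bang singularity.
```

On the 2-d surface the density grows as 1/R^2; for the real 3-d volume the analogous growth is 1/R^3, diverging at R = 0, which is the singularity.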

So, the best non-mathematical description that any cosmologist can create for describing the Big Bang is that it occurred in every cubic centimeter of space in the universe, with no unique starting point. In fact, it was an event which, our mathematics indicates, actually brought space and time into existence. It did not occur IN space at a particular location, because it created space (and time itself) as it went along. There may have existed some state ‘prior’ to the Big Bang, but it is a state not described by its location in time or space. This state preceded the existence of our time and space.

Einstein’s Fudge

Einstein’s Cosmic Fudge Factor

Written by Sten Odenwald
Copyright (C) 1991. Sky Publishing Corporation. Reprinted by permission. See April, 1991 issue

Black holes…quarks…dark matter. It seems like the cosmos gets a little stranger every year. Until recently, the astronomical universe known to humans was populated by planets, stars, galaxies, and scattered nebulae of dust and gas. Now, theorists tell us it may also be inhabited by objects such as superstrings, dark matter and massive neutrinos — objects that have yet to be discovered if they exist at all!
As bizarre as these new constituents may sound, you don’t have to be a rocket scientist to appreciate the most mysterious ingredient of them all. It is the inky blackness of space itself that commands our attention as we look at the night sky; not the sparse points of light that signal the presence of widely scattered matter.

During the last few decades, physicists and astronomers have begun to recognize that the notion of empty space presents greater subtleties than had ever before been considered. Space is not merely a passive vessel to be filled by matter and radiation, but is a dynamic, physical entity in its own right.

One chapter in the story of our new conception of space begins with a famous theoretical mistake made nearly 75 years ago that now seems to have taken on a life of its own.

In 1917, Albert Einstein tried to use his newly developed theory of general relativity to describe the shape and evolution of the universe. The prevailing idea at the time was that the universe was static and unchanging. Einstein had fully expected general relativity to support this view, but, surprisingly, it did not. The inexorable force of gravity pulling on every speck of matter demanded that the universe collapse under its own weight.

His remedy for this dilemma was to add a new ‘antigravity’ term to his original equations. It enabled his mathematical universe to appear as permanent and invariable as the real one. This term, usually written as an uppercase Greek lambda, is called the ‘cosmological constant’. It has exactly the same value everywhere in the universe, delicately chosen to offset the tendency toward gravitational collapse at every point in space.

A simple thought experiment may help illustrate the nature of Lambda. Take a cubic meter of space and remove all matter and radiation from it. Most of us would agree that this is a perfect vacuum. But, like a ghost in the night, the cosmological constant would still be there. So, empty space is not really empty at all — Lambda gives it a peculiar ‘latent energy’. In other words, even Nothing is Something!

Einstein’s fudged solution remained unchallenged until 1922 when the Russian mathematician Alexander Friedmann began producing compelling cosmological models based on Einstein’s equations but without the extra quantity. Soon thereafter, theorists closely examining Einstein’s model discovered that, like a pencil balanced on its point, it was unstable to collapse or expansion. Later the same decade, Mount Wilson astronomer Edwin P. Hubble found direct observational evidence that the universe is not static, but expanding.

All this meant that the motivation for introducing the cosmological constant seemed contrived. Admitting his blunder, Einstein retracted Lambda in 1932. At first this seemed to end the debate about its existence. Yet decades later, despite the great physicist’s disavowal, Lambda keeps turning up in cosmologists’ discussions about the origin, evolution, and fate of the universe.

THEORY MEETS OBSERVATION

Friedmann’s standard ‘Big Bang’ model without a cosmological constant predicts that the age of the universe, t0, and its expansion rate (represented by the Hubble parameter, H0) are related by the equation t0 = 2/(3 H0). Some astronomers favor a value of H0 near 50 kilometers per second per megaparsec (one megaparsec equals 3.26 million light years). But the weight of the observational evidence seems to be tipping the balance towards a value near 100. In the Friedmann model, this implies that the cosmos can be no more than 7 billion years old. Yet some of our galaxy’s globular clusters have ages estimated by independent methods of between 12 and 18 billion years!
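The age-expansion relation is easy to evaluate directly. This sketch converts H0 from its astronomical units into 1/seconds and applies t0 = 2/(3 H0), reproducing the discrepancy described above:

```python
# Sketch: age of a matter-dominated Friedmann universe, t0 = 2 / (3 H0).
SEC_PER_GYR = 3.156e16      # seconds in a billion years
KM_PER_MPC = 3.086e19       # kilometers in one megaparsec

def friedmann_age_gyr(H0_km_s_mpc):
    H0 = H0_km_s_mpc / KM_PER_MPC   # convert to 1/s
    t0 = 2.0 / (3.0 * H0)           # age in seconds
    return t0 / SEC_PER_GYR

print(friedmann_age_gyr(50))    # H0 = 50: age ~ 13 billion years
print(friedmann_age_gyr(100))   # H0 = 100: age ~ 6.5 billion years
```

With H0 near 50 the Friedmann age is about 13 billion years, but with H0 near 100 it drops to about 6.5 billion years, younger than the globular clusters.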

In what’s called the Einstein-DeSitter cosmology, the Lambda term helps to resolve this discrepancy. Now a large value for the Hubble parameter can be attributed in part to “cosmic repulsion”. This changes the relationship between t0 and H0, so that for a given expansion rate, the universe is older than predicted by the Friedmann model.

In one formulation of Einstein’s equation, Lambda is expressed in units of matter density. This means we can ask how the cosmological constant, if it exists at all, compares with the density of the universe in the forms of stars and galaxies.

So far, a careful look at the available astronomical data has produced only upper limits to the magnitude of Lambda. These vary over a considerable range – from about 10 percent of ordinary matter density to several times that density.

The cosmological constant can also leave its mark on the properties of gravitational lenses and faint galaxies. One of the remarkable features of Einstein’s theory of general relativity is its prediction that space and time become deformed or ‘warped’ in the vicinity of a massive body such as a planet, star or even a galaxy. Light rays passing through such regions of warped “space-time” have their paths altered. In the cosmological arena, nearby galaxies can deflect and distort the images of more distant galaxies behind them. Sometimes, the images of these distant galaxies can appear as multiple images surrounding the nearby ‘lensing’ galaxy.

At Kyoto University, M. Fukugita and his coworkers predicted that more faint galaxies and gravitational lenses will be detected than in a Friedmann universe if Lambda is more than a few times the matter density. Edwin Turner, an astrophysicist at Princeton University, also reviewed the existing, scant data on gravitational lenses and found that they were only as numerous as expected for Lambda less than a few times the matter density. By the best astronomical reckoning, Lambda is probably not larger than the observed average matter density of the universe. For that matter, no convincing evidence is available to suggest that Lambda is not exactly equal to zero. So why not just dismiss it as an unnecessary complication? Because the cosmological constant is no longer, strictly, a construct of theoretical cosmology.

NOTHING AND EVERYTHING

To understand how our universe came into existence, and how its various ingredients have evolved, we must delve deeply into the fundamental constituents of matter and the forces that dictate how it will interact. This means that the questions we will have to ask will have more to do with physics than astronomy. Soon after the big bang, the universe was at such a high temperature and density that only the details of matter’s composition (quarks, electrons etc) and how they interact via the four fundamental forces of nature were important. They represented the most complex collections of matter in existence, long before atoms, planets, stars and galaxies had arrived on the scene.

For two decades now, physicists have been attempting to unify the forces and particles that make up our world – to find a common mathematical description that encompasses them all. Some think that such a Theory of Everything is just within reach. It would account not only for the known forms of matter, but also for the fundamental interactions among them: gravity, electromagnetism, and the strong and weak nuclear forces.

These unification theories are known by a variety of names: grand unification theory, supersymmetry theory and superstring theory. Their basic claim is that Nature operates according to a small set of simple rules called symmetries.

The concept of symmetry is at least as old as the civilization of ancient Greece, whose art and architecture are masterworks of simplicity and balance. Geometers have known for a long time that a simple cube can be rotated 90 degrees without changing its outward appearance. In two dimensions, equilateral triangles look the same when they are rotated by 120 degrees. These are examples of the geometric concept of Rotation Symmetry.

There are parallels to geometric symmetry in the way that various physical phenomena and qualities of matter express themselves as well. For example, the well-known principle of the Conservation of Energy is a consequence of the fact that when some collections of matter and energy are examined at different times, they each have precisely the same total energy, just as a cube looks the same when it is rotated in space by a prescribed amount. Symmetry under a ‘shift in time’ is as closely related to the Conservation of Energy as is the symmetry of a cube when rotated by 90 degrees.

Among other things, symmetries of Nature dictate the strengths and ranges of the natural forces and the properties of the particles they act upon. Although Nature’s symmetries are hidden in today’s cold world, they reveal themselves at very high temperatures and can be studied in modern particle accelerators.

The real goal in unification theory is actually two-fold: not only to uncover and describe the underlying symmetries of the world, but to find physical mechanisms for ‘breaking’ them at low energy. After all, we live in a complex world filled with a diversity of particles and forces, not a bland world with one kind of force and one kind of particle!

Theoreticians working on this problem are often forced to add terms to their equations that represent entirely new fields in Nature. The concept of a field was invented by mathematicians to express how a particular quantity may vary from point to point in space. Physicists since the 18th century have adopted this idea to describe quantitatively how forces such as gravity and magnetism change at different distances from a body.

The interactions of these fields with quarks, electrons and other particles cause symmetries to break down. These fields are usually very different from those we already know about. The much sought-after Higgs field, for example, was incorporated by Sheldon Glashow, Abdus Salam and Steven Weinberg into their unified theory of the electromagnetic and weak nuclear forces.

Prior to their work, the weak force, which causes certain particles to decay, and the electromagnetic force, responsible for the attraction between charged particles and the motion of compass needles, were considered to be distinct forces in nature. By combining their mathematical descriptions into a common language, they showed that this distinction is not fundamental to the forces at all! A new field in nature, called the Higgs field, makes these two forces act differently at low temperature. But at temperatures above 1000 trillion degrees, the weak and electromagnetic forces become virtually identical in the way that they affect matter. The corresponding particle, called the Higgs boson, not only causes the symmetry between the electromagnetic and weak forces to be broken at low temperature, but is also responsible for conferring the property of mass on particles such as the electrons and the quarks!

There is, however, a price that must be paid for introducing new fields into the mathematical machinery. Not only do they break symmetries, but they can also give the vacuum state an enormous latent energy that, curiously, behaves just like Lambda in cosmological models.

The embarrassment of having to resurrect the obsolete quantity Lambda is compounded when unification theories are used to predict its value. Instead of being at best a vanishingly minor ingredient of the universe, the predicted values are in some instances 10 to the power of 120 times greater than even the most generous astronomical upper limits!
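The size of that mismatch can be reproduced with round numbers. In this sketch, the ‘predicted’ value is taken to be the Planck-scale energy density built from fundamental constants, and the astronomical bound is taken to be of order the critical density (both are order-of-magnitude stand-ins, not precise figures):

```python
# Sketch of the '10^120 problem' with round, illustrative numbers.
import math

hbar = 1.055e-34    # reduced Planck constant, J s
G = 6.674e-11       # gravitational constant, m^3 / (kg s^2)
c = 2.998e8         # speed of light, m/s

# Planck-scale vacuum energy density ~ c^7 / (hbar * G^2), in J/m^3
rho_planck = c**7 / (hbar * G**2)

# Cosmological bound: critical mass density ~1e-26 kg/m^3 (assumed round
# figure) converted to an energy density via E = m c^2
rho_cosmo = 1e-26 * c**2

ratio = rho_planck / rho_cosmo
print(f"discrepancy ~ 10^{math.log10(ratio):.0f}")
```

Depending on the exact conventions the ratio comes out near 10^120 to 10^123, which is why the cosmological constant problem is often called the worst prediction in physics.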

It is an unpleasant fact of life for physicists that the best candidates for the Theory of Everything always have to be fine-tuned to get rid of their undesirable cosmological consequences. Without proper adjustment, these candidates may give correct predictions in the microscopic world of particle physics, but predict a universe which on its largest scales looks very different from the one we inhabit.

Like a messenger from the depths of time, the smallness – or absence – of the cosmological constant today is telling us something important about how to craft a correct Theory of Everything. It is a signpost of the way Nature’s symmetries are broken at low energy, and a nagging reminder that our understanding of the physical world is still incomplete in some fundamental way.

A LIKELY STORY

Most physicists expect the Theory of Everything will describe gravity the same way we now describe matter and the strong, weak and electromagnetic forces – in the language of quantum mechanics. Gravity is, after all, just another force in Nature. So far this has proven elusive, due in part to the sheer complexity of the equations of general relativity. Scientists since Einstein have described gravity (as well as space and time) in purely geometric terms. Thus we speak of gravity as the “curvature of space-time”.

To achieve complete unification, the dialects of quantum matter and geometric space have to be combined into a single language. Matter appears to be rather precisely described in terms of the language of quantum mechanics. Quarks and electrons exchange force-carrying particles such as photons and gluons and thereby feel the electromagnetic and strong nuclear forces. But gravity is described by Einstein’s theory of general relativity as a purely geometric phenomenon. These geometric ideas of curvature and the dimensionality of space have nothing to do with quantum mechanics.

To unify these two great foundations of physics, a common language must be found. This new language will take some getting used to. In it, the distinction between matter and space dissolves away and is lost completely; matter becomes a geometric phenomenon, and at the same time, space becomes an exotic form of matter.

Beginning with work on a quantum theory of gravity by John Wheeler and Bryce DeWitt in the 1960s, and continuing with the so-called superstring theory of John Schwarz and Michael Green in the 1980s, a primitive version of such a ‘quantum-geometric’ language is emerging. Not surprisingly, it borrows many ideas from ordinary quantum mechanics.

A basic concept in quantum mechanics is that every system of elementary particles is defined by a mathematical quantity called a wave function. This function can be used, for example, to predict the probability of finding an electron at a particular place and time within an atom. Rather than a single quantity, the wave function is actually a sum over an infinite number of factors or ‘states’, each representing a possible measurement outcome. Only one of these states can be observed at a time.

By direct analogy, in quantum gravitation, the geometry of space-time, whether flat or curved, is only one of an infinite variety of geometric shapes for space-time, and therefore the universe. All of these possibilities are described as separate states in the wave function for the universe.

But what determines the probability that the universe will have the particular geometry we now observe out of the infinitude of others? In quantum mechanics, the likelihood that an electron is located somewhere within an atom is determined by the external electric field acting on it. That field is usually provided by the protons in the atomic nucleus. Could there be some mysterious field ‘outside’ our universe that determines its probability?

According to Cambridge University theorist Stephen Hawking, this is the wrong way to look at the problem. Unlike the electron acted upon by protons, our universe is completely self-contained. It requires no outside conditions or fields to help define its probability. The likelihood that our universe looks the way it does depends only on the strengths of the fields within it.

Among these internal fields, there may even be ones that we haven’t yet discovered. Could the cosmological constant be the fingerprint in our universe of a new ‘hidden’ field in Nature? This new field could affect the likelihood of our universe just as a kettle of soup may contain unknown ingredients although we can still precisely determine the kettle’s mass.

A series of mathematical considerations led Hawking to deduce that the weaker the hidden field becomes, the smaller will be the value we observe for the cosmological constant, and surprisingly, the more likely will be the current geometry of the universe.

This, in turn, implies that if Lambda were big enough for astronomers to measure in the first place, our universe would be an improbable one. Philosophically, this may not trouble those who see our cosmos as absolutely unique, but in a world seemingly ruled by probability, a counter view is also possible. There may, in fact, exist an infinite number of universes, but only a minority of them would have the correct blend of physical laws and physical conditions to resemble our life-nurturing one.

Hawking continued his line of speculation by suggesting that, if at the so-called Planck scale of 10 to the power of -33 centimeters the cosmos could be thought of as an effervescent landscape, or “space-time foam”, then perhaps a natural mechanism could exist for eliminating the cosmological constant for good.

One of the curiosities of combining the speed of light and Newton’s constant of gravitation from general relativity, with Planck’s constant from quantum mechanics, is that they can be made to define unique values for length, time and energy. Physicists believe that at these Planck scales represented by 10 to the power of -33 centimeters and 10 to the power of -43 seconds, general relativity and quantum mechanics blend together to become a single, comprehensive theory of the physical world: The Theory Of Everything. The energy associated with this unification, 10 to the power of 19 billion electron volts, is almost unimaginably big by the standards of modern technology.
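The dimensional argument above can be carried out directly. There is exactly one way to combine c, G and Planck’s constant into a length, a time and an energy, and the sketch below simply evaluates those combinations; the numerical constants are the standard SI values.

```python
import math

# Fundamental constants (SI units)
c    = 2.998e8        # speed of light, m/s
G    = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0545718e-34  # reduced Planck constant, J*s

# The unique length, time and energy built from c, G and hbar
l_p = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m (1.6e-33 cm)
t_p = l_p / c                      # Planck time,   ~5.4e-44 s
E_p = math.sqrt(hbar * c**5 / G)   # Planck energy, ~2e9 J

E_p_gev = E_p / 1.602e-10          # convert joules to GeV (1 GeV = 1.602e-10 J)
print(f"Planck length: {l_p:.2e} m")
print(f"Planck time:   {t_p:.2e} s")
print(f"Planck energy: {E_p_gev:.2e} GeV")
```

The energy works out to about 1.2 x 10 to the power of 19 GeV – the “10 to the power of 19 billion electron volts” quoted above – some fifteen orders of magnitude beyond the reach of any particle accelerator.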

The universe itself, soon after the Big Bang, must also have passed through such scales of space, time and energy during its first instants of existence. Cosmologists refer to this period as the Planck Era. It marks the earliest times that physicists are able to explore the universe’s physical state without having a complete Theory of Everything to guide them.

WORMHOLES

Harvard University physicist Sidney Coleman has recently pursued this thought to a possible conclusion. Instead of some mysterious new field in Nature, maybe the Lambda term appears in our theories because we are using the wrong starting model for the geometry of space at the Planck scale.

Previous thinking on the structure of space-time had assumed that it behaved in some sense like a smooth rubber sheet. Under the action of matter and energy, space-time could be deformed into a variety of shapes, each a possible geometric state for the universe. Nearly all candidates for the Theory of Everything embed their fields and symmetries in such a smooth geometrical arena.

But what if space-time were far more complicated? One possibility is that ‘wormholes’ exist, filling space-time with a network of tunnels. The fabric of space-time may have more in common with a piece of Swiss cheese than with a smooth rubber sheet.

According to Coleman, the addition of wormholes to space-time means that, like the ripples from many stones tossed into a pond, one geometric state for the universe could interfere with another. The most likely states (or the biggest ripples) would win out. The mathematics suggests that quantum wormhole interference at the Planck scale makes universes with cosmological constants other than zero exceedingly unlikely.

How big would wormholes have to be to have such dramatic repercussions? Surprisingly, the calculations suggest that small is beautiful. Wormholes the size of dogs and planets would be very rare. Universes containing even a few of them would exist with a vanishingly low probability. But wormholes smaller than 10 to the power of -33 centimeters could be everywhere. A volume the size of a sugar cube might be teeming with uncounted trillions of them flashing in and out of existence!

Coleman proposes that it is the action of these previously ignored mini-wormholes upon the geometric fabric of the universe that forces Lambda to be almost exactly zero. Like quantum ‘Pac-Men’, they gobble up all the latent energy of space-time that would otherwise have appeared to us in the form of a measurable cosmological constant!

The addition of wormholes to the description of space-time admits the possibility that our universe did not spring into being aloof and independent, but was influenced by how other space-times had already evolved – ghostly mathematical universes with which we can never communicate directly.

The most likely of these universes had Lambda near zero, and it is these states that beat out all other contenders. In a bizarre form of quantum democracy, our universe may have been forced to follow the majority, evolving into the high probability state we now observe, without a detectable cosmological constant.

EPILOG

Wormholes? Wave functions? Hidden fields? The answer to the cosmological constant’s smallness, or absence, seems to recede into the farthest reaches of abstract thinking, faster than most of us can catch up.

As ingenious as these new ideas may seem, the final pages in this unusual story have probably not been written, especially since we can’t put any of these ideas to a direct test. It is a tribute to Einstein’s genius that even his ‘biggest blunder’ made near the beginning of this century still plagues physicists and astronomers as we prepare to enter the 21st century. Who would ever have thought that something that may not even exist would lead to such enormous problems!