
Einstein’s Fudge

Einstein’s Cosmic Fudge Factor

Written by Sten Odenwald
Copyright (C) 1991 Sky Publishing Corporation. Reprinted by permission. See the April 1991 issue.

Black holes…quarks…dark matter. It seems like the cosmos gets a little stranger every year. Until recently, the astronomical universe known to humans was populated by planets, stars, galaxies, and scattered nebulae of dust and gas. Now, theorists tell us it may also be inhabited by objects such as superstrings, dark matter and massive neutrinos — objects that have yet to be discovered if they exist at all!
As bizarre as these new constituents may sound, you don’t have to be a rocket scientist to appreciate the most mysterious ingredient of them all. It is the inky blackness of space itself that commands our attention as we look at the night sky, not the sparse points of light that signal the presence of widely scattered matter.

During the last few decades, physicists and astronomers have begun to recognize that the notion of empty space presents greater subtleties than had ever before been considered. Space is not merely a passive vessel to be filled by matter and radiation, but is a dynamic, physical entity in its own right.

One chapter in the story of our new conception of space begins with a famous theoretical mistake made nearly 75 years ago that now seems to have taken on a life of its own.

In 1917, Albert Einstein tried to use his newly developed theory of general relativity to describe the shape and evolution of the universe. The prevailing idea at the time was that the universe was static and unchanging. Einstein had fully expected general relativity to support this view, but, surprisingly, it did not. The inexorable force of gravity pulling on every speck of matter demanded that the universe collapse under its own weight.

His remedy for this dilemma was to add a new ‘antigravity’ term to his original equations. It enabled his mathematical universe to appear as permanent and invariable as the real one. This term, usually written as an uppercase Greek lambda, is called the ‘cosmological constant’. It has exactly the same value everywhere in the universe, delicately chosen to offset the tendency toward gravitational collapse at every point in space.

A simple thought experiment may help illustrate the nature of Lambda. Take a cubic meter of space and remove all matter and radiation from it. Most of us would agree that this is a perfect vacuum. But, like a ghost in the night, the cosmological constant would still be there. So, empty space is not really empty at all — Lambda gives it a peculiar ‘latent energy’. In other words, even Nothing is Something!

Einstein’s fudged solution remained unchallenged until 1922 when the Russian mathematician Alexander Friedmann began producing compelling cosmological models based on Einstein’s equations but without the extra quantity. Soon thereafter, theorists closely examining Einstein’s model discovered that, like a pencil balanced on its point, it was unstable to collapse or expansion. Later the same decade, Mount Wilson astronomer Edwin P. Hubble found direct observational evidence that the universe is not static, but expanding.

All this meant that the motivation for introducing the cosmological constant seemed contrived. Admitting his blunder, Einstein retracted Lambda in 1932. At first this seemed to end the debate about its existence. Yet decades later, despite the great physicist’s disavowal, Lambda keeps turning up in cosmologists’ discussions about the origin, evolution, and fate of the universe.

THEORY MEETS OBSERVATION

Friedmann’s standard ‘Big Bang’ model without a cosmological constant predicts that the age of the universe, t0, and its expansion rate (represented by the Hubble parameter, H0) are related by the equation t0 = 2/(3H0). Some astronomers favor a value of H0 near 50 kilometers per second per megaparsec (one megaparsec equals 3.26 million light years). But the weight of the observational evidence seems to be tipping the balance towards a value near 100. In the Friedmann model, this implies that the cosmos can be no more than 7 billion years old. Yet some of our galaxy’s globular clusters have ages, estimated by independent methods, of between 12 and 18 billion years!
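
For readers who want to check the arithmetic behind that 7-billion-year figure, here is a rough sketch (a worked example added here, using the higher value of the Hubble parameter quoted above and the conversion 1 megaparsec ≈ 3.1 x 10^19 kilometers):

\[
\frac{1}{H_0} = \frac{3.1\times 10^{19}\ \mathrm{km}}{100\ \mathrm{km\,s^{-1}}}
\approx 3.1\times 10^{17}\ \mathrm{s} \approx 9.8\ \mathrm{billion\ years},
\qquad
t_0 = \frac{2}{3H_0} \approx 6.5\ \mathrm{billion\ years}.
\]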

Restoring the Lambda term to the equations helps to resolve this discrepancy. Now a large value for the Hubble parameter can be attributed in part to “cosmic repulsion”. This changes the relationship between t0 and H0, so that for a given expansion rate the universe is older than the Friedmann model predicts.

In one formulation of Einstein’s equation, Lambda is expressed in units of matter density. This means we can ask how the cosmological constant, if it exists at all, compares with the density of the universe in the forms of stars and galaxies.
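
The standard conversion (a textbook relation, not spelled out in the article) treats Lambda as an equivalent mass density,

\[
\rho_\Lambda = \frac{\Lambda c^2}{8\pi G},
\]

which can then be compared directly with the average density of matter in stars and galaxies.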

So far, a careful look at the available astronomical data has produced only upper limits to the magnitude of Lambda. These vary over a considerable range – from about 10 percent of ordinary matter density to several times that density.

The cosmological constant can also leave its mark on the properties of gravitational lenses and faint galaxies. One of the remarkable features of Einstein’s theory of general relativity is its prediction that space and time become deformed or ‘warped’ in the vicinity of a massive body such as a planet, star or even a galaxy. Light rays passing through such regions of warped “space-time” have their paths altered. In the cosmological arena, nearby galaxies can deflect and distort the images of more distant galaxies behind them. Sometimes a single distant galaxy appears as several separate images surrounding the nearby ‘lensing’ galaxy.
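
The characteristic scale of this image splitting is set by a standard result of lensing theory (not derived in the article): a lens of mass M produces images separated by roughly twice the Einstein radius,

\[
\theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{ls}}{D_l D_s}},
\]

where D_l, D_s and D_ls are the distances to the lens, to the source, and between them. Cosmology enters through these distances, which is why the statistics of lenses are sensitive to Lambda.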

At Kyoto University, M. Fukugita and his coworkers predicted that more faint galaxies and gravitational lenses will be detected than in a Friedmann universe if Lambda is more than a few times the matter density. Edwin Turner, an astrophysicist at Princeton University, also reviewed the existing, scant data on gravitational lenses and found that they were about as numerous as expected if Lambda is less than a few times the matter density. By the best astronomical reckoning, Lambda is probably not larger than the observed average matter density of the universe. For that matter, no convincing evidence is available to suggest that Lambda is not exactly equal to zero. So why not just dismiss it as an unnecessary complication? Because the cosmological constant is no longer, strictly, a construct of theoretical cosmology.

NOTHING AND EVERYTHING

To understand how our universe came into existence, and how its various ingredients have evolved, we must delve deeply into the fundamental constituents of matter and the forces that dictate how they interact. This means that the questions we will have to ask will have more to do with physics than astronomy. Soon after the big bang, the universe was at such a high temperature and density that only the details of matter’s composition (quarks, electrons, etc.) and how these particles interact via the four fundamental forces of nature were important. They represented the most complex collections of matter in existence, long before atoms, planets, stars and galaxies had arrived on the scene.

For two decades now, physicists have been attempting to unify the forces and particles that make up our world – to find a common mathematical description that encompasses them all. Some think that such a Theory of Everything is just within reach. It would account not only for the known forms of matter, but also for the fundamental interactions among them: gravity, electromagnetism, and the strong and weak nuclear forces.

These unification theories are known by a variety of names: grand unification theory, supersymmetry theory and superstring theory. Their basic claim is that Nature operates according to a small set of simple rules called symmetries.

The concept of symmetry is at least as old as the civilization of ancient Greece, whose art and architecture are masterworks of simplicity and balance. Geometers have known for a long time that a simple cube can be rotated 90 degrees without changing its outward appearance. In two dimensions, equilateral triangles look the same when they are rotated by 120 degrees. These are examples of the geometric concept of rotational symmetry.

There are parallels to geometric symmetry in the way that various physical phenomena and qualities of matter express themselves as well. For example, the well-known principle of the Conservation of Energy is a consequence of the fact that when a collection of matter and energy is examined at different times, it has precisely the same total energy, just as a cube looks the same when it is rotated in space by a prescribed amount. Symmetry under a ‘shift in time’ is related to the Conservation of Energy in much the same way that symmetry under a 90-degree rotation is related to the cube.
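
The formal statement of this connection is Noether’s theorem, which the article does not name: every continuous symmetry of the laws of physics implies a conserved quantity. For the time-shift symmetry it reads, schematically,

\[
\frac{\partial L}{\partial t} = 0 \;\Longrightarrow\; \frac{dE}{dt} = 0,
\]

where L is the Lagrangian describing the system and E is its total energy.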

Among other things, symmetries of Nature dictate the strengths and ranges of the natural forces and the properties of the particles they act upon. Although Nature’s symmetries are hidden in today’s cold world, they reveal themselves at very high temperatures and can be studied in modern particle accelerators.

The real goal in unification theory is actually two-fold: not only to uncover and describe the underlying symmetries of the world, but to find physical mechanisms for ‘breaking’ them at low energy. After all, we live in a complex world filled with a diversity of particles and forces, not a bland world with one kind of force and one kind of particle!

Theoreticians working on this problem are often forced to add terms to their equations that represent entirely new fields in Nature. The concept of a field was invented by mathematicians to express how a particular quantity may vary from point to point in space. Physicists since the 18th century have adopted this idea to describe quantitatively how forces such as gravity and magnetism change at different distances from a body.

The interactions of these fields with quarks, electrons and other particles cause symmetries to break down. These fields are usually very different from those we already know about. The much sought-after Higgs field, for example, plays a central role in the unified theory of the electromagnetic and weak nuclear forces developed by Sheldon Glashow, Abdus Salam and Steven Weinberg.

Prior to their work, the weak force, which causes certain particles to decay, and the electromagnetic force, responsible for the attraction between charged particles and the motion of compass needles, were both considered to be distinct forces in nature. By combining their mathematical descriptions into a common language, they showed that this distinction was not fundamental to the forces at all! A new field in nature called the Higgs field makes these two forces act differently at low temperature. But at temperatures above 1000 trillion degrees, the weak and electromagnetic forces become virtually identical in the way that they affect matter. The corresponding particle, the Higgs boson, not only causes the symmetry between the electromagnetic and weak forces to be broken at low temperature, but it is also responsible for conferring the property of mass on particles such as the electrons and the quarks!
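
The temperature quoted here can be translated into the energy units particle physicists prefer by using Boltzmann’s constant; 1000 trillion degrees is about 10^15 kelvins, so

\[
k_B T \approx \left(8.6\times 10^{-5}\ \mathrm{eV/K}\right)\times 10^{15}\ \mathrm{K}
\approx 10^{11}\ \mathrm{eV} \approx 100\ \mathrm{GeV},
\]

which is roughly the mass-energy of the W and Z particles that carry the weak force.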

There is, however, a price that must be paid for introducing new fields into the mathematical machinery. Not only do they break symmetries, but they can also give the vacuum state an enormous latent energy that, curiously, behaves just like Lambda in cosmological models.

The embarrassment of having to resurrect the obsolete quantity Lambda is compounded when unification theories are used to predict its value. Instead of being at best a vanishingly minor ingredient of the universe, Lambda is predicted in some instances to be 10 to the power of 120 times greater than even the most generous astronomical upper limits!
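
A back-of-the-envelope version of that mismatch (a standard order-of-magnitude estimate, not a calculation from the article) compares the vacuum density suggested by Planck-scale physics with the density astronomers actually measure:

\[
\rho_{\rm Planck} \sim \frac{m_P}{l_P^{\,3}}
\approx \frac{2\times 10^{-5}\ \mathrm{g}}{\left(1.6\times 10^{-33}\ \mathrm{cm}\right)^3}
\approx 5\times 10^{93}\ \mathrm{g/cm^3},
\qquad
\rho_{\rm observed} \sim 10^{-29}\ \mathrm{g/cm^3},
\]

a ratio of roughly 120 orders of magnitude.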

It is an unpleasant fact of life for physicists that the best candidates for the Theory of Everything always have to be fine-tuned to get rid of their undesirable cosmological consequences. Without proper adjustment, these candidates may give correct predictions in the microscopic world of particle physics, but predict a universe which on its largest scales looks very different from the one we inhabit.

Like a messenger from the depths of time, the smallness – or absence – of the cosmological constant today is telling us something important about how to craft a correct Theory of Everything. It is a signpost of the way Nature’s symmetries are broken at low energy, and a nagging reminder that our understanding of the physical world is still incomplete in some fundamental way.

A LIKELY STORY

Most physicists expect the Theory of Everything will describe gravity the same way we now describe matter and the strong, weak and electromagnetic forces – in the language of quantum mechanics. Gravity is, after all, just another force in Nature. So far this has proven elusive, due in part to the sheer complexity of the equations of general relativity. Scientists since Einstein have described gravity (as well as space and time) in purely geometric terms. Thus we speak of gravity as the “curvature of space-time”.

To achieve complete unification, the dialects of quantum matter and geometric space have to be combined into a single language. Matter appears to be rather precisely described in terms of the language of quantum mechanics. Quarks and electrons exchange force-carrying particles such as photons and gluons and thereby feel the electromagnetic and strong nuclear forces. But gravity is described by Einstein’s theory of general relativity as a purely geometric phenomenon. These geometric ideas of curvature and the dimensionality of space have nothing to do with quantum mechanics.

To unify these two great foundations of physics, a common language must be found. This new language will take some getting used to. In it, the distinction between matter and space dissolves away and is lost completely; matter becomes a geometric phenomenon, and at the same time, space becomes an exotic form of matter.

Beginning with work on a quantum theory of gravity by John Wheeler and Bryce DeWitt in the 1960s, and continuing with the so-called superstring theory of John Schwarz and Michael Green in the 1980s, a primitive version of such a ‘quantum-geometric’ language is emerging. Not surprisingly, it borrows many ideas from ordinary quantum mechanics.

A basic concept in quantum mechanics is that every system of elementary particles is defined by a mathematical quantity called a wave function. This function can be used, for example, to predict the probability of finding an electron at a particular place and time within an atom. Rather than a single quantity, the wave function is actually a sum over an infinite number of factors or ‘states’, each representing a possible measurement outcome. Only one of these states can be observed at a time.
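
In the standard notation of quantum mechanics, which the article describes only in words, this superposition is written

\[
\Psi = \sum_i c_i\,\psi_i, \qquad P_i = |c_i|^2,
\]

where each psi_i is one possible state and |c_i|^2 gives the probability that a measurement will find the system in that particular state.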

By direct analogy, in quantum gravitation, the geometry of space-time, whether flat or curved, is only one of an infinite variety of geometric shapes for space-time, and therefore the universe. All of these possibilities are described as separate states in the wave function for the universe.

But what determines the probability that the universe will have the particular geometry we now observe out of the infinitude of others? In quantum mechanics, the likelihood that an electron is located somewhere within an atom is determined by the external electric field acting on it. That field is usually provided by the protons in the atomic nucleus. Could there be some mysterious field ‘outside’ our universe that determines its probability?

According to Cambridge University theorist Stephen Hawking, this is the wrong way to look at the problem. Unlike the electron acted upon by protons, our universe is completely self-contained. It requires no outside conditions or fields to help define its probability. The likelihood that our universe looks the way it does depends only on the strengths of the fields within it.

Among these internal fields, there may even be ones that we haven’t yet discovered. Could the cosmological constant be the fingerprint in our universe of a new ‘hidden’ field in Nature? This new field could affect the likelihood of our universe just as a kettle of soup may contain unknown ingredients although we can still precisely determine the kettle’s mass.

A series of mathematical considerations led Hawking to deduce that the weaker the hidden field becomes, the smaller will be the value we observe for the cosmological constant, and surprisingly, the more likely will be the current geometry of the universe.

This, in turn, implies that if Lambda were big enough for astronomers to measure in the first place, our universe would be an improbable one. Philosophically, this may not trouble those who see our cosmos as absolutely unique, but in a world seemingly ruled by probability, a counter view is also possible. There may, in fact, exist an infinite number of universes, but only a minority of them have the correct blend of physical laws and physical conditions resembling our life-nurturing one.

Hawking continued his line of speculation by suggesting that, if at the so-called Planck scale of 10 to the power of -33 centimeters the cosmos could be thought of as an effervescent landscape, or “space-time foam”, then perhaps a natural mechanism could exist for eliminating the cosmological constant for good.

One of the curiosities of combining the speed of light and Newton’s constant of gravitation from general relativity, with Planck’s constant from quantum mechanics, is that they can be made to define unique values for length, time and energy. Physicists believe that at these Planck scales represented by 10 to the power of -33 centimeters and 10 to the power of -43 seconds, general relativity and quantum mechanics blend together to become a single, comprehensive theory of the physical world: The Theory Of Everything. The energy associated with this unification, 10 to the power of 19 billion electron volts, is almost unimaginably big by the standards of modern technology.
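
For reference, the three combinations alluded to here are the Planck length, time and energy; with the standard values of the constants they work out to roughly

\[
l_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-33}\ \mathrm{cm},\qquad
t_P = \sqrt{\frac{\hbar G}{c^5}} \approx 5\times 10^{-44}\ \mathrm{s},\qquad
E_P = \sqrt{\frac{\hbar c^5}{G}} \approx 1.2\times 10^{19}\ \mathrm{GeV},
\]

in agreement with the figures quoted in the text.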

The universe itself, soon after the Big Bang, must also have passed through such scales of space, time and energy during its first instants of existence. Cosmologists refer to this period as the Planck Era. It marks the earliest times that physicists are able to explore the universe’s physical state without having a complete Theory of Everything to guide them.

WORMHOLES

Harvard University physicist Sidney Coleman has recently pursued this thought to a possible conclusion. Instead of some mysterious new field in Nature, maybe the Lambda term appears in our theories because we are using the wrong starting model for the geometry of space at the Planck scale.

Previous thinking on the structure of space-time had assumed that it behaved in some sense like a smooth rubber sheet. Under the action of matter and energy, space-time could be deformed into a variety of shapes, each a possible geometric state for the universe. Nearly all candidates for the Theory of Everything embed their fields and symmetries in such a smooth geometrical arena.

But what if space-time were far more complicated? One possibility is that ‘wormholes’ exist, filling space-time with a network of tunnels. The fabric of space-time may have more in common with a piece of Swiss cheese than with a smooth rubber sheet.

According to Coleman, the addition of wormholes to space-time means that, like the ripples from many stones tossed into a pond, one geometric state for the universe could interfere with another. The most likely states (or the biggest ripples) would win out. The mathematics suggests that quantum wormhole interference at the Planck scale makes universes with cosmological constants other than zero exceedingly unlikely.

How big would wormholes have to be to have such dramatic repercussions? Surprisingly, the calculations suggest that small is beautiful. Wormholes the size of dogs and planets would be very rare. Universes containing even a few of them would exist with a vanishingly low probability. But wormholes smaller than 10 to the power of -33 centimeters could be everywhere. A volume the size of a sugar cube might be teeming with uncounted trillions of them flashing in and out of existence!

Coleman proposes that it is the action of these previously ignored mini-wormholes upon the geometric fabric of the universe that forces Lambda to be almost exactly zero. Like quantum ‘Pac Men’, they gobble up all the latent energy of space-time that would otherwise have appeared to us in the form of a measurable cosmological constant!

The addition of wormholes to the description of space-time admits the possibility that our universe did not spring into being aloof and independent, but was influenced by how other space-times had already evolved – ghostly mathematical universes with which we can never communicate directly.

The most likely of these universes had Lambda near zero, and it is these states that beat out all other contenders. In a bizarre form of quantum democracy, our universe may have been forced to follow the majority, evolving into the high probability state we now observe, without a detectable cosmological constant.

EPILOG

Wormholes? Wave functions? Hidden fields? The answer to the cosmological constant’s smallness, or absence, seems to recede into the farthest reaches of abstract thinking, faster than most of us can catch up.

As ingenious as these new ideas may seem, the final pages in this unusual story have probably not been written, especially since we can’t put any of these ideas to a direct test. It is a tribute to Einstein’s genius that even his ‘biggest blunder’ made near the beginning of this century still plagues physicists and astronomers as we prepare to enter the 21st century. Who would ever have thought that something that may not even exist would lead to such enormous problems!

The Planck Era

Written by Sten Odenwald. Copyright (C) 1984 Kalmbach Publishing. Reprinted by permission.

The Big Bang theory says that the entire universe was created in a tremendous explosion about 20 billion years ago. The immensity of this event is hard to grasp and it seems natural to ask ourselves ‘What was it like then?’ and ‘What happened before the Big Bang?’. To try to answer these queries, let’s take a brief journey backwards in time.
We first see the formation of our own sun about 15 billion years after the Big Bang and then, by 5 billion years, the formation of the first galaxies. By 700,000 years, the universe is awash with the fireball radiation that keeps all matter at a temperature of 4,000 degrees. Because of this, darkness is completely absent since every point in the sky glows with the brilliance of the sun. No stars, planets or even dust grains exist, just a hot dense plasma of electrons, protons and helium nuclei. By 3 minutes, we see helium form from the fusion of hydrogen nuclei while the universe seethes at a temperature of nearly 1 billion degrees. The average density of matter is that of lead. By 1 second, the Lepton Era ends and the ratio of neutrons to protons has become fixed at 1 neutron for every 5 protons. The temperature is now 5 billion degrees everywhere. At about .0001 second, we watch as the Quark Era ends and the temperature of the fireball radiation rises to an incredible 1 trillion degrees. Quarks, for the first time, can combine in groups of two and three to become neutrons, protons and other types of heavy particles. The universe is now packed with matter as densely as the nucleus of an atom. A mountain like Mt. Everest could be squeezed into a volume no greater than the size of a golf ball!

By 1 billionth of a second, the temperature is 1 thousand trillion degrees and we see the electromagnetic and weak forces merge into one force. The density of the universe has increased to the point where the entire earth could be contained in a thimble. Quarks and anti-quarks are no longer confined inside of particles like neutrons and protons but are now part of a superheated plasma of unbound particles. As the remaining history of the universe unfolds, a long period seems to pass when nothing really new happens. Then, at a time 10^-35 second after the Big Bang, a spectacular change in the size of the universe occurs. This is the GUT Era, when the strong nuclear force becomes distinguishable from the weak and electromagnetic forces. The temperature is an incredible 10 thousand trillion trillion degrees and the density of matter has soared to nearly 10^75 gm/cm^3. This number is so enormous that even our analogies are almost beyond comprehension. At these densities, the entire Milky Way galaxy could easily be stuffed into a volume no larger than a single hydrogen atom! Electrons and quarks, together with their anti-particles, were the major constituents of matter, and very massive particles called Leptoquark Bosons caused the quarks to decay into electrons and vice versa. If we now move forward in time, we witness the vacuum of space undergoing a ‘phase transition’ from a higher energy state to a lower energy state. This is analogous to a ball rolling down the side of a mountain and coming to rest in the lowest valley. As the universe ‘rolls down hill’ it begins a brief but stupendous period of expansion. The universe swells to billions of times its former size in almost no time at all.

In addition to this, a slight excess of matter over anti-matter appears because of the decay of massive particles called X Higgs Bosons. As we continue to watch the universe age, the remaining pairs of particles and anti-particles find each other and vanish in a tremendous burst of annihilation. From this paroxysm, the bulk of the fireball radiation that we now observe is born.

The GUT Era is the last stop in our fanciful journey through time. If we had asked what it was like before the GUT Era, we would immediately have entered a vast no man’s land where few indisputable facts would serve to guide us. What does seem clear is that gravity is destined to grow in importance, eventually becoming the dominant force acting between particles, even at the microscopic level.

GRAVITY

According to theories developed since the 1930s, what we call a ‘force’ is actually a collective phenomenon caused by the exchange of innumerable, force-carrying particles called gauge bosons. The electromagnetic force, which causes like charges to repel and dissimilar ones to attract, is transmitted by gauge bosons called photons; the strong force that binds nuclei together is transmitted by gluons; and the weak force, which causes particles to decay, is transmitted by the recently discovered W and Z intermediate vector bosons. In an analogous way, physicists believe that gravity is transmitted by particles called gravitons. If gravity really does have such a quantum property, its effects should appear once quarks and electrons can be forced to within 10^-33 centimeter of one another, a distance called the Planck length. To achieve these conditions, quarks and electrons will have to be collided at energies of 10^19 GeV. An accelerator patterned after the 2-mile Stanford Linear Accelerator would have to be at least a light-year in length to push particles to these incredible energies! Fortunately, what humans find impossible to do, Nature with its infinite resources finds less difficult. Before the universe was 10^-43 second old, matter routinely experienced collisions at these energies. This period is what we call the Planck Era.
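
The link between that distance and that energy follows from the usual quantum-mechanical estimate that a particle of energy E can probe structure down to a distance d ≈ ħc/E (a standard relation, not spelled out in the article); turned around, probing the Planck length requires

\[
E \approx \frac{\hbar c}{d}
\approx \frac{2\times 10^{-5}\ \mathrm{eV\cdot cm}}{10^{-33}\ \mathrm{cm}}
\approx 2\times 10^{28}\ \mathrm{eV} \approx 10^{19}\ \mathrm{GeV}.
\]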

THROUGH A LOOKING GLASS, DARKLY

Since our technology will not allow us to physically reproduce the conditions during these ancient times, we must use our mathematical theories of how matter behaves to mentally explore what the universe was like then. We know that the appearance of the universe before 10^-43 second can only be adequately described by modifying the Big Bang theory, because this theory is, in turn, based on the General Theory of Relativity. General Relativity tells us how gravity operates on the macroscopic scale of planets, stars and galaxies. At the Planck scale, we need to extend General Relativity so that it includes not only the macroscopic properties of gravity but also its microscopic characteristics as well. The theory of ‘Quantum Gravity’ is still far from completion, but physicists tend to agree that, at the very least, Quantum Gravity must combine the conceptual elements of the two great theories of modern physics: General Relativity and Quantum Mechanics.

In the language of General Relativity, gravity is a consequence of the deformation of space caused by the presence of matter and energy. Gravity is just another name for the amount of curvature in the geometry of 3-dimensional space. In Quantum Gravity theory, gravity is produced by massless gravitons, so that gravitons now represent individual packages of curved space that travel through space at the speed of light.

The appearance and disappearance of innumerable gravitons gives the geometry of space a very lumpy and dynamic appearance. John Wheeler at Princeton University thinks of this as a foamy substructure to space, where the geometry of space twists and contorts so that far-flung regions of space may suddenly find themselves connected by ‘wormholes’ which constantly appear and disappear within 10^-43 seconds. Even as you are reading this article, this frenetic activity is occurring in the hyper-microscopic domain, 100 billion billion times smaller than the nucleus of an atom. For a comparison, the size of the sun and the size of a single atom stand in about this same proportion. Although Quantum Gravity effects are completely undetectable today at the atomic and nuclear scale, during the Planck Era, macroscopic and microscopic worlds merged and the Quantum Gravity of the microcosm suddenly became the Quantum Cosmology of the macrocosm!

QUANTUM COSMOLOGY

As we approach the end of the Planck Era, the random appearance and disappearance of innumerable gravitons will eventually force us to give up the concept of a specific geometry to 3-dimensional space. Instead, the geometry at a given moment will have to be thought of as an average over all 3-dimensional space geometries that are possible. Once again, the reason for this is that particles are squeezed so closely together that we can now see individual gravitons moving around in the space between them, causing space to become curved. We can no longer get away with saying that the space between two quarks, for example, is flat. That, after all, is what we really mean today when we say that the gravitational force between them is insignificant compared to the other three forces of Nature.

To make matters much worse, not only will Quantum Gravity not allow us to calculate the exact 3-dimensional geometry of space but, at the Planck scale, it will not allow us to simultaneously determine its exact geometry and its precise rate of change in time. What this means is that we may never be able to calculate with any certainty exactly what the history of the universe was like before 10^-43 second. Today, the large-scale geometry of space is one of three possible types: flat and infinite, negatively curved and infinite, or positively curved and finite. During the Planck Era, the ‘large-scale’ geometry was contorted by wormholes and an infinite number of geometries were possible. To probe the history of the universe then would be like trying to trace your ancestral roots if every human being on earth had a possibility of being one of your parents. Now try to trace your family tree back a few generations! The farther back in time you go, the greater the number of possible ancestors you could have had. An entirely new conception of what we mean by ‘a history for the universe’ will have to be developed. Even the concepts of space and time will have to be completely re-evaluated in the face of the quantum fluctuations of spacetime during the Planck Era!

THE BIRTH OF THE UNIVERSE

The picture that seems to emerge from using our sketchy outline of what Quantum Gravity theory might look like is that, as we approach the Planck Era, gravitons are exchanged between quarks and electrons with increasingly higher energy and in greater number. By the time we reach the end of the Planck Era at 10^-43 second, gravitons will begin to carry as much energy as the other force carriers (gluons, IVBs and photons). At still earlier times, a period of complete symmetry and unification between all the natural forces will ensue. Only one super-unified force exists here (gravity) and only one kind of particle dominates the activity of this age (gravitons).

During the early 1970s, the Russian physicists Ya. Zel’dovich and A. Starobinsky of the USSR Academy of Sciences proposed that the rapidly changing geometry of space during the Planck Era may actually have created all the matter, anti-matter and radiation that existed soon after Creation. In their picture of Creation, the rapidly changing geometry of space created particles and anti-particles with masses of 10^19 GeV. This production of matter and anti-matter removed energy from the enormous fluctuations occurring in the geometry of space and eventually succeeded in damping them out altogether by the end of the Planck Era. They also found that the rate of particle creation increased as more and more particles were created.

Several recent studies by physicists Edward Tryon of Hunter College, R. Brout, F. Englert and E. Gunzig of the University of Brussels, and David Atkatz and Heinz Pagels of the Rockefeller University have shed additional light on what Creation may have been like. Imagine, if you can, nothing at all! This is the primordial vacuum of space. There is complete darkness here; no light yet exists. The number of dimensions to space was probably not the normal 3 that we are so accustomed to but may have been as high as 11, according to Supergravity theory! In this infinite emptiness, random fluctuations occurred that ever so slightly changed the energy of the vacuum at various points in space. Eventually, one of these fluctuations attained a critical energy and began to grow. As it grew, very massive particles called leptoquarks and anti-leptoquarks were created, causing the expansion to accelerate. This is much like a ball rolling down a hill that moves slowly at first and then gains momentum. The expansion of the proto-universe, in turn, caused still more leptoquarks to be created. This furious cycle continued until, at long last, the leptoquarks decayed into quarks, leptons (electrons, muons, etc.) and their anti-particles, and the universe emerged from the Planck Era. Particle creation stopped once the fluctuations in the geometry of space subsided.

So, we are left with the remarkable possibility that, in the beginning, there existed quite literally nothing at all, and from it emerged nearly all of the matter and radiation that we now see. This process has been described by the physicist Frank Wilczek at the University of California, Santa Barbara by saying, “The reason that there is something instead of nothing is that nothing is unstable”. A ball sitting on the summit of a steep hill needs but the slightest tap to set it in motion. A random fluctuation in space was apparently all that was required to unleash the incredible latent energy of the vacuum, thus creating matter and energy and an expanding universe from ‘nothing at all’.

The universe did not spring into being instantaneously but was created a little bit at a time in a ‘bootstrap’ process. Once a few particles were created by quantum fluctuations of the empty vacuum, it became easier for a few more to appear and so, in a rapidly escalating process, the universe gushed forth from nothingness.

How long did this take? The primordial vacuum could have existed for an eternity before the particular fluctuation that gave rise to our universe happened. Physicist Edward Tryon expresses this best by saying that “our universe is simply one of those things that happens from time to time”.

The principles of Quantum Gravity may ultimately force us to reconsider questions like ‘What happened before the Big Bang?’ because they imply the existence of something (time) that may not have any meaning at all. These questions may be as empty of meaning as an explorer at the North Pole asking, ‘Which way is North?’. Only the complete theory of Quantum Gravity may tell us how to ask the right questions!