
Thinking about Nothing

Looking back at the millennia of model building and deduction that has occurred, not a century has gone by when the prevailing opinion hasn’t been that a perfectly empty vacuum is impossible.

Aristotle’s Aether blends seamlessly into the 19th century Ether. In this century, overlapping quantum waves and virtual particles have finally taken root as the New Ether, though it is now infinitely more ephemeral than anything Aristotle or Maxwell could have imagined. We have also seen how the Atomist School of ancient Greece reached its final vindication in the hands of 19th century scientists such as Boltzmann. By the 20th century, the Atomist paradigm had even been extended to include not just the graininess of matter, but the possible quantum graininess of the vacuum and space itself. In the virtual particles that animate matter, we finally glimpse the world which Heinrich Hertz warned us about nearly a century ago when he said that we would eventually have to reach some accommodation with “invisible confederates” existing alongside what we can see, to make our whole model of reality more logically self-consistent.

Even by the start of the 21st Century, we have reached this accommodation only by shrugging our shoulders and honestly admitting that there are things going on in the world that seem to defy human intuition. What impresses me most about the evolution of our vision of the vacuum is that the imagery we find so potent today is actually in some sense thousands of years old.

It is difficult to imagine that humans would be drawn to the same understanding of physics and astronomy that we now enjoy if our brains had been wired only slightly differently. Without sight and mobility we could not form the slightest notion of 3-D space and geometry. This is what Kant spoke about, what Henri Poincaré described at great length without the benefit of 20th century neuroscience, and what Jacob Bronowski described in his book The Origins of Knowledge and Imagination with the benefit of such knowledge. But the object of science is more than just making sense of our senses. It must also guide us towards a deeper understanding of the physical world. This understanding must be self-consistent, and independent of whether we are sensorially or neurologically handicapped. Mathematics, as the premier language of physical model building, seems uniquely suited to providing us with an understanding of the physical world. Mathematics lets us see the world in a way that all of the other human languages do not.

If our mathematical understanding of nature is a product of mental activity, and this activity can be physically affected by the hard-wiring of our brain, how do we arrive at a coherent model of the physical world? Can we see in this process any explanation for why certain ideas in physics appear to be so historically tenacious?

It is commonly believed that in order for mathematics and the underlying logic to exist, at the very least a conscious language must be pre-existent to support it. This is the point of view expressed by Benjamin Whorf. But the thoughtful reflections of individuals such as Einstein, Feynman and Penrose point in a very different direction. Einstein once wrote a note to Jacques Hadamard, prompted by Hadamard’s investigation of creative thinking,

“…The words of language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements of thought are certain signs ( symbols ) and more or less clear images which can be voluntarily reproduced and combined…The above mentioned elements are, in my case, of visual and some muscular type…”

Roger Penrose echoes some of this same description in his book, The Emperor’s New Mind,

“…Almost all my mathematical thinking is done visually and in terms of non-verbal concepts, although the thoughts are quite often accompanied by inane and almost useless verbal commentary such as ‘that thing goes with that thing and that thing goes with that thing’..”

Freeman Dyson, one of the architects of modern QED, had this to say about how Feynman did his calculations,

“…Dick was using his own private quantum mechanics that nobody else could understand. They were getting the same answers whenever they calculated the same problem…The reason Dick’s physics was so hard for ordinary people to grasp was that he did not use equations…Dick just wrote down the solutions out of his head without ever writing down the equations. He had a physical picture of the way things happen, and the pictures gave him the solutions directly with a minimum of calculation…It was no wonder that people who had spent their lives solving equations were baffled by him. Their minds were analytical; his was pictorial…”

In many instances, the conversion of abstract thinking into conventional language is seen as a laborious, almost painful process. Often words are inadequate to encompass the subtleties of the non-verbal, abstract ideas and their interrelationships. According to Penrose,

“I had noticed, on occasion, that if I have been concentrating hard for a while on mathematics and someone would engage me suddenly in conversation, then I would find myself almost unable to speak for several seconds”

In fact, abstract thinking is often argued to be a right-hemisphere function. Visual or pattern-related thinking and artistic talents are frequently coupled to this hemisphere, while the language centers sit in the left hemisphere. With such a disconnect between language and abstract thinking, it is little wonder that theoreticians and artists find themselves tongue-tied in explaining their ideas, or are inclined to report that their work is non-verbal.

So the creation of sophisticated physical theories may involve a primarily non-verbal, visual-symbolic thinking process, often manipulating patterns and only later, with some effort of will, translating this into spoken language or fleshing out the required mathematical details. Could this be why scientists, and artists for that matter, have such difficulty in explaining what they are thinking to the rest of the population? Could this be why ancient philosophers managed to land upon archetypes for their Creation legends that seem familiar to us in the 20th century? The symbols that are used appear disembodied, and no amount of word play can capture all of the nuances and motivations that went into a particular interpretive archetype, or make it seem compelling to the non-mathematician or non-artist. Feynman once wrote about the frustrating process of explaining to the public what goes on in nature,

“…Different people get different reputations for their skill at explaining to the layman in layman’s language these difficult and abstruse subjects. The layman then searches for book after book in the hope that he will avoid the complexities which ultimately set in, even with the best expositor of this type. He finds as he reads a generally increasing confusion, one complicated statement after another,… all apparently disconnected from one another. It becomes obscure, and he hopes that maybe in some other book there is some explanation…but I do not think it is possible, because mathematics is NOT just another language. Mathematics is a language plus reasoning…if you do not appreciate the mathematics, you cannot see, among the great variety of facts, that logic permits you to go from one to the other…”

If this is the mental frame used by some physicists to comprehend physics, it is little wonder that a great chasm exists between the lay person and the physicist in explaining what is going on. The task that even a physicist such as Freeman Dyson had in translating Feynman’s diagrammatic techniques into mathematical symbology seems even more challenging knowing that Feynman may have had a whole other perspective on visualization via his apparent color-symbol synesthesia. The equations of the Standard Model of physics, which describe all known particles and fields except gravity, are our current best mathematical expression of that understanding.

Another feature of thinking that separates scientists and artists from everyone else seems to be the plasticity of the thinking process itself. Scientists flit from one idea to another until they arrive at a model that best explains the available data, although scientists can also get rooted to particular perspectives that are difficult to forget after decades of inculcation. The general adult population prefers a more stable collection of ideas and ‘laws’ which it can refer to over a lifetime.

Where does this all leave us?

The vacuum has been promoted to perhaps the most important clue to our own existence. The difficulty is that we lack a proper Rosetta Stone to translate the various symbolisms we use to describe it. The clues that we do have are scattered among a variety of enigmatic subjects which strain at our best intellectual resources to understand how they are linked together. Could it be that we are lacking an even more potent symbolic metaphor, and an internal non-verbal language, to give it life? Where would such a thing come from?

[Image: a spider web covered with dew drops]

If we take our clue from how ideas in physics have emerged in the past, the elements of the new way of thinking may be hidden in some unexpected corner of nature. We may find an analogy or a metaphor in our mundane world which, when mixed with mathematical insight, may take us even closer to understanding gravity, spacetime and the vacuum. It is no accident that string theory owes much of its success to the fact that it asks us to think about quantum fields as ordinary strings operating in an exotic mathematical setting. It is exciting to think that the essential form of the Theory of Everything could be this close to us, perhaps even lurking in a pattern we see, and overlook, in our everyday lives.

Much of this symbolic process may be performed subconsciously, and only in the form of dreams, insights or hunches do these ideas seem to surface into consciousness when the circumstances are appropriate. It is, evidently, the non-verbal and unconscious right hemisphere which experiences these ideas. Is there a limit to this process of symbolic thinking? At least a dozen times this century, physicists have had to throw up their hands over what to make of certain features of the world: the collapse of the wave function; quantum indeterminacy; particle/wave dualism; cosmogenesis. Some of these may eventually find their explanation at the next level of model building. Others, such as the meaning of quantum indeterminacy and particle/wave dualism, seem to be here to stay.

In working with these contradictions, the human mind prefers the avenue of denial (you can almost hear your inner voice saying “Aw come on, quantum mechanics just can’t be that weird!”) or a state of anxiety as the two hemispheres try to fabricate conflicting world models. Little wonder that we have particle/wave duality, the seeming schism between matter and energy, and a whole host of other ‘polar’ ideas in physics, as two separate minds try to resolve the universe into one model or another, with the left one preferring time-ordered patterns, and the right one, spatial patterns.

It is hard to believe that our brains can control what we experience of the objective world, but we need only realize that the brain actually blinds us, in a variety of subtle ways, to a wider sensory world. The object of science, however, is to discern the shapes of objective laws in a way that gets at the universal elements of nature that are not coupled to a particular kind of brain circuitry. It doesn’t matter if all scientists have anosognosia and see the world differently in some consistent way; what counts is that they must still live by the laws of motion dictated by gravity and quantum mechanics.

Niels Bohr believed atoms are not real in the same sense as trees. The quantum world really does represent a different kind of reality than our apparently naive understanding of macroscopic reality implies. This being the case, we must first ask to what extent fields and the denizens of the quantum vacuum can be represented by any analogy drawn from the macroworld? We already know that the single most important distinguishing characteristic of atomic particles is their spin; far more so than mass or charge. Yet unlike mass and charge, quantum mechanical spin has ABSOLUTELY no analog in the macroscopic world. Moreover, fundamental particles cannot be thought of as tiny spheres of charged matter located at specific points in space. They have no surface, and participate in an infernal wave-like dance of probability, at least when they are not being observed. Yet despite this warning, we feel comfortable that we understand something about what reality is at this scale, in the face of these irreconcilable differences between one set of mental images and what experiments tell us over and over again. What is the true nature of the vacuum? How did the universe begin? I suspect we will not know the answer to these questions in your lifetime or mine, perhaps for the same reason that it took 3000 years for geometers to ‘discover’ non-Euclidean geometry.

At the present time we are faced with what may amount to only a single proof of the parallel-line postulate, unable to see our way through to another way of looking at the proof. There is also the very real worry that some areas of nature may require modalities of symbolic thinking beyond the archetypes that our brains are capable of providing as a consequence of their neural hard-wiring. Today, we have quantum field theory and its tantalizing paradoxes, much as the ancient geometers had their parallel-line postulate. We, like they, scratch the same figures in the sand over and over again, hoping to see the glimmerings of a new world view appearing in the shifting sands. At a precision of one part in a trillion, our quantum theories work too well, and seem to provide few clues to the new direction we must turn to see beyond them.

The primary arbiters we have at our disposal for deciding between various interpretive schemes, experimental data, are not themselves in unending supply, as the abrupt cancellation of the U.S. Superconducting Super Collider program in 1993 showed. Its role has since passed to the CERN Large Hadron Collider, but even the LHC may not be large enough to access the new physics we need to explore to further our theories and understanding.

Whatever answers we need seem to be hidden, not in the low-energy world accessible to our technology, but at vastly higher energies well beyond any technology we are likely to afford in the next few centuries. It is easy to provide a jet plane with an energy of 100 billion billion billion electron volts (its energy of motion at a speed of a few hundred miles per hour), but it is beyond understanding how to supply a single proton or electron with the same energy. On the other hand, our internal symbolic thinking seems to lead us to similar interpretative schemes, and unconscious dualities which may only be a reflection of our own neural architecture, which we all share, and which has remained essentially unchanged for millennia. We visualize the vacuum in the same way as the Ancients did because we are still starting from the same limited collection of internal imagery. At least for some general problems, we seem to have hit a glass ceiling in which our current style of theory building leads us to a bipolar and contradictory world populated by various dualities: matter/energy, space/time, wave/particle. When we finally do break through to a new kind of reality in our experiments, will we be able to recognize the event? Will our brains filter out this new world and show us only the ghostly shadows of contradictory archetypes cast upon the cave wall?
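
As a rough sanity check on that comparison, here is a minimal back-of-the-envelope sketch in Python; the airliner mass and speed are assumed round numbers for illustration, not figures from the essay.

```python
# Kinetic energy of a cruising jet plane, expressed in electron volts,
# compared with the energy given to a single proton in a big accelerator.
jet_mass_kg = 2.0e5          # assumed ~200-tonne airliner
jet_speed_ms = 250.0         # a few hundred miles per hour, in m/s
eV_per_joule = 1.0 / 1.602e-19

kinetic_energy_J = 0.5 * jet_mass_kg * jet_speed_ms**2
kinetic_energy_eV = kinetic_energy_J * eV_per_joule
print(f"Jet kinetic energy ~ {kinetic_energy_eV:.1e} eV")   # ~4e28 eV, tens of billions of billions of billions

# For comparison, the LHC gives a single proton about 7 TeV = 7e12 eV.
lhc_proton_eV = 7.0e12
print(f"Ratio jet/proton ~ {kinetic_energy_eV / lhc_proton_eV:.1e}")
```

The result lands in the same ballpark as the figure quoted above, some sixteen orders of magnitude beyond what any accelerator delivers to a single particle.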

We have seen that many schemes have been offered for describing the essential difference between matter and empty space; many have failed. Theoreticians since Einstein have speculated about the geometric features of spacetime, and the structure of electrons and matter for decades. The growing opinion now seems to be that, ultimately, only the properties of space such as its geometry or dimensionality can play a fundamental role in defining what matter really is. In a word, matter may be just another form of space. If the essence of matter is to be found in the geometric properties of ’empty’ space, our current understanding of space will not be sufficient to describe all of matter’s possible aspects.

Misconceiving the Big Bang

The Big Bang was NOT a Fireworks Display!

Written by Sten Odenwald
Copyright (C) 1997. Published in the Washington Post Horizon education supplement on May 14, 1997.

The Big Bang wasn’t really big. Nor was it really a bang. In fact, the event that created the universe and everything in it was a very different kind of phenomenon than most people–or, at least, most nonphysicists–imagine.

Even the name “Big Bang” originally was a put-down cooked up by a scientist who didn’t like the concept when it was first put forth. He favored the idea that the universe had always existed in a much more dignified and fundamentally unchanging, steady state.

But the name stuck, and with it has come the completely wrong impression that the event was like an explosion and that the universe is expanding today because the objects in it are being flung apart like fragments of a detonated bomb.

Virtually every basic aspect of this intuitive image for the Big Bang (we ARE stuck with the name) is incorrect. To understand why, you need to understand Albert Einstein’s general theory of relativity. Or, at least, you need to have a sense of it. That may sound daunting, but general relativity is the most revolutionary scientific advance of the 20th century, and we all ought to acquire some feeling for it before the century ends.

After all, it’s been 82 years since Einstein put forth his theory. It’s been tested in scores of experiments and has always passed with flying colors and is now firmly established as our premier guide to understanding how gravity operates. Moreover, it is part of the foundation of Big Bang cosmology. And it is because of general relativity that we know the Big Bang was (and is, for the event is still going on) nothing like an explosion.

Albert Einstein developed general relativity in order to make his famous theory of special relativity include the effects of gravity. It is a better way than Sir Isaac Newton’s of understanding how gravity works. Like a hungry amoeba, general relativity (or just GR for short) absorbed both Einstein’s newly minted special relativity and Newton’s physics, giving us the means to replicate ALL of the predictions of these two great theories while extending them into unfamiliar realms of experience. One of these realms was the Black Hole. The other was the shape and evolution of the universe itself.

Big Bang cosmology, which says that the universe came into existence between 10 and 20 billion years ago and has been expanding and cooling from a hot, dense state ever since, remains unassailable. Yet Big Bang cosmology is also vulnerable: it rests on GR being accurate over an enormous range of scales in time and space. Just how good is general relativity? So far, GR has made the following specific predictions:

1…The entire orbit of Mercury rotates because of the curved geometry of space near the sun. The amount of ‘perihelion shift’ each century was well known at the time Einstein provided a complete explanation for it in 1915. (A short calculation reproducing this shift, about 43 arcseconds per century, appears just after this list.)

2…Light at every frequency can be bent in exactly the same way by gravity. This was confirmed in the 1919 Solar Eclipse for optical light using stars near the Sun’s limb, and in 1969-1975 using radio emissions from star-like quasars also seen near the limb of the Sun. The deflection of the light was exactly as predicted by GR.

3…Clocks run slower in strong gravitational fields. This was confirmed by Robert Pound and George Rebka at Harvard University in 1959, and by Robert Vessot in the 1960’s and 70’s using high-precision hydrogen maser clocks flown on jet planes and on satellites.

4…Gravitational mass and inertial mass are identical. Most recently in 1971, Vladimir Braginsky at Moscow University confirmed GR’s prediction of this to within 1 part in a trillion of the exact equality required by GR.

5…Black holes exist. Although these objects have been suspected to exist since they were first introduced to astronomers in the early 1970’s, it was only in 1992 that a critical acceptance threshold was crossed in the astronomical community. It was then that Hubble Space Telescope observations revealed monstrous, billion-sun black holes in the cores of nearby galaxies such as Messier 87, Messier 33 and NGC 4261.

6…Gravity has its own form of radiation which can carry energy. Russell Hulse and Joseph Taylor in 1975 discovered a pulsar in orbit around another neutron star, and through careful monitoring of its precise pulses during the next 20 years, confirmed that the system is losing energy at a rate within 1 percent of the prediction by GR based on the emission of gravitational radiation.

7…A new force exists called ‘gravito-magnetism’. Just as electric and magnetic fields are linked together, according to GR, a spinning body produces a magnetism-like force called gravitomagnetism. GR predicts that rotating bodies not only bend space and time, but also make empty space spin. A NASA satellite called Gravity Probe B will be launched in the next few years to see whether this effect exists. This is a killer. If it is not found, GR is mortally wounded despite its long string of other successes.

8…Space can stretch during the expansion of the universe. This was confirmed by Edwin Hubble’s detection of the recession of the galaxies circa 1929. More recently, in 1993, astronomer Kenneth Kellermann confirmed that the angular sizes of distant radio sources shrink to a minimum and then increase at greater distances, exactly as expected for a dilating space. This is not predicted by any cosmological model that does not include the dilation of space as a real, physical phenomenon.
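
For readers who want to see one of these numbers fall out of the theory, here is a minimal sketch in Python of the calculation behind item 1, using the standard GR formula for the perihelion advance per orbit, Δφ = 6πGM/[a(1−e²)c²], and textbook values for Mercury's orbit; it recovers the famous result of roughly 43 arcseconds per century.

```python
import math

# GR perihelion advance of Mercury per orbit: dphi = 6*pi*G*M_sun / (a*(1-e^2)*c^2)
G = 6.674e-11          # m^3 kg^-1 s^-2
M_sun = 1.989e30       # kg
c = 2.998e8            # m/s
a = 5.791e10           # Mercury's semi-major axis, m
e = 0.2056             # Mercury's orbital eccentricity
P_days = 87.97         # Mercury's orbital period, days

dphi_per_orbit = 6 * math.pi * G * M_sun / (a * (1 - e**2) * c**2)   # radians per orbit
orbits_per_century = 36525.0 / P_days
arcsec_per_century = math.degrees(dphi_per_orbit) * 3600 * orbits_per_century
print(f"GR perihelion shift: {arcsec_per_century:.1f} arcsec per century")   # about 43
```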

We have now boxed ourselves into a corner. If we accept the successes of GR, we are forced to see the world and the cosmos through its eyes, and its eyes alone, since it is the theory which satisfies all known tests to date.

So, how should we think about the Big Bang? Our mental ‘fireworks’ image of the Big Bang contains these basic elements: 1) A pre-existing sky or space into which the fragments from the explosion are injected; 2) A pre-existing time we can use to mark when the explosion happened; 3) Individual projectiles moving through space from a common center; 4) A definite moment when the explosion occurred; and 5) Something that started the Big Bang.

All of these elements to our visualization of the Big Bang are completely false according to GR!

Preexisting Space?

There wasn’t any!

The mathematics of GR states specifically and unambiguously that 3-dimensional space was created at the Big Bang itself, at ‘Time Zero’, along with everything else. It was a ‘singular’ event in which the separations between all particles everywhere vanished. This is just another way of saying that our familiar 3-dimensional space vanished. Theorists studying various prototypes for the Theory of Everything have only modified this statement somewhat. During its earliest moments, the universe may have existed in a nearly incomprehensible state which may have had more than 4 dimensions, or perhaps none at all. Many of these theories of the earliest moments hypothesize a ‘mother space-time’ that begat our own universe, but you cannot at the same time place your mind’s eye both inside this Mother Spacetime to watch the Big Bang happen, and inside our universe to see the matter flying around. This is exactly what the fireworks display model demands that you do.

Preexisting Time?

There wasn’t any of this either!

Again, GR’s mathematics treats both space and time together as one indivisible object called ‘space-time’. At Time Zero plus a moment, you had a well defined quantity called time. At Time Zero minus a moment, this same quantity changed its character in the mathematics and became ‘imaginary’. This is a mathematical warning flag that something dreadfully unexpected has happened to time as we know it. In a famous quote by Einstein, “…time and space are modes by which we think and not conditions in which we live”. Stephen Hawking has looked at the mathematics of this state using the fledgling physics of Quantum Gravity Theory, and confirms that at the Big Bang, time was murdered in the most thorough way imaginable. It may have been converted into just another ‘timeless’ dimension of space…or so the mathematics seems to suggest.
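
One standard way to see what ‘imaginary’ time does to the mathematics, sketched here as an illustration rather than anything quoted from the article: substituting an imaginary time coordinate removes the minus sign that distinguishes time from space in the spacetime interval.

```latex
% Flat spacetime interval, with ordinary time t:
\[ ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2 \]
% Substituting imaginary time, t = -i*tau, flips the sign of the time term,
% so tau enters the interval exactly like a fourth direction of space:
\[ t = -i\tau \quad\Longrightarrow\quad ds^2 = c^2\,d\tau^2 + dx^2 + dy^2 + dz^2 \]
```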

Individual objects moving out from a common center?

Nope!

GR says specifically that space is not a passive stage upon which matter plays out its dance, but is a member of the cast. When you treat both galaxies and space-time together, you get a very different answer for what happens than if you treat them separately, which is what we instinctively always do. Curved space distorts the paths of particles, sometimes in very dramatic ways. If you stepped into a spaceship and tried to travel to the edge of the universe to look beyond it, you would fail. Not only could you never reach a supposed “edge” of the universe no matter how long or how fast you traveled; in a closed universe, you would eventually find yourself arriving back where you departed. The curvature of space would bring you right back, in something like the way the curvature of Earth would bring you home if you flew west and never changed course. In other words, the universe has no edge in space. There is nothing beyond the farthest star.

As a mental anchor, many have used the expanding balloon as an analogy to the expanding universe. As seen from any one spot on the balloon’s surface, all other spots rush away from it as the balloon is inflated. There is no one center to the expansion ON THE SURFACE of the balloon that is singled out as the center of the Big Bang. This is very different than the fireworks display which does have a dramatic, common center to the expanding cloud of cinders. The balloon analogy, however, is not perfect, because as we watch the balloon, our vantage point is still within a preexisting larger arena which GR says never existed for the real universe.

The center of the Big Bang was not a point in space, but a point in time! It is a center, not in the fabric of the balloon, but outside it along the 4th dimension…time. We cannot see this point anywhere we look inside the space of our universe out towards the distant galaxies. You can’t see time after all! We can only see it as we look back in time at the ancient images we get from the most distant objects we can observe. We see a greatly changed, early history of the universe in these images but no unique center to them in space.

It is at this point that common sense must give up its seat on the bus, and yield to the insights provided by GR. And it is at precisely this point that so many non-physicists refuse to be so courteous. And who can blame them? But there’s more to come.

Projectiles moving through space?

Sorry!

GR again has something very troubling to say about this. For millions of years we have learned from experience on the savannas of the African continent and elsewhere that we can move through space. As we drive down the highway, we have absolutely no doubts about what is happening as we traverse the distance between landmarks along the roadside. This knowledge is so primal that we are incapable of mustering much doubt about it. But science is not about confirming our prejudices. It’s about revealing how things actually are.

What if I told you that you could decrease the distance between your house and the Washington Monument by ‘standing still’ and just letting space contract the distance away? GR predicts exactly this new phenomenon, and the universe seems to be the only arena we know today in which it naturally occurs. Like spots glued to the surface of the balloon at eternally fixed latitude and longitude points, the galaxies remain where they are while space dilates between them with the passage of time. There is no reason at all we should find this kind of motion intuitive.
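
A minimal toy model, my own sketch in Python, of this ‘galaxies stay put while space stretches’ picture: each galaxy keeps a fixed comoving coordinate, a scale factor a(t) grows, and every observer then sees every other galaxy receding at a speed proportional to its distance, with no preferred center. The linear form chosen for a(t) is purely illustrative.

```python
# Toy expanding space: galaxies keep fixed comoving coordinates x_i, while the
# physical distance d(t) = a(t) * |x_i - x_j| grows as the scale factor a(t) grows.
comoving_positions = [0.0, 1.0, 2.0, 5.0]   # arbitrary comoving coordinates (dimensionless)

def scale_factor(t):
    return 1.0 + 0.1 * t                    # assumed simple growth law, for illustration only

t, dt = 10.0, 0.01
a_now = scale_factor(t)
a_rate = (scale_factor(t + dt) - scale_factor(t)) / dt   # da/dt
H = a_rate / a_now                                       # Hubble parameter: v = H * d

for x in comoving_positions[1:]:
    d = a_now * (x - comoving_positions[0])   # physical distance from the galaxy at x = 0
    v = a_rate * (x - comoving_positions[0])  # its recession speed
    print(f"distance {d:.2f}  speed {v:.3f}  speed/distance {v/d:.4f} (= H = {H:.4f})")
```

Every galaxy in the list recedes with speed/distance equal to the same H, which is exactly the Hubble law, yet no galaxy ever moves from its own comoving coordinate.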

If space is stretching like this, where do the brand new millions of cubic light years come from, from one moment to the next? The answer in GR is that they have always been there. To see how this could happen, I like to think of the shape of our universe as a “Cosmic Watermelon”. The fact that this is only the shape for a ‘closed’ finite universe is only a technicality. Finite watermelons are also cheaper to buy than infinite ones.

GR predicts the entire past, present and future of the universe all at once, and predicts its entire 4-dimensional shape. As we slice the 4-dimensional Cosmic Watermelon at one end of the cosmic time line, we see 3-dimensional space and its contents soon after the Big Bang. At the other end of the Cosmic Watermelon, in the far future, we see the collapse of space and matter just before the Big Crunch. But in between, our slices show the shape of space (closed, spherical volumes) and the locations of galaxies (at fixed locations) as space dilates from one extreme to the other.

As we take a particular slice through an ordinary watermelon, we see that its meat has always been present in the complete watermelon. The meat is present as a continuous medium, and we never ask where the meat in a particular slice came from. Cosmologically, GR asks us to please think of 3-dimensional space in the same way. Space, like the meat of the watermelon, has always existed in the complete shape of the universe in 4 dimensions. But it is only in 4 dimensions that the full shape of the universe is revealed. It is a mystery why our consciousness insists on experiencing the universe one moment at a time, and that is why we end up with the paradox of where space comes from. There really is no paradox at all.

Space is not ‘nothing’ according to Einstein; it is merely another name for the gravitational field of the universe. Einstein once said, “Space-time does not claim existence on its own but only as a structural quality of the [gravitational] field”. If you could experimentally turn off gravity with a switch, space-time would vanish. This is the ultimate demolition experiment known to physics, for which an environmental impact statement would most certainly have to be filed.

The gravitational field at one instant is wedded to itself in the next instant by the incessant quantum churnings of the myriad of individual particles that, like bees in a swarm, make up the gravitational field itself. In this frothing tumult, the gravitational field is knit together, quantum by quantum, from perhaps even more elemental building blocks, and it is perhaps here that we will find the ultimate origin for the expansion of the universe and the magical stretching of space. We hope the much anticipated Theory of Everything will have more to say about this, but to actually test this theory may require technologies and human resources that we can only dimly dream of.

Was there a definite moment to the Big Bang?

GR is perfectly happy to forecast that our universe emerged from an infinite density, zero-space ‘Singularity’ at Time Zero, but physicists now feel very strongly that this instant was smeared out by any number of quantum mechanical effects, so that we can never speak of a time before about 10^-43 seconds after the Big Bang. Just as Gertrude Stein once remarked about my hometown, Oakland, California, that “There is no ‘There’ there”, at 10^-43 seconds, nature may tell us that before the Big Bang, “There was no ‘When’ there” either. The moment dissolves away into some weird quantum fog, and as Stephen Hawking speculates, time may actually become bent into a new dimension of space and no longer even definable in this state. Ordinary GR is unable to describe this condition and only some future theory combining GR and quantum mechanics will be able to tell us more. We hope.
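
The 10^-43 second figure is the Planck time, the combination of constants t_P = sqrt(ħG/c^5) below which quantum gravity is expected to take over. A quick check with standard physical constants:

```python
import math

# Planck time t_P = sqrt(hbar * G / c^5)
hbar = 1.055e-34   # J s
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s

t_planck = math.sqrt(hbar * G / c**5)
print(f"Planck time ~ {t_planck:.1e} s")   # ~5e-44 s, i.e. of order 10^-43 s
```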

Something started the Big Bang!

At last we come to the most difficult issue in modern cosmology. In the fireworks display, we can trace the events leading up to the explosion all the way back to the chemists that created the gunpowder and wrapped the explosives. GR, however, can tell us nothing about the equivalent stages leading up to the Big Bang, and in fact, among its strongest statements is the one that says that time itself may not have existed. How, then, do we speak or think about a condition, or process, that started the whole shebang if we are not even allowed to frame the event as “This happened first…then this…then kerpowie!”? This remains the essential mystery of the Big Bang which seems to doggedly transcend every mathematical description we can create to describe it.

All of the logical frameworks we know about are based on chains of events or states. All of our experiences of such chains in the physical world have been ordered in time. Even when the mathematics and the theory tell us that ‘What happened before the Big Bang to start it?’ is not a logical or legitimate question, we insist on viewing this as a proper question to ask of nature, and we expect a firm answer. But like so many other things we have learned this century about the physical world, our gut instincts about which questions ought to have definite answers are often flawed when we explore the extreme limits of our physical world.

I wrote this essay before seeing the new IMAX film at the Air and Space Museum, ‘Cosmic Journey’, by far one of the nicest and most heroic movies of its kind I have ever seen. But of course it showed the Big Bang as a fireworks display. No matter. It doesn’t take a rocket scientist to accept the fact that the Big Bang was a spectacular moment in history. What is amazing is that the daring audacity of humans may have demystified some of it, and revealed a universe far stranger than any could have imagined.

Still, we are haunted by our hunches and intuitions gathered over millennia, and under circumstances far removed from the greater physical world we are now exploring. No wonder it all seems so alien and maddeningly complex.

Before the Big Bang

Beyond the Big Bang

Written by Sten Odenwald
Copyright (C) 1987, Kalmbach Publishing. Reprinted by permission.

Sometime between 15 and 20 billion years ago the universe came into existence. Since the dawn of human awareness, we have grappled with the hows and whys of this event and out of this effort have sprung many ideas. An ancient Egyptian legend describes how the universe was created by Osiris Khepera out of a dark, boundless ocean called Nu and that Osiris Khepera created himself out of this ocean by uttering his own name. Human inventiveness has not stood still in the 5000 years since these ideas were popular. The modern theory of the Big Bang states that our universe evolved from an earlier phase billions of times hotter than the core of our sun and trillions of times denser than the nucleus of an atom. To describe in detail such extreme physical conditions, we must first have a firm understanding of the nature of matter and of the fundamental forces. At the high temperatures likely to have attended the Big Bang, all familiar forms of matter were reduced to their fundamental constituents. The forces of gravity and electromagnetism together with the strong and weak nuclear forces, were the essential means through which the fundamental particles of matter interacted.

The feedback between cosmology and particle physics is nowhere more clearly seen than in the study of the early history of the universe. In October 1985, the giant accelerator at Fermilab achieved, for the first time, the collision of protons and anti-protons at energies of 1.6 trillion electron volts, about 1600 times the rest mass of the proton. This was a unique event because for one split second, on a tiny planet in an undistinguished galaxy, a small window onto the Creation Event was opened for the first time in at least 15 billion years.

THE LIMITS OF CERTAINTY

The pursuit by physicists of a single, all-encompassing theory capable of describing the four natural forces has, as a by-product, resulted in some surprising glimpses of the Creation Event. Although such a theory remains perhaps several decades from completion, it is generally recognized that such a theory will describe physical conditions so extreme that it is quite possible we may never be able to explore them first-hand, even with the particle accelerators that are being designed today. For example, the Superconducting Supercollider to be built by the early 1990’s will cost 6 billion dollars and it will allow physicists to collide particles at energies of 40 trillion electron volts (40,000 GeV), matching conditions that prevailed less than a trillionth of a second after the Big Bang. The expected windfall from such an accelerator is enormous and will help to answer many nagging questions now plaguing the theoretical community, but can we afford to invest perhaps vastly larger sums of money to build machines capable of probing the quantum gravity world at 10^19 GeV? At these energies, the full unification of the natural forces is expected to become directly observable. How curious it is that definite answers to questions such as, ‘What was Creation like?’ and ‘Do electrons and quarks have internal structure?’ are so inextricably intertwined. Our ability to find answers to these two questions, among others, does not seem to be hampered by some metaphysical prohibition, but by the resources our civilization can afford to devote to finding the answers. Fortunately, the situation is not quite so bleak, for you see, the ‘machine’ has already been ‘built’ and every possible experiment we can ever imagine has already been performed!

WHAT WE THINK WE KNOW

We are living inside the biggest particle accelerator ever created – the universe. Ten billion years before the sun was born, Nature’s experiment in high-energy physics was conducted and the experimental data can now be examined by studying the properties and contents of the universe itself. The collection of fundamental facts that characterize our universe is peculiar in that it derives from a variety of sources. A partial list of these ‘meta-facts’ looks like this:

1) We are here; therefore, some regions of the universe are hospitable to the creation of complex molecules and living, rational organisms.

2) Our Universe has 4 big dimensions and all are increasing in size as the universe expands in time and space.

3) There are 4 dissimilar forces acting in Nature.

4) Only matter dominates; no anti-matter galaxies exist and this matter is built out of 6 quarks and 6 types of leptons.

The task confronting the physicist and the astronomer is to create, hopefully, a single theory consistent with these metafacts that can then be used to derive the secondary characteristics of our universe such as the 2.7 K background radiation, the primordial element abundances, and galaxy formation. The interplay between the study of the macrocosm and the microcosm has now become so intense that astronomers have helped physicists set limits to the number of lepton families — No more than 4 are allowed otherwise the predicted cosmological abundance of helium would seriously disagree with what is observed. Physicists, on the other hand, use the astronomical upper limits to the current value of the cosmological constant to constrain their unification theories.

An extension to the standard Big Bang model called the Inflationary Universe (see The Decay of the False Vacuum) was created by MIT physicist Alan Guth in 1981. This theory combined Grand Unification Theory with cosmology and, if correct, allows astronomers to trace the history of the universe all the way back to 10^-35 seconds after the Big Bang, when the strong, weak and electromagnetic forces were unified into a single ‘electro-nuclear’ force. During the 4 years since the Inflationary Universe model was proposed, other theoretical developments have emerged that may help us probe events occurring at an even earlier stage, perhaps even beyond the Creation Event itself. Ten years ago, theoreticians discovered a new class of theories called Supersymmetric Grand Unified Theories (SUSY GUTs). These theories, of which there are several competing types, have shown great promise in providing physicists with a unified framework for describing not just the electro-nuclear force but also gravity, in addition to the particles they act on (see The Planck Era: March 1984). Unfortunately, as SUSY GUTs were studied more carefully, it was soon discovered that even the most promising candidates for THE Unified Field Theory suffered from certain fundamental deficiencies. For instance:

1) There were not enough basic fields predicted to accommodate the known particles.

2) Left and right-hand symmetry was mandated so that the weak force, which breaks this symmetry, had to be put in ‘by hand’.

3) Anomalies exist, which include the violation of energy and charge conservation.

4) The Cosmological Constant comes out many orders of magnitude larger than present upper limits suggest.

In recent years, considerable effort has gone into extending and modifying the postulates of SUSY GUTs in order to avoid these problems. One avenue has been to question the legitimacy of a very basic premise of the field theories developed heretofore. The most active line of theoretical research in the last 25 years has involved the study of what are called ‘point symmetry groups’. For example, a hexagon rotated by 60 degrees about a point at its center is indistinguishable from one rotated by 120, 180, 240, 300 and 360 degrees. These 6 rotation operations form a mathematical group, so that adding or subtracting any two operations always results in a rotation operation that is already a member of the group (180 = 120 + 60, etc.). The Grand Unification Theories of the electro-nuclear interaction are based on point symmetry groups named SU(3), SU(2) and U(1), which represent analogous ‘rotations’ in a more complex mathematical space. In the context of ponderable matter, point symmetry groups are also the mathematical statement of what we believe to be the structure of the fundamental particles of matter, namely, that particles are point-like, having no physical size at all. But what if this isn’t so? The best that experimental physics has to offer is that the electron, which is one of a family of 6 known Leptons, behaves like a point particle at scales down to 10^-16 cm, but that’s still an enormous distance compared to the gravitational Planck scale of 10^-33 cm where complete unification with gravity is expected to occur.
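
The hexagon example is simply the cyclic group of order 6. A minimal sketch in Python, offered only as an illustration of the closure property described above:

```python
from itertools import product

# The six rotations that leave a hexagon unchanged, in degrees.
rotations = [60 * k for k in range(1, 7)]        # 60, 120, ..., 360

# Composing two rotations means adding their angles modulo 360 degrees.
def compose(r1, r2):
    result = (r1 + r2) % 360
    return result if result != 0 else 360        # treat 360 as the identity rotation

# Closure: every composition of two rotations is again one of the six rotations.
assert all(compose(a, b) in rotations for a, b in product(rotations, rotations))
print("e.g. 120 + 60 ->", compose(120, 60))      # 180, as in the text
```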

By assuming that fundamental particles have internal structure, Michael Green at Queen Mary College and John Schwarz at Caltech made a remarkable series of discoveries which were announced in the journal NATURE in April 1985. They proposed that, if a point particle were replaced by a vibrating ‘string’ moving through a 10-dimensional spacetime, many of the problems plaguing SUSY GUTs seemed to vanish miraculously. What’s more, of all the possible kinds of ‘Superstring’ theories, there were only two (called SO(32) and E8 x E8′) that were: 1) consistent with both the principles of relativity and quantum mechanics, 2) allowed for the asymmetry between left and right-handed processes and, 3) were free of anomalies. Both versions were also found to have enough room in them for 496 different types of fields; enough to accommodate all of the known fundamental particles and then some! Superstring theories also have very few adjustable parameters, and from them certain quantum gravity calculations can be performed that give finite answers instead of infinite ones. In spite of their theoretical successes, Superstring theories suffer from the difficulty that the lightest Superstring particles will be completely massless while the next more massive generation will have masses of 10^19 GeV. It is not even clear how these supermassive string particles are related to the known particles, which are virtually massless by comparison (a proton has a mass of 1 GeV!). It is also not known if the 496 different particles will cover the entire mass range between 0 and 10^19 GeV. It is possible that they may group themselves into two families with masses clustered around these two extremes. In the latter instance, experimental physicists may literally run out of new particles to discover until accelerators powerful enough to create supermassive particles can be built.
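
The number 496 quoted above is the dimension of both gauge groups singled out by anomaly cancellation: SO(32), with 32·31/2 generators, and E8 x E8, with 248 + 248. A one-line check, assuming only those standard dimension formulas:

```python
# Dimension of SO(n) is n*(n-1)/2; the exceptional group E8 has dimension 248.
dim_SO32 = 32 * 31 // 2          # = 496
dim_E8xE8 = 248 + 248            # = 496
print(dim_SO32, dim_E8xE8)       # both 496, the number of gauge fields mentioned above
```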

An attractive feature of the SO(32) model, which represents particles as open-ended strings, is that gravity has to be included from the start in order to make the theory internally consistent and capable of yielding finite predictions. It is also a theory that reduces to ordinary point field theories at energies below 10^19 GeV. The complementary theory, E8 x E8′, is the only other superstring theory that seems to work as well as SO(32), and treats particles as though they were closed strings without bare endpoints. This model is believed to show the greatest promise for describing real physical particles. It also includes gravity, but unlike SO(32), E8 x E8′ does seem to reduce at low energy to the symmetry groups associated with the strong, weak and electromagnetic interactions, namely, SU(3), SU(2) and U(1).

If E8 x E8′ is destined to be the ‘ultimate, unified field theory’, there are some additional surprises in store for us. Each group, E8 and E8′, can be reduced mathematically to the products of the groups that represent the strong, weak and electromagnetic forces; SU(3) x SU(2) x U(1). If the E8 group corresponds to the known particles what does E8′ represent? In terms of its mathematical properties, symmetry considerations alone seem to require that the E8′ group should be a mirror image of E8. If E8 contains the groups SU(3), SU(2) and U(1) then E8′ contains SU(3)’, SU(2)’ and U(1)’. The primed fields in E8′ would have the same properties as those we ascribe to the strong, weak and electromagnetic forces. The E8′ particle fields may correspond to a completely different kind of matter, whose properties are as different from matter and anti-matter as ordinary matter is from anti-matter! ‘Shadow Matter’ as it has been called by Edward Kolb, David Seckel and Michael Turner at Fermilab, may actually co-exist with our own – possibly accounting for the missing mass necessary to close the universe. Shadow matter is only detectable by its gravitational influence and is totally invisible because the shadow world electromagnetic force (shadow light) does not interact with any of the particles in the normal world.

BEYOND SPACE AND TIME

The quest for a mathematical description of the physical world uniting the apparent differences between the known particles and forces has led physicists to the remarkable conclusion that the universe inhabits not just the 4 dimensions of space and time, but a much larger arena whose dimensionality may be enormous (see Does Space Have More Than 3 Dimensions?). Both the Superstring theories and SUSY GUTs agree that our physical world has to have more than the 4 dimensions we are accustomed to thinking about. A remarkable feature of Superstring theory is that of all the possible dimensionalities for spacetime, only in 10 dimensions (9 space dimensions and 1 time dimension) will the theory lead to a computationally finite and internally consistent model for the physical world that includes the weak interaction from the outset, and where all of the troublesome anomalies cancel exactly. In such a 10-dimensional world, it is envisioned that 6 dimensions are now wrapped up or ‘compactified’ into minuscule spheres that accompany the 4 coordinates of every point in spacetime. What would a description of the early universe look like from this new viewpoint? The 6 internal dimensions are believed to have a size of order 10^-33 cm.

As we follow the history of the universe back in time, the 3 large dimensions of space rapidly shrink until eventually they become only 10^-33 cm in extent. This happened during the Planck Era, at a time 10^-43 seconds after the Creation Event. The appearance of the universe under these conditions is almost unimaginable. Today, as we look out at the most distant quasars, we see them at distances of billions of lightyears. During the Planck Era, the matter comprising these distant systems was only 10^-33 cm away from the material that makes up your own body!

What was so special about this era that only 4 of the 10 dimensions were singled out to grow to their enormous present size? Why not 3 (2 space + 1 time) or 5 (4 space + 1 time)? Physicists have not yet been able to develop an explanation for this fundamental mystery of our plenum. On the other hand, it may just be that, had the dimensional breakdown of spacetime been other than ‘4 + 6’, the physical laws we are the products of would have been totally inhospitable to life as we know it.

As we relentlessly follow the history of the universe to even earlier times, the universe seems to enter a progressively more and more symmetric state. The universe at about 10^-35 seconds after the Big Bang may have been populated by supermassive particles with masses of 10^15 GeV or about 10^-13 grams each. These particles ultimately decayed into the familiar quarks and leptons once the universe had grown colder as it expanded. In addition, there may only have been a single kind of ‘superforce’ acting on these particles; a force whose character contained all of the individual attributes we now associate with gravity, electromagnetism and the strong and weak nuclear forces. Since the particles carrying the ‘superforce’ had masses similar to those of the supermassive particles co-existing then, the distinction between the force-carriers and the particles they act on probably broke down completely and the world became fully supersymmetric.

To go beyond the Planck Era may require a radical alteration in our conventional way of thinking about time and space. Only glimpses of the appropriate way to think about this multidimensional landscape can be found in the equations and theories of modern-day physics. Beyond the Planck Era, all 10 dimensions (and perhaps others) become co-equal, at least in terms of their physical size. The supermassive Superstring particles begin to take on more of the characteristics of fluctuations in the geometry of spacetime than of distinguishable ingredients in the primordial, cosmological ‘soup’. There was no single, unique geometry for spacetime but, instead, an ever-changing quantum interplay between spacetimes with an unlimited range in geometry. Like sound waves that combine with one another to produce interference and reinforcement, the spacetime that emerged from the Planck Era is thought to be the result of the superposition of an infinite number of alternate spacetime geometries which, when added together, produced the spacetime that we are now a part of.

Was there light? Since the majority of the photons were probably not created in large numbers until at least the beginning of the Inflationary Epoch, 10^-36 seconds after the Big Bang, it is not unthinkable that during its earliest moments, the universe was born out of darkness rather than in a blinding flash of light. All that existed in this darkness before the advent of light was an empty space out of which our 10-dimensional spacetime would later emerge. Of course, under these conditions it is unclear just how we should continue to think about time itself.

In terms of the theories available today, it may well be that the particular dimension we call Time had a definite zero point, so that we cannot even speak logically about what happened before time existed. The concept of ‘before’ is based on the presumption of time ordering. A traveler standing at the north pole can never move to a position on the earth that is 1 mile north of the north pole! Nevertheless, out of ingrained habit, we speak of the time before the genesis of the universe when time didn’t exist and ask, “What happened before the Big Bang?”. The list of physicists investigating this ‘state’ has grown enormously over the last 15 years, yet the number of physicists, worldwide, who publish research on this topic is still only slightly more than 200 out of a world population of 5 billion!

QUANTUM COSMOLOGY

In the early 1970’s, Y. Zel’dovich and A. Starobinsky of the USSR, along with Edward Tryon at Hunter College, proposed that the universe emerged from a fluctuation in the vacuum. This vacuum fluctuation ‘ran away’ with itself, creating all the known particles out of empty space at the ‘instant’ of no-time. To understand what this means requires the application of a fundamental fact of relativistic quantum physics discovered during the latter half of the 1920’s. Vacuum fluctuations are a direct consequence of Heisenberg’s Uncertainty Principle, which limits how well we can simultaneously know a particle’s momentum and location (or its total energy and lifetime). What we call empty space or the physical vacuum is a Newtonian fiction like absolute space and time. Rather than a barren stage on which matter plays out its role, empty space is known to be filled with ‘virtual particles’ that spontaneously appear and disappear beyond the ability of any physical measurement to detect directly. From these ghost particles, a variety of very subtle phenomena can be predicted with amazing accuracy. Depending on the total rest mass energy of the virtual particles created in the vacuum fluctuation, they may live for a specific lifetime before Heisenberg’s Uncertainty Principle demands that they vanish back into the nothingness of the vacuum state. In such a quantum world, less massive virtual particles can live longer than more massive ones. Edward Tryon proposed that the universe is just a particularly long-lived vacuum fluctuation, differing only in magnitude from those which occur imperceptibly all around us. The reason the universe is so long lived in spite of its enormous mass is that the positive energy latent in all the matter in the universe is offset by the negative potential energy of the gravitational field of the universe. The total energy of the universe is, therefore, exactly zero, and its maximum lifetime as a ‘quantum fluctuation’ could be enormous and even infinite! According to Tryon, “The Universe is simply one of those things which happens from time to time.”
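
The rule that less massive virtual particles can live longer follows from the energy-time uncertainty relation, Δt ≈ ħ/(2ΔE), with ΔE = 2mc² the energy borrowed from the vacuum to make a particle-antiparticle pair. A minimal numerical sketch, my own illustration in Python:

```python
# Maximum lifetime of a virtual particle-antiparticle pair of rest mass m (each):
#   delta_t ~ hbar / (2 * delta_E), with delta_E = 2 * m * c^2 borrowed from the vacuum.
hbar = 1.055e-34      # J s
c = 2.998e8           # m/s

def virtual_lifetime(mass_kg):
    return hbar / (2 * (2 * mass_kg * c**2))

m_electron = 9.109e-31    # kg
m_proton = 1.673e-27      # kg
print(f"virtual e+e- pair:   ~{virtual_lifetime(m_electron):.1e} s")   # ~3e-22 s
print(f"virtual p-pbar pair: ~{virtual_lifetime(m_proton):.1e} s")     # ~2e-25 s, shorter because heavier
```

Tryon's point is that a fluctuation whose total energy is exactly zero borrows nothing at all, so this lifetime limit places no bound on how long it can persist.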

This proposal by Tryon was regarded with some scepticism and even amusement by astronomers, and was not pursued much further. This was a fate that had also befallen the work on 5-dimensional general relativity by Theodore Kaluza and Oskar Klein during the 1920’s, which was only resurrected in the late 1970’s as a potent remedy for the ills plaguing supersymmetry theory.

In 1978, R. Brout, F. Englert, E. Gunzig and P. Spindel at the University of Brussels proposed that the fluctuation that led to the creation of our universe started out in an empty, flat, 4-dimensional spacetime. The fluctuation in space began weakly, creating perhaps a single matter-antimatter pair of supermassive particles with masses of 10^19 GeV. The existence of this ‘first pair’ stimulated the creation from the vacuum of more particle-antiparticle pairs, which stimulated the production of still others and so on. Space became highly curved and exploded, disgorging all of the superparticles, which later decayed into the familiar leptons, quarks and photons.

Heinz Pagels and David Atkatz at Rockefeller University in 1981 proposed that the triggering agent behind the Creation Event was a tunneling phenomenon of the vacuum from a higher-energy state to a lower-energy state. Unlike the Brout-Englert-Gunzig-Spindel model, which started from a flat spacetime, Pagels and Atkatz took the complementary approach that the original nothingness from which the universe emerged was a spatially closed, compact, empty space; in other words, it had a geometry like the 2-D surface of a sphere, but the dimensionality of its surface was much higher than 2. Again, this space contained no matter whatsoever. The characteristics (as yet unknown) of the tunneling process determined, perhaps in a random way, how the dimensionality of spacetime would ‘crystallize’ into the 6+4 combination that represents the plenum of our universe.

Alex Vilenkin at Tufts University proposed in 1983 that our spacetime was created out of a ‘nothingness’ so complete that even its dimensionality was undefined. In 1984, Stephen Hawking at Cambridge and James Hartle at UCSB came to a similar conclusion through a series of quantum mechanical calculations. They described the geometric state of the universe in terms of a wavefunction which specified the probability for spacetime to have one of an infinite number of possible geometries. A major problem with the ordinary Big Bang theory was that the universe emerged from a state where space and time vanished and the density of the universe became infinite; a state called the Singularity. Hawking and Hartle were able to show that this Big Bang singularity represented a specific kind of geometry which would become smeared out in spacetime due to quantum indeterminacy. The universe seemed to emerge from a non-singular state of ‘nothingness’ similar to the undefined state proposed by Vilenkin. The physicist Frank Wilczek expresses this remarkable situation best by saying that, “The reason that there is Something rather than Nothing is that Nothing is unstable.”

PERFECT SYMMETRY

Theories like those of SUSY GUTS and Superstrings seem to suggest that just a few moments after Creation, the laws of physics and the content of the world were in a highly symmetric state; one superforce and perhaps one kind of superparticle. The only thing breaking the perfect symmetry of this era was the definite direction and character of the dimension called Time. Before Creation, the primordial symmetry may have been so perfect that, as Vilenkin proposed, the dimensionality of space was itself undefined. To describe this state is a daunting challenge in semantics and mathematics, because the mathematical act of specifying its dimensionality would have implied the selection of one possibility from all others, thereby breaking the perfect symmetry of this state. There were, presumably, no particles of matter or even photons of light then, because these particles were born from the vacuum fluctuations in the fabric of spacetime that attended the creation of the universe. In such a world, nothing happens because all ‘happenings’ take place within the reference frame of time and space. The presence of a single particle in this nothingness would have instantaneously broken the perfect symmetry of this era, because there would then have been a favored point in space different from all others: the point occupied by the particle. This nothingness didn’t evolve either, because evolution is a time-ordered process. The introduction of time as a favored coordinate would have broken the symmetry too. It would seem that the ‘Trans-Creation’ state is beyond conventional description, because any words we may choose to describe it are inherently laced with the conceptual baggage of time and space. Heinz Pagels reflects on this ‘earliest’ stage by saying, “The nothingness ‘before’ the creation of the universe is the most complete void we can imagine. No space, time or matter existed. It is a world without place, without duration or eternity…”

A perusal of the scientific literature during the last 20 years suggests that we may be rapidly approaching a major crossroad in physics. One road seems to be leading to a single unification theory that is so unique among all others that it is the only one consistent with all the major laws we know about. It is internally consistent, satisfies the principles of relativity and quantum mechanics, and requires no outside information to describe the particles and forces it contains. A prototype of this may be superstring theory with its single adjustable parameter, namely, the string tension. The other road is much more bleak. It may also turn out that we will create several theoretical systems that seem to explain everything but have within them hard-to-detect flaws. These flaws may stand as barricades to further logical inquiry, to be uncovered only through experiments that may be beyond our technological reach. It is possible that we are seeing the beginning of this latter process even now, with the multiplicity of theories whose significant deviations only occur at energies near 10^19 GeV.

I find it very hard to resist the analogy between our current situation and that of the Grecian geometers. For 2000 years the basic postulates of Euclidean geometry, and the consequences of this logical system, remained fixed. It became a closed book with only a few people in the world struggling to find exceptions to it, such as refutations of the parallel line postulate. Finally, during the 19th century, non-Euclidean geometry was discovered and a renaissance in geometry occurred. Are physicists on the verge of a similar great age, finding themselves hamstrung by not being able to devise new ways of thinking about old problems? Egyptian cosmology was based on motifs that the people of that age could see in the world around them: water, sky, land, biological reproduction. Today we still use motifs that we find in Nature in order to explain the origin of the universe: the geometry of space, virtual particles and vacuum fluctuations. We can probably expect that in the centuries to follow, our descendants will find still other motifs and from them fashion cosmologies that will satisfy the demands of that future age with, possibly, much greater accuracy and efficiency than ours do today. Perhaps, too, in those future ages, scientists will marvel at the ingenuity of modern physicists and astronomers, and how in the space of only 300 years we had managed to create our own quaint theory as the Egyptians had before us.

In the meantime, physicists and astronomers do the best they can to fashion a cosmology that will satisfy the intellectual needs of our age. Today, as we contemplate the origin of the universe we find ourselves looking out over a dark, empty void not unlike the one that our Egyptian predecessors might have imagined. This void is a state of exquisite perfection and symmetry that seems to defy description in any linguistic terms we can imagine. Through our theories we launch mathematical voyages of exploration, and watch the void as it trembles with the quantum possibilities of universes unimaginable.

Einstein’s Fudge

Einstein’s Cosmic Fudge Factor

Written by Sten Odenwald
Copyright (C) 1991 Sky Publishing Corporation. Reprinted by permission. See the April 1991 issue.

Black holes…quarks…dark matter. It seems like the cosmos gets a little stranger every year. Until recently, the astronomical universe known to humans was populated by planets, stars, galaxies, and scattered nebulae of dust and gas. Now, theorists tell us it may also be inhabited by objects such as superstrings, dark matter and massive neutrinos — objects that have yet to be discovered if they exist at all!
As bizarre as these new constituents may sound, you don’t have to be a rocket scientist to appreciate the most mysterious ingredient of them all. It is the inky blackness of space itself that commands our attention as we look at the night sky; not the sparse points of light that signal the presence of widely scattered matter.

During the last few decades, physicists and astronomers have begun to recognize that the notion of empty space presents greater subtleties than had ever before been considered. Space is not merely a passive vessel to be filled by matter and radiation, but is a dynamic, physical entity in its own right.

One chapter in the story of our new conception of space begins with a famous theoretical mistake made nearly 75 years ago that now seems to have taken on a life of its own.

In 1917, Albert Einstein tried to use his newly developed theory of general relativity to describe the shape and evolution of the universe. The prevailing idea at the time was that the universe was static and unchanging. Einstein had fully expected general relativity to support this view, but, surprisingly, it did not. The inexorable force of gravity pulling on every speck of matter demanded that the universe collapse under its own weight.

His remedy for this dilemma was to add a new ‘antigravity’ term to his original equations. It enabled his mathematical universe to appear as permanent and invariable as the real one. This term, usually written as an uppercase Greek lambda, is called the ‘cosmological constant’. It has exactly the same value everywhere in the universe, delicately chosen to offset the tendency toward gravitational collapse at every point in space.

A simple thought experiment may help illustrate the nature of Lambda. Take a cubic meter of space and remove all matter and radiation from it. Most of us would agree that this is a perfect vacuum. But, like a ghost in the night, the cosmological constant would still be there. So, empty space is not really empty at all — Lambda gives it a peculiar ‘latent energy’. In other words, even Nothing is Something!

Einstein’s fudged solution remained unchallenged until 1922 when the Russian mathematician Alexander Friedmann began producing compelling cosmological models based on Einstein’s equations but without the extra quantity. Soon thereafter, theorists closely examining Einstein’s model discovered that, like a pencil balanced on its point, it was unstable to collapse or expansion. Later the same decade, Mount Wilson astronomer Edwin P. Hubble found direct observational evidence that the universe is not static, but expanding.

All this meant that the motivation for introducing the cosmological constant seemed contrived. Admitting his blunder, Einstein retracted Lambda in 1932. At first this seemed to end the debate about its existence. Yet decades later, despite the great physicist’s disavowal, Lambda keeps turning up in cosmologists’ discussions about the origin, evolution, and fate of the universe.

THEORY MEETS OBSERVATION

Friedmann’s standard ‘Big Bang’ model without a cosmological constant predicts that the age of the universe, t0, and its expansion rate (represented by the Hubble parameter, H0) are related by the equation t0 = 2/(3 H0). Some astronomers favor a value of H0 near 50 kilometers per second per megaparsec (one megaparsec equals 3.26 million light years). But the weight of the observational evidence seems to be tipping the balance towards a value near 100. In the Friedmann model, this implies that the cosmos can be no more than 7 billion years old. Yet some of our galaxy’s globular clusters have ages estimated by independent methods of between 12 and 18 billion years!
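
As a check on the arithmetic, here is a minimal sketch (mine, not part of the original article) of the age relation quoted above, t0 = 2/(3 H0), evaluated for the two competing values of the Hubble parameter.

# Age of a Lambda-free, matter-dominated Friedmann universe: t0 = 2 / (3 * H0).
# The unit conversions below are standard; the choice of H0 values follows the text.
KM_PER_MEGAPARSEC = 3.086e19          # kilometers in one megaparsec
SECONDS_PER_BILLION_YEARS = 3.156e16  # seconds in one billion years

def friedmann_age_in_billions_of_years(h0_km_per_s_per_mpc):
    """Age of the model universe for a given Hubble parameter, in billions of years."""
    h0_per_second = h0_km_per_s_per_mpc / KM_PER_MEGAPARSEC  # convert H0 to units of 1/second
    return (2.0 / (3.0 * h0_per_second)) / SECONDS_PER_BILLION_YEARS

print(friedmann_age_in_billions_of_years(50))    # about 13 billion years
print(friedmann_age_in_billions_of_years(100))   # about 6.5 billion years, the "no more than 7" quoted above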

When the Lambda term is included in the equations, this discrepancy can be eased. Now a large value for the Hubble parameter can be attributed in part to “cosmic repulsion”. This changes the relationship between t0 and H0, so that for a given expansion rate the universe is older than predicted by the Friedmann model.

In one formulation of Einstein’s equation, Lambda is expressed in units of matter density. This means we can ask how the cosmological constant, if it exists at all, compares with the density of the universe in the forms of stars and galaxies.

So far, a careful look at the available astronomical data has produced only upper limits to the magnitude of Lambda. These vary over a considerable range – from about 10 percent of ordinary matter density to several times that density.

The cosmological constant can also leave its mark on the properties of gravitational lenses and faint galaxies. One of the remarkable features of Einstein’s theory of general relativity is its prediction that space and time become deformed or ‘warped’ in the vicinity of a massive body such as a planet, star or even a galaxy. Light rays passing through such regions of warped “space-time” have their paths altered. In the cosmological arena, nearby galaxies can deflect and distort the images of more distant galaxies behind them. Sometimes, the images of these distant galaxies can appear as multiple images surrounding the nearby ‘lensing’ galaxy.

At Kyoto University, M. Fukugita and his coworkers predicted that more faint galaxies and gravitational lenses will be detected than in a Friedmann universe if Lambda is more than a few times the matter density. Edwin Turner, an astrophysicist at Princeton University, also reviewed the existing, scant data on gravitational lenses and found that they were as numerous as expected for Lambda less than a few times the matter density. By the best astronomical reckoning, Lambda is probably not larger than the observed average matter density of the universe. For that matter, no convincing evidence is available to suggest that Lambda is not exactly equal to zero. So why not just dismiss it as an unnecessary complication? Because the cosmological constant is no longer, strictly, a construct of theoretical cosmology.

NOTHING AND EVERYTHING

To understand how our universe came into existence, and how its various ingredients have evolved, we must delve deeply into the fundamental constituents of matter and the forces that dictate how it will interact. This means that the questions we will have to ask will have more to do with physics than astronomy. Soon after the big bang, the universe was at such a high temperature and density that only the details of matter’s composition (quarks, electrons etc) and how they interact via the four fundamental forces of nature were important. They represented the most complex collections of matter in existence, long before atoms, planets, stars and galaxies had arrived on the scene.

For two decades now, physicists have been attempting to unify the forces and particles that make up our world – to find a common mathematical description that encompasses them all. Some think that such a Theory of Everything is just within reach. It would account not only for the known forms of matter, but also for the fundamental interactions among them: gravity, electromagnetism, and the strong and weak nuclear forces.

These unification theories are known by a variety of names: grand unification theory, supersymmetry theory and superstring theory. Their basic claim is that Nature operates according to a small set of simple rules called symmetries.

The concept of symmetry is at least as old as the civilization of ancient Greece, whose art and architecture are masterworks of simplicity and balance. Geometers have known for a long time that a simple cube can be rotated 90 degrees without changing its outward appearance. In two dimensions, equilateral triangles look the same when they are rotated by 120 degrees. These are examples of the geometric concept of Rotation Symmetry.

There are parallels to geometric symmetry in the way that various physical phenomena and qualities of matter express themselves as well. For example, the well-known principle of the Conservation of Energy is a consequence of the fact that when some collections of matter and energy are examined at different times, they each have precisely the same total energy, just as a cube looks the same when it is rotated in space by a prescribed amount. Symmetry under a ‘shift in time’ is as closely related to the Conservation of Energy as is the symmetry of a cube when rotated by 90 degrees.

Among other things, symmetries of Nature dictate the strengths and ranges of the natural forces and the properties of the particles they act upon. Although Nature’s symmetries are hidden in today’s cold world, they reveal themselves at very high temperatures and can be studied in modern particle accelerators.

The real goal in unification theory is actually two-fold: not only to uncover and describe the underlying symmetries of the world, but to find physical mechanisms for ‘breaking’ them at low energy. After all, we live in a complex world filled with a diversity of particles and forces, not a bland world with one kind of force and one kind of particle!

Theoreticians working on this problem are often forced to add terms to their equations that represent entirely new fields in Nature. The concept of a field was invented by mathematicians to express how a particular quantity may vary from point to point in space. Physicists since the 18th century have adopted this idea to describe quantitatively how forces such as gravity and magnetism change at different distances from a body.

The interactions of these fields with quarks, electrons and other particles cause symmetries to break down. These fields are usually very different from those we already know about. The much sought-after Higgs field, for example, plays a central role in the unified theory of the electromagnetic and weak nuclear forces developed by Sheldon Glashow, Abdus Salam and Steven Weinberg.

Prior to their work, the weak force causing certain particles to decay, and the electromagnetic force responsible for the attraction between charged particles and the motion of compass needles, were both considered to be distinct forces in nature. By combining their mathematical descriptions into a common language, they showed that this distinction was not fundamental to the forces at all! A new field in nature called the Higgs field makes these two forces act differently at low temperature. But at temperatures above 1000 trillion degrees, the weak and electromagnetic forces become virtually identical in the way that they affect matter. The corresponding particles, called Higgs bosons, not only cause the symmetry between the electromagnetic and weak forces to be broken at low temperature, but they are also responsible for conferring the property of mass on particles such as the electrons and the quarks!

There is, however, a price that must be paid for introducing new fields into the mathematical machinery. Not only do they break symmetries, but they can also give the vacuum state an enormous latent energy that, curiously, behaves just like Lambda in cosmological models.

The embarrassment of having to resurrect the obsolete quantity Lambda is compounded when unification theories are used to predict its value. Instead of being at best a vanishingly minor ingredient to the universe, the predicted values are in some instances 10 to the power of 120 times greater than even the most generous astronomical upper limits!

It is an unpleasant fact of life for physicists that the best candidates for the Theory of Everything always have to be fine-tuned to get rid of their undesirable cosmological consequences. Without proper adjustment, these candidates may give correct predictions in the microscopic world of particle physics, but predict a universe which on its largest scales looks very different from the one we inhabit.

Like a messenger from the depths of time, the smallness – or absence – of the cosmological constant today is telling us something important about how to craft a correct Theory of Everything. It is a signpost of the way Nature’s symmetries are broken at low energy, and a nagging reminder that our understanding of the physical world is still incomplete in some fundamental way.

A LIKELY STORY

Most physicists expect the Theory of Everything will describe gravity the same way we now describe matter and the strong, weak and electromagnetic forces – in the language of quantum mechanics. Gravity is, after all, just another force in Nature. So far this has proven elusive, due in part to the sheer complexity of the equations of general relativity. Scientists since Einstein have described gravity ( as well as space and time) in purely geometric terms. Thus we speak of gravity as the “curvature of space-time”.

To achieve complete unification, the dialects of quantum matter and geometric space have to be combined into a single language. Matter appears to be rather precisely described in terms of the language of quantum mechanics. Quarks and electrons exchange force-carrying particles such as photons and gluons and thereby feel the electromagnetic and strong nuclear forces. But gravity is described by Einstein’s theory of general relativity as a purely geometric phenomenon. These geometric ideas of curvature and the dimensionality of space have nothing to do with quantum mechanics.

To unify these two great foundations of physics, a common language must be found. This new language will take some getting used to. In it, the distinction between matter and space dissolves away and is lost completely; matter becomes a geometric phenomenon, and at the same time, space becomes an exotic form of matter.

Beginning with work on a quantum theory of gravity by John Wheeler and Bryce DeWitt in the 1960’s, and continuing with the so-called superstring theory of John Schwarz and Michael Green in the 1980’s, a primitive version of such a ‘quantum-geometric’ language is emerging. Not surprisingly, it borrows many ideas from ordinary quantum mechanics.

A basic concept in quantum mechanics is that every system of elementary particles is defined by a mathematical quantity called a wave function. This function can be used, for example, to predict the probability of finding an electron at a particular place and time within an atom. Rather than a single quantity, the wave function is actually a sum over an infinite number of factors or ‘states’, each representing a possible measurement outcome. Only one of these states can be observed at a time.

By direct analogy, in quantum gravitation, the geometry of space-time, whether flat or curved, is only one of an infinite variety of geometric shapes for space-time, and therefore the universe. All of these possibilities are described as separate states in the wave function for the universe.

But what determines the probability that the universe will have the particular geometry we now observe out of the infinitude of others? In quantum mechanics, the likelihood that an electron is located somewhere within an atom is determined by the external electric field acting on it. That field is usually provided by the protons in the atomic nucleus. Could there be some mysterious field ‘outside’ our universe that determines its probability?

According to Cambridge University theorist Stephen Hawking, this is the wrong way to look at the problem. Unlike the electron acted upon by protons, our universe is completely self-contained. It requires no outside conditions or fields to help define its probability. The likelihood that our universe looks the way it does depends only on the strengths of the fields within it.

Among these internal fields, there may even be ones that we haven’t yet discovered. Could the cosmological constant be the fingerprint in our universe of a new ‘hidden’ field in Nature? This new field could affect the likelihood of our universe just as a kettle of soup may contain unknown ingredients although we can still precisely determine the kettle’s mass.

A series of mathematical considerations led Hawking to deduce that the weaker the hidden field becomes, the smaller will be the value we observe for the cosmological constant, and surprisingly, the more likely will be the current geometry of the universe.

This, in turn, implies that if Lambda were big enough for astronomers to measure in the first place, our universe would be an improbable one. Philosophically, this may not trouble those who see our cosmos as absolutely unique, but in a world seemingly ruled by probability, a counter view is also possible. There may, in fact, exist an infinite number of universes, but only a minority of them have the correct blend of physical laws and physical conditions resembling our life-nurturing one.

Hawking continued his line of speculation by suggesting that, if at the so-called Planck scale of 10 to the power of -33 centimeters the cosmos could be thought of as an effervescent landscape, or “space-time foam”, then perhaps a natural mechanism could exist for eliminating the cosmological constant for good.

One of the curiosities of combining the speed of light and Newton’s constant of gravitation from general relativity, with Planck’s constant from quantum mechanics, is that they can be made to define unique values for length, time and energy. Physicists believe that at these Planck scales represented by 10 to the power of -33 centimeters and 10 to the power of -43 seconds, general relativity and quantum mechanics blend together to become a single, comprehensive theory of the physical world: The Theory Of Everything. The energy associated with this unification, 10 to the power of 19 billion electron volts, is almost unimaginably big by the standards of modern technology.
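
As a minimal sketch (my own evaluation of the standard formulas, not figures taken from the article), here is how the speed of light, Newton's constant and Planck's constant combine into the unique length, time and energy described above.

# Planck units built from c, G and hbar. The constants are standard values;
# the rounding in the comments is approximate.
import math

C    = 2.99792458e8       # speed of light, meters per second
G    = 6.67430e-11        # Newton's constant of gravitation, m^3 kg^-1 s^-2
HBAR = 1.054571817e-34    # Planck's constant over 2*pi, joule-seconds

planck_length = math.sqrt(HBAR * G / C**3)   # about 1.6e-35 meters, i.e. 10^-33 centimeters
planck_time   = math.sqrt(HBAR * G / C**5)   # about 5.4e-44 seconds, i.e. 10^-43 seconds
planck_energy = math.sqrt(HBAR * C**5 / G)   # about 2e9 joules

JOULES_PER_GEV = 1.602176634e-10
print(planck_length, planck_time, planck_energy / JOULES_PER_GEV)   # the energy is about 1.2e19 GeV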

The universe itself, soon after the Big Bang, must also have passed through such scales of space, time and energy during its first instants of existence. Cosmologists refer to this period as the Planck Era. It marks the earliest times that physicists are able to explore the universe’s physical state without having a complete Theory of Everything to guide them.

WORMHOLES

Harvard University physicist Sidney Coleman has recently pursued this thought to a possible conclusion. Instead of some mysterious new field in Nature, maybe the Lambda term appears in our theories because we are using the wrong starting model for the geometry of space at the Planck scale.

Previous thinking on the structure of space-time had assumed that it behaved in some sense like a smooth rubber sheet. Under the action of matter and energy, space-time could be deformed into a variety of shapes, each a possible geometric state for the universe. Nearly all candidates for the Theory of Everything embed their fields and symmetries in such a smooth geometrical arena.

But what if space-time were far more complicated? One possibility is that ‘wormholes’ exist, filling space-time with a network of tunnels. The fabric of space-time may have more in common with a piece of Swiss cheese than with a smooth rubber sheet.

According to Coleman, the addition of wormholes to space-time means that, like the ripples from many stones tossed into a pond, one geometric state for the universe could interfere with another. The most likely states ( or the biggest ripples) would win out. The mathematics suggest that quantum wormhole interference at the Planck scale makes universes with cosmological constants other than zero exceedingly unlikely.

How big would wormholes have to be to have such dramatic repercussions? Surprisingly, the calculations suggest that small is beautiful. Wormholes the size of dogs and planets would be very rare. Universes containing even a few of them would exist with a vanishingly low probability. But wormholes smaller than 10 to the power of -33 centimeters could be everywhere. A volume the size of a sugar cube might be teeming with uncounted trillions of them flashing in and out of existence!

Coleman proposes that it is the action of these previously ignored mini-wormholes upon the geometric fabric of the universe that forces Lambda to be almost exactly zero. Like quantum ‘Pac Men’, they gobble up all the latent energy of space-time that would otherwise have appeared to us in the form of a measurable cosmological constant!

The addition of wormholes to the description of space-time admits the possibility that our universe did not spring into being aloof and independent, but was influenced by how other space-times had already evolved – ghostly mathematical universes with which we can never communicate directly.

The most likely of these universes had Lambda near zero, and it is these states that beat out all other contenders. In a bizarre form of quantum democracy, our universe may have been forced to follow the majority, evolving into the high probability state we now observe, without a detectable cosmological constant.

EPILOG

Wormholes? Wave functions? Hidden fields? The answer to the cosmological constant’s smallness, or absence, seems to recede into the farthest reaches of abstract thinking, faster than most of us can catch up.

As ingenious as these new ideas may seem, the final pages in this unusual story have probably not been written, especially since we can’t put any of these ideas to a direct test. It is a tribute to Einstein’s genius that even his ‘biggest blunder’ made near the beginning of this century still plagues physicists and astronomers as we prepare to enter the 21st century. Who would ever have thought that something that may not even exist would lead to such enormous problems!

The Planck Era

The Planck Era

Written by Sten Odenwald. Copyright (C) 1984 Kalmbach Publishing. Reprinted by permission

The Big Bang theory says that the entire universe was created in a tremendous explosion about 20 billion years ago. The enormity of this event is hard to grasp and it seems natural to ask ourselves ‘What was it like then?’ and ‘What happened before the Big Bang?’. To try to answer these queries, let’s take a brief journey backwards in time.
We first see the formation of our own sun about 15 billion years after the Big Bang and then, by 5 billion years, the formation of the first galaxies. By 700,000 years, the universe is awash with the fireball radiation that keeps all matter at a temperature of 4,000 degrees. Because of this, darkness is completely absent since every point in the sky glows with the brilliance of the sun. No stars, planets or even dust grains exist, just a hot dense plasma of electrons, protons and helium nuclei. By 3 minutes, we see helium form from the fusion of hydrogen nuclei while the universe seethes at a temperature of nearly 1 billion degrees. The average density of matter is that of lead. By 1 second, the Lepton Era ends and the ratio of neutrons to protons has become fixed at 1 neutron for every 5 protons. The temperature is now 5 billion degrees everywhere. At about .0001 second, we watch as the Quark Era ends and the temperature of the fireball radiation rises to an incredible 1 trillion degrees. Quarks, for the first time, can combine in groups of two and three to become neutrons, protons and other types of heavy particles. The universe is now packed with matter as densely as the nucleus of an atom. A mountain like Mt. Everest could be squeezed into a volume no greater than the size of a golf ball!

By 1 billionth of a second, the temperature is 1 thousand trillion degrees and we see the electromagnetic and weak forces merge into one force. The density of the universe has increased to the point where the entire earth could be contained in a thimble. Quarks and anti-quarks are no longer confined inside of particles like neutrons and protons but are now part of a superheated plasma of unbound particles. As the remaining history of the universe unfolds, a long period seems to pass when nothing really new happens. Then, at a time 10(-35) second after the Big Bang, a spectacular change in the size of the universe occurs. This is the GUT Era when the strong nuclear force becomes distinguishable from the weak and electromagnetic forces. The temperature is an incredible 10 thousand trillion trillion degrees and the density of matter has soared to nearly 10(75) gm/cm3. This number is so enormous that even our analogies are almost beyond comprehension. At these densities, the entire Milky Way galaxy could easily be stuffed into a volume no larger than a single hydrogen atom! Electrons and quarks, together with their anti-particles, were the major constituents of matter, and very massive particles called Leptoquark Bosons caused the quarks to decay into electrons and vice versa. If we now move forward in time we would witness the vacuum of space undergoing a ‘phase transition’ from a higher energy state to a lower energy state. This is analogous to a ball rolling down the side of a mountain and coming to rest in the lowest valley. As the universe ‘rolls downhill’ it begins a brief but stupendous period of expansion. The universe swells to billions of times its former size in almost no time at all.

In addition to this, a slight excess of matter over anti-matter appears because of the decay of massive particles called X Higgs Bosons. As we continue to watch the universe age, the remaining pairs of particles and anti-particles find themselves and vanish in a tremendous burst of annihilation. From this paroxysm, the bulk of the fireball radiation that we now observe is born.

The GUT Era is the last stop in our fanciful journey through time. If we had asked what it was like before the GUT Era, we would immediately have entered a vast no man’s land where few indisputable facts would serve to guide us. What does seem clear is that gravity is destined to grow in importance, eventually becoming the dominant force acting between particles, even at the microscopic level.

GRAVITY

According to theories developed since the 1930’s, what we call a ‘force’ is actually a collective phenomenon caused by the exchange of innumerable, force-carrying particles called gauge bosons. The electromagnetic force, which causes like charges to repel and dissimilar ones to attract, is transmitted by gauge bosons called photons, the strong force that binds nuclei together is transmitted by gluons, and the weak force which causes particles to decay is transmitted by the recently discovered W and Z Intermediate Vector Bosons. In an analogous way, physicists believe that gravity is transmitted by particles called Gravitons. If gravity really does have such a quantum property, its effects should appear once quarks and electrons can be forced to within 10(-33) centimeter of one another, a distance called the Planck length. To achieve these conditions, quarks and electrons will have to be collided at energies of 10(19) GeV. An accelerator patterned after the 2-mile Stanford Linear Accelerator would have to be 1 light-year in length to push particles to these incredible energies! Fortunately, what humans find impossible to do, Nature with its infinite resources finds less difficult. Before the universe was 10(-43) second old, matter routinely experienced collisions at these energies. This period is what we call the Planck Era.

THROUGH A LOOKING GLASS, DARKLY

Since our technology will not allow us to physically reproduce the conditions during these ancient times, we must use our mathematical theories of how matter behaves to mentally explore what the universe was like then. We know that the appearance of the universe before 10(-43) second can only be adequately described by modifying the Big Bang theory because this theory is, in turn, based on the General Theory of Relativity. General Relativity tells us how gravity operates on the macroscopic scale of planets, stars and galaxies. At the Planck scale, we need to extend General Relativity so that it includes not only the macroscopic properties of gravity but also its microscopic characteristics as well. The theory of ‘Quantum Gravity’ is still far from completion, but physicists tend to agree that, at the very least, Quantum Gravity must combine the conceptual elements of the two great theories of modern physics: General Relativity and Quantum Mechanics.

In the language of General Relativity, gravity is a consequence of the deformation of space caused by the presence of matter and energy. Gravity is just another name for the amount of curvature in the geometry of 3-dimensional space. In Quantum Gravity theory, gravity is produced by massless gravitons, so that gravitons now represent individual packages of curved space that travel through space at the speed of light.

The appearance and disappearance of innumerable gravitons gives the geometry of space a very lumpy and dynamic appearance. John Wheeler at Princeton University thinks of this as a foamy sub-structure to space, where the geometry of space twists and contorts so that far-flung regions of space may suddenly find themselves connected by ‘wormholes’ which constantly appear and disappear within 10(-43) seconds. Even as you are reading this article, this frenetic activity is occurring in the hyper-microscopic domain, 100 billion billion times smaller than the nucleus of an atom. For a comparison, the size of the sun and the size of a single atom stand in about this same proportion. Although Quantum Gravity effects are completely undetectable today at the atomic and nuclear scale, during the Planck Era, macroscopic and microscopic worlds merged and the Quantum Gravity of the microcosm suddenly became the Quantum Cosmology of the macrocosm!

QUANTUM COSMOLOGY

As we approach the end of the Planck Era, the random appearance and disappearance of innumerable gravitons will eventually force us to give up the concept of a specific geometry to 3-dimensional space. Instead, the geometry at a given moment will have to be thought of as an average over all 3-dimensional space geometries that are possible. Once again, the reason for this is that particles are squeezed so closely together that we can now see individual gravitons moving around in the space between them, causing space to become curved. We can no longer get away with saying that the space between two quarks, for example, is flat, which is what we mean when we say that the gravitational force between them is insignificant when compared to the other three forces of Nature.

To make matters much worse, not only will Quantum Gravity not allow us to calculate the exact 3-dimensional geometry of space but, at the Planck scale, it will not allow us to simultaneously determine its exact geometry and precise rate of change in time. What this means is that we may never be able to calculate with any certainty exactly what the history of the universe was like before 10(-43) second. Today, the large-scale geometry of space is one of three possible types: flat and infinite, negatively curved and infinite, or positively curved and finite. During the Planck Era, the ‘large-scale’ geometry was contorted by wormholes, and an infinite number of geometries was possible. To probe the history of the universe then would be like trying to trace your ancestral roots if every human being on earth had a possibility of being one of your parents. Now try to trace your family tree back a few generations! The farther back in time you go, the greater are the number of possible ancestors you could have had. An entirely new conception of what we mean by ‘a history for the universe’ will have to be developed. Even the concepts of space and time will have to be completely re-evaluated in the face of the quantum fluctuations of spacetime during the Planck Era!

THE BIRTH OF THE UNIVERSE

The picture that seems to emerge from using our sketchy outline of what Quantum Gravity theory might look like is that as we approach the Planck Era, gravitons are exchanged between quarks and electrons with increasingly higher energy and in greater number. By the time we reach the end of the Planck Era at 10(-43) second, gravitons will begin to carry as much energy as the other force carriers (gluons, IVBs and photons). At still earlier times, a period of complete symmetry and unification between all the natural forces will ensue. Only one super-unified force exists here (gravity) and only one kind of particle dominates the activity of this age (gravitons).

During the early 70’s, the Russian physicists Ya. Zel’dovich and A. Starobinsky of the USSR Academy of Sciences proposed that the rapidly changing geometry of space during the Planck Era may actually have created all the matter, anti-matter and radiation that existed soon after Creation. In their picture of Creation, the rapidly changing geometry of space created particles and anti-particles with masses of 10(19) GeV. This production of matter and anti-matter removed energy from the enormous fluctuations occurring in the geometry of space and eventually succeeded in damping them out altogether by the end of the Planck Era. They also found that the rate of particle creation increased as more and more particles were created.

Several recent studies by physicists Edward Tryon of Hunter College, R. Brout, F. Englert and E. Gunzig of the University of Brussels, and David Atkatz and Heinz Pagels of the Rockefeller University have shed additional light on what Creation may have been like. Imagine, if you can, nothing at all! This is the primordial vacuum of space. There is complete darkness here; no light yet exists. The number of dimensions to space was probably not the normal 3 that we are so accustomed to but may have been as high as 11 according to Supergravity theory! In this infinite emptiness, random fluctuations occurred that ever so slightly changed the energy of the vacuum at various points in space. Eventually, one of these fluctuations attained a critical energy and began to grow. As it grew, very massive particles called leptoquarks and anti-leptoquarks were created, causing the expansion to accelerate. This is much like a ball rolling down a hill that moves slowly at first and then gains momentum. The expansion of the proto-universe, in turn, caused still more leptoquarks to be created. This furious cycle continued until, at long last, the leptoquarks decayed into quarks, leptons (electrons, muons etc) and their anti-particles, and the universe emerged from the Planck Era. Particle creation stopped once the fluctuations in the geometry of space subsided.

So, we are left with the remarkable possibility that, in the beginning, there existed quite literally nothing at all, and from it emerged nearly all of the matter and radiation that we now see. This process has been described by the physicist Frank Wilczek at the University of California, Santa Barbara by saying, “The reason that there is something instead of nothing is that nothing is unstable”. A ball sitting on the summit of a steep hill needs but the slightest tap to set it in motion. A random fluctuation in space was apparently all that was required to unleash the incredible latent energy of the vacuum, thus creating matter and energy and an expanding universe from ‘nothing at all’.

The universe did not spring into being instantaneously but was created a little bit at a time in a ‘bootstrap’ process. Once a few particles were created by quantum fluctuations of the empty vacuum, it became easier for a few more to appear and so, in a rapidly escalating process, the universe gushed forth from nothingness.

How long did this take? The primordial vacuum could have existed for an eternity before the particular fluctuation that gave rise to our universe happened. Physicist Edward Tryon expresses this best by saying that “Our universe is simply one of those things that happens from time to time”.

The principles of Quantum Gravity may ultimately force us to reconsider questions like ‘What happened before the Big Bang?’ because they imply the existence of something (time) that may not have any meaning at all. These questions may be as empty of meaning as an explorer on the north pole asking, ‘Which way is North?’. Only the complete theory of Quantum Gravity may tell us how to ask the right questions!

What is Space? Part II

Space-Time: The Final Frontier

Written by Sten Odenwald. Copyright (C) 1995 Sky Publishing Corporation. See February 1996 issue.
THE NIGHT SKY, when you think about it, is one of the strangest sights imaginable. The pinpoint stars that catch your eye are all but swallowed up by the black nothingness of space – an entity billions of light-years deep with which we here on Earth have no direct experience.
What is empty space, really? At first the question seems silly. There’s nothing to it! But look again in light of what modern physics knows and suspects, and the nature of space emerges as one of the most important “sleeper” issues growing for the last 50 years. “Nature abhors a vacuum,” proclaimed Aristotle more than 2,300 years ago. Today physicists are discovering that this is true in ways the ancient Greeks could never have imagined.

True, the cosmos consists overwhelmingly of vacuum. Yet vacuum itself is proving not to be empty at all. It is much more complex than most people would guess. “But surely,” you might ask, “if you take a container and remove everything from inside it – every atom, every photon – there will be nothing left?” Not by a long shot. Since the 1920s physicists have recognized that on a microscopic scale, the vacuum itself is alive with activity. Moreover, this network of activity may extend right down to include the very structure of space-time itself. The fine structure of the vacuum may ultimately hold the keys to some of the deepest questions facing physics – from why elementary particles have the properties they do, to the cause of the Big Bang and the likelihood of other universes outside our own.

THINGS THAT GO BUMP IN THE DARK

The state of the art in physics – our deepest current understanding of the world – is embodied in the so-called Standard Model, in which all matter and forces are accounted for by an astonishingly few types of particles (see Sky & Telescope – December 1987, page 582). Six quarks and six leptons make up all possible forms of matter. In practice just two of the quarks (the up and down) and one lepton (the electron) account for everything in the world except for a few whiffs of exotica known only to high-energy physicists. The 12 particles of matter (and their 12 corresponding particles of antimatter, or antiparticles) are acted upon by “messenger particles” that carry all the known forces. The photon mediates the electromagnetic force, including all the familiar chemical and structural forces around us on Earth. The members of the gluon family carry the strong force that binds neutrons and protons together in atomic nuclei. The W+, W-, and Z0 mediate the weak nuclear force, and the as-yet-undiscovered graviton is believed to carry the force of gravity.

Every possible event involving the 12 matter particles can be completely explained as an exchange of messenger particles. During some of these events, for example when electrons accelerate in a radio-transmitter antenna, messenger particles (in this case photons) materialize and travel through space. At other times, however, the messengers remain almost entirely hidden within the interacting system. When the messengers exist in this hidden form, they are called “virtual particles.” Virtual particles may seem ghostly and unreal by everyday standards. But real they are. Moreover, they are not limited to their role of mediating interactions. Virtual particles can also pop in and out of empty space all by themselves.

Quantum mechanics, the rulebook of the Standard Model, states as a bedrock principle that you need a certain length of time to measure a particle’s energy or mass to a given degree of accuracy. The shorter the observation time, the more uncertain the measurement. If the time is very brief, the uncertainty becomes larger than the particle’s entire mass, and you cannot say whether or not the particle is there at all. The lighter the particle, the longer its uncertainty time. In the case of an electron-positron pair, the uncertainty time scale is about 10^-21 seconds.

On time scales shorter than this, virtual electrons and positrons can, and do, pop in and out of nothingness like peas in a shell game. It’s as if, just because you can’t say a particle doesn’t exist when you look very briefly, then in a sense it does. This is not mere theorizing. In 1958 a tabletop experiment demonstrated the “Casimir effect,” measuring the force caused by virtual particles appearing and vanishing in total vacuum through the attraction they caused between two parallel metal plates. If the vacuum were truly empty the plates should not have attracted, but the incessant dance of virtual particles in the space between them produces a detectable effect.
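
For a sense of scale, here is a minimal sketch of the standard parallel-plate Casimir formula; the formula and the numbers are my own illustration, not figures given in the article. The attractive pressure between two ideal plates a distance d apart is pi^2 * hbar * c / (240 * d^4).

# Casimir pressure between ideal parallel conducting plates.
# This is the textbook idealized result; real experiments involve corrections.
import math

HBAR = 1.054571817e-34   # Planck's constant over 2*pi, joule-seconds
C    = 2.99792458e8      # speed of light, meters per second

def casimir_pressure(separation_meters):
    """Attractive pressure, in newtons per square meter, for plates a given distance apart."""
    return math.pi**2 * HBAR * C / (240.0 * separation_meters**4)

print(casimir_pressure(1e-6))   # about 0.0013 N/m^2 for a one-micron gap
print(casimir_pressure(1e-7))   # about 13 N/m^2 at a tenth of a micron: the force grows steeply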

Every particle – matter as well as messenger – seems to display a virtual form, each seething in greater or lesser abundances in what physicists call the “physical vacuum.” When it comes to affecting the ordinary world, moreover, virtual particles may do much more than just mediate forces. Some, in fact, may cause matter to have the property we call mass. The electron is the simplest of matter particles. Our knowledge of the physical world rests upon a solid understanding of its properties. Yet despite its abundance in the circuitry around us, the electron harbors an enigma. The fact that it has mass cannot be explained in the Standard Model, at least the parts of it that have been experimentally verified. More than 30 years ago particle physicist Peter Higgs suggested that the existence of mass has to do with a new ingredient of nature that is now called the Higgs field, which provides a new type of messenger particle that interacts with the electron to make it “weigh.”

The Higgs field has yet to be discovered, but many physicists expect it to exist everywhere in the physical vacuum, ensuring through its interactions with electrons and other particles that they will display mass. Even now, particle accelerators at CERN in Switzerland and at Fermilab near Chicago are straining at their maximum capabilities to cause just one “Higgs boson,” the presumed messenger particle for this field, to break loose from the vacuum and leave a detectable trace. Success would provide a triumphant completion of the Standard Model.

So to answer our question about whether a container of empty space is truly empty, the best anyone can do is remove the normal, physical particles that nature allows us to see and manipulate. The virtual particles can never be evicted. And in addition there may exist the ever-present Higgs field.

QUANTUM GRAVITY

For most of this century, physicists have struggled to bring gravity into the scheme of forces that are mediated by virtual messenger particles. To put this another way, the theory of general relativity, which shows the force of gravity to be a curvature of space-time, needs to be integrated with quantum mechanics, which shows forces to be virtual particle exchanges. Working on the assumption that such a marriage is possible, physicists named gravity’s messenger particle the graviton. But general relativity requires that gravitons be more than just quanta of gravity. In essence, gravitons define the structure of space-time itself.

The reconciliation of quantum mechanics and general relativity may lead us to dramatically new notions of the nature of space and time. Some theorists have suggested that points in space-time become defined only when a particle (such as a graviton or photon) interacts with other particles. In this view, what they are doing between interactions is a nonphysical question, since only an interaction defines a measurable time and place. Gravitational forces (and thus gravitons) exert an influence at distances much larger than the subatomic realm, as anyone who has fallen down a flight of stairs can attest. But only at an extremely small scale – the Planck length of 10^-33 centimeters – does the quantum nature of gravity become important.

Suppose you could magically look through a microscope that magnified an atomic nucleus to be some 10 light-years across. Under this magnification the smallest gravitons – that is, the most energetic and massive ones – would be about a millimeter in size. Here we might see a strange world in which space-time itself was defined by gravitons intersecting and looping around each other. In a similar vein, Roger Penrose has suggested that the gravitational field and space-time are built up from still more primitive mathematical entities called twistors, and that “ultimately the [space-time] concept may possibly be eliminated from the basis of physical theory altogether.” In essence, space and time become factored out as less-than-fundamental parts of the physical world.

In such a view, only the interactions between twistors, or perhaps gravitons, define when and where space-time is and is not. Are there gaps in the physical vacuum, voids of true and absolute nothing where space and time themselves do not exist?

Another viewpoint on the structure of space-time is offered by “superstring theory.” String theories posit that the fundamental objects of nature are one-dimensional lines rather than points; the “elementary” particles we measure are only oscillations of these strings. Superstring theory only seems to work, however, if space-time has not just four dimensions (three of space and one of time), but 10 dimensions. This hardly seems like the world we live in. To hide the extra six dimensions, mathematicians roll them up into conceptual corners that go by such cryptic names as “Calabi-Yau manifolds” and “orbifold space.” A recent textbook on the subject concludes on a wistful note that “if the string idea is correct, we may never catch more than a glimpse of the full extent of reality.”

More recently, theorists Carlo Rovelli (University of Pittsburgh) and Lee Smolin (Pennsylvania State University) completed their analysis of a quantum gravity model developed by Abhay Ashtekar at Syracuse University in 1985. Unlike string theory, Ashtekar’s work applies only to gravity. However, it posits that at the Planck scale, space-time dissolves into a network of “loops” that are held together by knots. Somewhat like a chain-mail coat used by knights of yore, space-time resembles a fabric fashioned in four dimensions from these tiny one-dimensional loops and knots of energy.

Is this the way the world really is on its most fundamental level, or have mathematicians become detached from reality? Superstring theory has enticed physicists for over a decade now because it hints at a super unification of all four fundamental forces of nature. But it remains frustratingly hard to plant anchors down from these cloud castles into the real world of observation and experiment. The famous remark that superstring theory is “a piece of 21st-century physics that accidentally fell into the 20th century” captures both the excitement and frustration of workers stuck with 20th-century tools.

Surprisingly, string theory, Ashtekar’s loopy space-time, and twistors are not entirely independent ways of looking at space-time. In 1986 theorists discovered that superstrings have some things in common with twistors. A deep connection had been uncovered between two very different, independent theories. Like two teams of tunnelers starting on opposite sides of a mountain, they had met at the middle – a sign, perhaps, that they are dealing with a single real mountain, not separate ones in their own imaginations. And in 1995 Rovelli and Smolin also found that their graviton loops are very closely related to both the twistors and superstrings, though not identical in all respects.

THE COSMIC CONNECTION

Space-time could be strange in other ways too. Theorist John A. Wheeler (Institute for Advanced Study) has long advocated that at the Planck scale, space-time has a complex shape that changes from instant to instant. Wheeler called his picture “space-time foam” – a sea of quantum black holes and worm holes appearing and vanishing on a time scale of about 10^-43 seconds. This is the Planck time, the time it takes light to cross the Planck length. Shorter than that, time, like space, presumably cannot exist – or, at least, our everyday notions of them cease to be valid.

Wheeler’s idea of space-time foam is a natural extrapolation from the idea of virtual particles. According to quantum mechanics, the higher the energy and mass of a particle, the smaller it must appear. A virtual particle as small as 10^-33 cm, lasting only 10^-43 second, has so great a mass (10^-5 gram) in such a tiny volume that its own surface gravity would give it an escape velocity greater than the speed of light. In other words, it is a tiny black hole. But a black hole is not an ordinary object sitting in space-time like a particle; it is a structure of distorted, convoluted space-time itself. Although the consequences of such phenomena are not understood, it is reasonable to assume that these virtual particles dramatically distort all space-time at the Planck scale.
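
The black-hole claim above is easy to check with a minimal sketch (my own arithmetic, not the article's): a mass of 10^-5 gram has a Schwarzschild radius of about 10^-33 centimeter, so a virtual particle that small and that massive is indeed gravitationally sealed off like a tiny black hole.

# Schwarzschild radius of a roughly Planck-mass virtual particle: r_s = 2 * G * m / c^2.
C = 2.99792458e8     # speed of light, meters per second
G = 6.67430e-11      # Newton's constant of gravitation, m^3 kg^-1 s^-2

planck_like_mass_kg = 1.0e-8                        # 10^-5 gram, expressed in kilograms
schwarzschild_radius = 2.0 * G * planck_like_mass_kg / C**2

print(schwarzschild_radius)   # about 1.5e-35 meters, i.e. roughly 10^-33 centimeters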

If we take this reasoning at face value, and consider the decades-old experiments proving that the virtual particle phenomenon in a vacuum is real, it is hard to believe that space-time is smooth at or below the Planck scale. Space must be broken up and quantized. The only question is how. Wheeler’s original idea of space-time foam is especially potent because according to recent proposals by Sidney Coleman (Harvard) and Stephen Hawking (Cambridge University), its worm holes not only connect different points very close together within our space-time, but connect our space-time to other universes that, as far as we are concerned, exist only as ghostly probabilities. These connections to other universes cause the so-called cosmological constant – an annoying intrusion into the equations of cosmology ever since Einstein (see Sky & Telescope- April 1991, page 362) – to neatly vanish within our own universe.

Space-time foam has also been implicated as the spawning ground for baby universes. In several theories explaining the cause of the Big Bang and what came before, big bangs can bud off from a previously existing space-time, break away completely while still microscopic, and inflate with matter to become new universes of their own, completely disconnected (“disjoint”) from their space-time of origin. This process, proposed by Alan Guth (MIT) and others, gives a handle on what many expect to be another key issue of 21st-century physics: was our Big Bang unique? Or was it just a routine spinoff of natural processes happening all the time in some larger, outside realm? (see Sky & Telescope – September 1988, page 253).

Yet there are problems. The amount of latent energy in the quantum fluctuations of space-time foam is staggering: 10^105 ergs per cubic centimeter. This amounts to 10 billion billion times the mass of all the galaxies in the observable universe – packed into every cubic centimeter! Fortunately, Mother Nature seems to have devised some means of exactly canceling out this phenomenon to an accuracy of about 120 decimal places. The problem is that we haven’t a clue how.

It’s unnerving to think that in the 16 inches separating this page from your eyes, new big bangs are perhaps being spawned out and away from our quiet space-time every instant. By comparison, it seems positively dull that the photons by which you see this page might be playing a hop-scotch game to avoid gaps where space-time doesn’t exist.

REALITY CHECK

Some physicists have begun to throw cold water on these fantastic ideas. For instance, in 1993 Matt Visser (Washington University) studied the mathematical properties of quantum worm holes and discovered that, once they are formed, they become stable: they can’t foam at all. Kazuo Ghoroku (Fukuoka Institute of Technology, Japan) also found that quantum worm holes become stable even when their interactions with other fields are considered. What Wheeler called space-time foam may be something else entirely.

Among the unresolved problems facing theorists is the nature of time, which has been recognized as inextricably bound up with space ever since Einstein posited a constant speed for light. In general relativity, it isn’t always obvious how to define what we mean by time, especially at the Planck scale where time seems to lose its conventional meaning. Central to any quantum theory is the concept of measurement, but what does this imply for physics at the Planck scale, which sets an ultimate limit to the possibility of measurement? How any of these ideas about space-time can be tested is currently unknown. Some physicists believe this makes these ideas not real scientific inquiry at all. And it’s worth remembering that mathematics can sometimes introduce concepts that are only a means to an end and have no independent reality.

In the abstract world of mathematical symbolism, it isn’t always clear what is real and what’s not. For example, when we do long division on paper to divide 54,162 by 2 to get 27,081, we generate the intermediate numbers 14, 16, and 2, which we then just throw away. Are virtual particles, compact 6-dimensional manifolds, and twistors simply nonphysical means to an end – mere artifacts of how we humans do our mathematics? Particle physicists often have to deal with “ghost fields” that are simply the temporary scaffolding used for calculations, and that vanish when the calculations are complete. Nonphysical devices such as negative probability and faster-than-light tachyon particles are grudgingly tolerated so long as they disappear before the final answers. Even in superstring theory, recent work suggests that it may be possible to build consistent models entirely within ordinary four-dimensional space-time, without recourse to higher dimensions.
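To make the long-division analogy concrete, here is a minimal sketch (in Python; the function name and printout are purely illustrative) that carries out the division digit by digit. The partial values it prints are exactly the scratch-work numbers we discard once the quotient is in hand:

    def long_divide(dividend, divisor):
        quotient_digits = []
        remainder = 0
        for digit in str(dividend):
            partial = remainder * 10 + int(digit)    # the scratch-work number
            quotient_digits.append(str(partial // divisor))
            remainder = partial % divisor
            print("partial value:", partial)         # intermediate result, then thrown away
        return int("".join(quotient_digits))

    print(long_divide(54162, 2))   # prints partials 5, 14, 1, 16, 2 and returns 27081

The partial values exist only on the scratch pad; the final answer, 27,081, stands on its own without them.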

ANGEL FOOD CAKE

So, how should we think of the great, dark void that we gaze into at night? All clues point to space-time being a kind of layer cake of busy phenomena on the submicroscopic scale. The topmost layer contains the quarks and electrons comprising ordinary matter, scattered here and there like raisins in the frosting. These raisins can be plucked away to make a region of space appear empty. The frosting itself consists of virtual particles, primarily those carrying the electromagnetic, weak, and strong forces, filling the vacuum with incessant activity that can never be switched off. Their quantum comings and goings may completely fill space-time so that no points are ever really missing. This layer of the cake of “empty space” seems pretty well established by laboratory experiment.

Beneath this layer we have the domain of the putative Higgs field. No matter where the electron and quark “raisins” go, in this view, there is always a piece of the Higgs field nearby to affect them and give them mass. Below the Higgs layer there may exist other layers, representing fields we have yet to discover. But eventually we arrive at the lowest stratum, that of the gravitational field. There is more of this field wherever mass is present in the layers above it, but there is no place where it is entirely absent. This layer recalls the Babylonian Great Turtle that carried the universe on its back. Without it, all the other layers above would vanish into nothingness.

We know that space-time is quite smooth down to at least the scale of the electron, 10^-20 cm – 10 million times smaller than an atomic nucleus. This is the size limit set for any internal component of the electron, based on careful comparisons between experiment and the predictions of quantum electrodynamics. But near the Planck horizon of 10^-33 cm, space-time must change its structure drastically. It may be a world in which conventional notions of dimensionality, time, and space need to be redefined and possibly eliminated altogether.

The conceit of our universe’s uniqueness may disappear, with big bangs becoming viewed as run-of-the-mill events in some much larger outside realm, and with physical constants being attributed to causes in space-times forever beyond human experience.

There is much that’s spooky about the physical vacuum. This spookiness may be rooted more in the way our brains work than in some objective aspect of nature. Einstein stressed, “Space and time are not conditions in which we live, but modes in which we think.” Our understanding of space remains in its infancy. With Aristotle smiling at us down the centuries, we now see the vacuum as much more than a vacancy. It will take many decades, if not centuries, before a complete understanding of it is fashioned. In the meantime, enjoy the nighttime view!

FURTHER READING

Davies, Paul. The New Physics. Cambridge: Cambridge University Press, 1989.

Mallove, Eugene. “The Self-Reproducing Universe.” Sky & Telescope, September 1988, page 253.

Matthews, Robert. “Nothing Like a Vacuum.” New Scientist, February 25, 1995, page 30.

Pagels, Heinz. Perfect Symmetry. New York: Simon & Schuster, 1985.

Quantum Gravity…Oh my!

So here’s the big problem.

Right now, physicists have a detailed mathematical model for how the fundamental forces in nature work: electromagnetism and the strong and weak nuclear forces. Added to this is a detailed list of the fundamental particles in nature, like the electron, the quarks, photons, neutrinos and others. Called the Standard Model, it has been extensively verified and found to be an amazingly accurate way to describe nearly everything we see in the physical world. It explains why some particles have mass and others do not. It describes exactly how forces are generated by particles and transmitted across space. Experimenters at the CERN Large Hadron Collider are pulling out their hair trying to find errors or deficiencies in the Standard Model that go against the calculated predictions, but they have been unable to turn up anything yet. They call this the search for New Physics.

Alongside this accurate model for the physical forces and particles in our universe, we have general relativity and its description of gravitational fields and spacetime. GR offers no mechanism for how matter and energy actually generate this field, and it provides no description of the quantum structure of matter and forces in the Standard Model. GR and the Standard Model speak two very different languages, and describe two very different physical arenas. For decades, physicists have tried to find a way to bring these two great theories together, and the results have been promising but untestable. A description of gravitational fields built on the same quantum principles as the Standard Model has come to be called Quantum Gravity.

The many ideas that have been proposed for Quantum Gravity are all deeply mathematical, and only touch upon our experimental world very lightly. You may have tried to read books on this subject written by the practitioners, but like me you will have become frustrated by the math and language this community has developed over the years to describe what they have discovered.

The problem faced by Quantum Gravity is that gravitational fields only seem to display their quantum features at the so-called Planck Scale of 10^-33 centimeters and 10^-43 seconds. I can’t write this blog using scientific notation, so I am using the shorthand that 10^3 means 1000 and 10^8 means 100 million. Similarly, 10^-3 means 0.001 and so on. Anyway, the Planck scale also corresponds to an energy of 10^19 GeV, or 10 billion billion GeV, which is an energy 1000 trillion times higher than current particle accelerators can reach.
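These numbers are not arbitrary; they come from combining the gravitational constant, Planck’s constant, and the speed of light. Here is a minimal sketch (in Python, SI units) of where the figures quoted above come from:

    import math

    hbar = 1.055e-34   # reduced Planck constant, J s
    G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.998e8     # speed of light, m/s

    l_planck = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m, or 1.6 x 10^-33 cm
    t_planck = math.sqrt(hbar * G / c**5)   # ~5.4e-44 s
    E_planck = math.sqrt(hbar * c**5 / G)   # ~2.0e9 J

    print(l_planck, t_planck, E_planck / 1.602e-10)   # last value in GeV: ~1.2 x 10^19

Dividing that last number by the roughly 10^4 GeV reachable at the Large Hadron Collider gives the factor of about 10^15 – a thousand trillion – mentioned above.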

There is no known technology that can reach the scales where these effects can be measured in order to test these theories. Even the concept of measurement itself breaks down! This happens because the very particles (photons) you try to use to study physics at the Planck scale carry so much energy  they turn into quantum black holes and are unable to tell you what they saw or detected!

One approach to QG is called Loop Quantum Gravity.  Like relativity, it assumes that the gravitational field is all there is, and that space and time become grainy or ‘quantized’ near the Planck Scale. The space and time we know and can experience in-the-large is formed from individual pieces that come together in huge numbers to form the appearance of a nearly-continuous and smooth gravitational field.

The problem is that you cannot visualize what is going on at this scale, because it is represented in the mathematics not by nuggets of space and time but by more abstract mathematical objects called loops and spin networks. The artist’s rendition above is just that.

So here, as for Feynman Diagrams, we have a mathematical picture that represents a process, but the picture is symbolic and not photographic. The biggest problem, however, is that although it is a quantum theory for gravity that works, Loop Quantum Gravity does not include any of the Standard Model particles. It represents a quantum theory for a gravitational field (a universe of space and time) with no matter in it!

In other words, it describes the cake but not the frosting.

The second approach is string theory. This theory assumes there is already some kind of background space and time through which another mathematical construct, called a string, moves. Strings that form closed loops can vibrate, and each pattern of vibration represents a different type of fundamental particle. To make string theory work, the strings have to exist in 10 dimensions, and most of these are wrapped up into closed balls of geometry called Calabi-Yau spaces. Each of these spaces has its own geometry within which the strings vibrate. This means there can be millions of different ‘solutions’ to the string theory equations: each a separate universe with its own specific type of Calabi-Yau subspace that leads to a specific set of fundamental particles and forces. The problem is that string theory violates general relativity by requiring a background space!

In other words, it describes the frosting but not the cake!

One solution proposed by physicist Lee Smolin is that Loop Quantum Gravity is the foundation for creating the strings in string theory. If you looked at one of these strings at high magnification, its macaroni-like surface would turn into a bunch of loops knitted together, perhaps like a Medieval chainmail suit of armor. The problem is that Loop Quantum Gravity does not require a gravitational field with more than four dimensions (3 of space and one of time), while strings require ten or even eleven. Something is still not right, and right now, no one really knows how to fix this. Lacking actual hard data, we don’t even know if either of these theories is closer to reality!

What this hybrid solution tries to do is find aspects of the cake that can be re-interpreted as particles in the frosting!

This work is still going on, but there are a few things that have been learned along the way about the nature of space itself. At our scale, it looks like a continuous gravitational field criss-crossed by the worldlines of atoms, stars and galaxies. This is how it looks even at the atomic scale, because now you get to add in the worldlines of innumerable ‘virtual particles’ that make up the various forces in the Standard Model. But as we zoom down to the Planck Scale, space and spacetime stop being smooth like a piece of paper, and start to break up into something else, which we think reveals the grainy nature of gravity as a field composed of innumerable gravitons buzzing about.

But what these fragmentary elements of space and time ‘look’ like is impossible to say. All we have are mathematical tools to describe them, and like our attempts at describing the electron, they lead to a world of pure abstraction that cannot be directly observed.

If you want to learn a bit more about the nature of space, consider reading my short booklet ‘Exploring Quantum Space’, available at amazon.com. It describes the amazing history of our learning about space, from ancient Greek ‘common sense’ ideas to the highlights of mind-numbing modern quantum theory.

Check back here on Thursday, December 22 for the last blog in this series!

What IS space?

One thing that is true about physics is that it involves a lot of mathematics. What this means is that we often use the mathematics to help us visualize what is going on in the world. But like I said in an earlier blog, this ‘vision thing’ in math can sometimes let you mistake the model for the real thing, like the case of the electron. The same problem emerges when we try to understand an invisible  thing like space.

The greatest discovery about space  was made by Einstein just before 1915 as he was struggling to turn his special theory of relativity into something more comprehensive.

Special relativity was his theory of space and time that described how various observers would see a consistent world despite their uniform motion at high speeds. This theory alone revolutionized physics, and has been the mainstay of modern quantum mechanics, as well as the design of powerful accelerators that successfully and accurately push particles to nearly the speed of light. The problem was that special relativity did not include a natural place for accelerated motion, especially in gravitational fields, which are of course very common in the universe.

Geometrically, special relativity only works when worldlines are perfectly straight, and  form lines within a perfectly flat, 4-dimensional spacetime (a mathematical arena where 3 dimensions of space are combined with one dimension of time). But accelerated motion causes worldlines to be curved, and you cannot magically make the curves go straight again and keep the spacetime geometrically flat just by finding another coordinate system.

Special relativity, however, promised that so long as motion is at constant speed and worldlines are straight, two different observers (coordinate systems) would agree about what they are seeing and measuring by using the mathematics of special relativity. With curved worldlines and acceleration, the equations of special relativity, called the Lorentz Transformations, would not work as they were. Einstein was, shall we say, annoyed by this because clearly there should be some mathematical process that would allow the two accelerated observers to again see ( or calculate) consistent physical phenomena.
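For the record, here is what those Lorentz Transformations look like in practice, as a minimal sketch (in Python, one space dimension; the function name and the 0.6c example speed are just for illustration):

    import math

    def lorentz_boost(t, x, v, c=2.998e8):
        # Transform an event (t, x) into the frame of an observer moving at speed v.
        gamma = 1.0 / math.sqrt(1.0 - (v / c)**2)
        return gamma * (t - v * x / c**2), gamma * (x - v * t)

    # Two observers in uniform relative motion assign different coordinates to the
    # same event, but the combination (ct)^2 - x^2 comes out the same for both.
    t, x = 1.0, 1.0e8
    tp, xp = lorentz_boost(t, x, 0.6 * 2.998e8)
    print((2.998e8 * t)**2 - x**2, (2.998e8 * tp)**2 - xp**2)

It is exactly this recipe that fails to apply, as written, once worldlines are curved by acceleration.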

He began his mathematical journey to fix this problem by writing his relativity equations in a way that was coordinate-independent, using the techniques of tensor analysis. But he soon found himself frustrated by the gap between what he needed to accomplish this mathematical miracle and his own knowledge of advanced analytic geometry in four dimensions. So he went to his classmate and math whiz, Marcel Grossmann, who immediately recognized that Einstein’s mathematical needs were just an awkward way of stating certain properties of non-Euclidean geometry, developed by Bernhard Riemann and others in the mid-to-late 1800s.

This was the missing mathematics that Einstein needed. Being a quick learner, he mastered this new language and applied it to relativity. After an intense year of study and some trial-and-error mathematical efforts, he published his complete Theory of General Relativity in November 1915. Just as the concept of spacetime did away with space and time as independent ideas in special relativity, his new theory made an even bigger, revolutionary discovery.

It was still a theory of the geometry of worldlines that he was proposing, but now the geometric properties of these worldlines were controlled by a specific mathematical object called the metric tensor. This object was fundamental to all geometry, as Grossmann had shown him, and allowed you to calculate distances between points in space. It also defined what a ‘straight line’ meant, as well as how curved the space was. Amazingly, when you translated all this geometric talk into the hard, cold reality of physics in 4 dimensions, this metric tensor turned into the gravitational field, through which the worldline of a particle was defined as the straightest-possible path.
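To get a feel for what the metric tensor does, here is a toy sketch (in Python, for the two-dimensional surface of a sphere rather than four-dimensional spacetime; the function name is purely illustrative). The metric converts small coordinate steps into real distances, and the way it changes from point to point encodes the curvature:

    import math

    def step_length(theta, dtheta, dphi, R=1.0):
        # Metric on a sphere of radius R: ds^2 = R^2 dtheta^2 + R^2 sin^2(theta) dphi^2
        g_tt = R**2
        g_pp = (R * math.sin(theta))**2
        return math.sqrt(g_tt * dtheta**2 + g_pp * dphi**2)

    # The same coordinate step in longitude covers very different physical distances
    # at the equator and near the pole - that position-dependence is the geometry.
    print(step_length(math.radians(90), 0.0, 0.01))   # ~0.0100
    print(step_length(math.radians(10), 0.0, 0.01))   # ~0.0017

In general relativity the same bookkeeping is done in four dimensions, and the ‘straightest-possible path’ is the one whose total length, measured this way, cannot be improved.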

An interesting factoid, indeed, but why is it so revolutionary?

All other fields in physics (the electromagnetic field, for example) are defined by some quantity, call it A, that is specified at each coordinate point in space and time: A(x,y,z,t). If you take away the field, the coordinate grid remains intact. But with the gravitational field, there is no background coordinate grid to define its intensity; instead, the gravitational field provides its own coordinate grid, because it is identical to the metric tensor!

This is why Einstein and physicists say that gravity is not a force like the others we know about, but instead it is a statement about the shape of the geometry of spacetime through which particles move. (Actually, particles do not move through spacetime. Their histories from start to finish simply exist all at once like a line drawn on a piece of paper!)

So, imagine a cake with frosting on it. The frosting represents the various fields in space, and you can locate where they are and how much frosting is on the cake from place to place. But it is the bulk of the cake, which tells you ‘this is the top, the center, the side of the cake’, that supports the frosting. Take away the cake, and the frosting is not just unsupported – it can’t even be defined in the first place. Similarly, take away the gravitational field, symbolized by Einstein’s metric tensor, and spacetime itself disappears!

Amazingly, Einstein’s equations say that although matter and energy produce gravitational fields, you can have situations where there is no matter and energy and spacetime still doesn’t vanish! These vacuum solutions are real head-scratchers when physicists try to figure out how to combine quantum mechanics, our premier theory of matter, with general relativity, our premier theory of gravity and spacetime. The vacuum solutions represent gravitational fields in their purest form, and are the starting point for learning how to describe the quantum properties of gravitational fields. They are also important to the existence of gravitational waves, which move from place to place as waves in the empty spacetime between the objects producing them.

But wait a minute. Einstein originally said that ‘space’ isn’t actually a real thing. Now we have general relativity, which seems to be bringing space (actually spacetime) back as something significant in its own right as an aspect of the gravitational field.

What gives?

To see how some physicists resolve these issues, we have to delve into what is called quantum gravity theory, and this finally gets us back to some of my earlier blogs about the nature of space, and why I started this blog series!

 

Check back here on Wednesday, December 21 for the last installment on this series about space!

Relativity and Space

Psychologists and physicists often use a similar term to describe one of the most fundamental characteristics of humans and matter: The Story. Here, for example, is the timeline story for key events in the movie The Hunger Games.

Oliver Sacks, in his book ‘The Man Who Mistook His Wife for a Hat’, describes the case of Jimmy G., who was afflicted with Korsakov’s Syndrome. He could not remember events more than a few minutes in the past, and so he had to re-invent his world every few minutes to account for new events. As Sacks notes, ‘If we wish to know about a man, we ask ‘what is his story – his real, inmost story? – for each of us is a biography, a story..[and a] singular narrative, which is constructed, continually, unconsciously, by, through, and in us – through our perceptions, our feelings, our thoughts, our actions..and our narratives…we must constantly recollect ourselves’.

Physicist Lee Smolin, in his book ‘Three Roads to Quantum Gravity’, describes the essential foundation of relativity as the ‘story’ about processes and not things-as-objects. “A marble is not an inert thing, it is a process…There are only relatively fast processes and relatively slow processes. And whether it is a short story or a long story, the only kind of explanation of a process that is truly adequate is a story.”

In both cases, we cannot define an object, be it a human, a table, or an electron by merely describing its properties at one instant in time. We can only define an object in terms of a process consisting of innumerable events, which create the story that defines it. This is very obvious when we are talking about humans, but it also applies to every object in the universe.

In relativity, the history or ‘story’ of a process such as a football or a galaxy, consists of a series of events that are tied together by cause-and-effect to create the process that you see at any particular moment. These events include the interactions of one process with others that cumulatively create what you see as the history of the process at a particular moment. In relativity, we call this history of a process its worldline.

This is a worldline map (Credit: Aaron Koblin / BBC) of airlines traveling to and from the United States. The lines give the history of each flight on the 2-d surface of Earth. Each worldline consists of a huge number of ‘hidden’ events contributed by each passenger! By carefully studying these worldlines, you could mathematically deduce that Earth is a sphere.

What Einstein said is that only worldlines matter, because they are the only thing we have access to. Even better than that, we are only able to see that part of a process that can be communicated to us by light, which is the fastest signal we can ever use to transfer information. When we are ‘looking’ at something, like a car or a star, what we are actually doing is looking back along its history, carried to us as information traveling on photons of light.

In an earlier essay, I mentioned how we do not see objects in space, but only the end points of a light ray’s history: for example, the light leaves the surface of an object (Event 1), arrives at a dust mote along the way and is re-emitted (Event 2), then arrives at our retina and causes a rod or a cone cell to fire (Event 3). Because these events are strictly determined by cause-and-effect, and travel times are limited by the speed of light, we can organize these events into a strict history for the object we viewed (which was in fact a ‘process’ in and of itself!).

So, what does this say about space? Space  is irrelevant, because we can completely define our story only in terms of the ‘geometry’ of these history worldlines and the causal connections between events on these worldlines, without any mention of space as a ‘background’ through which things move.

This leads to another problem.

Einstein’s new relativistic theory of gravity makes use of a convenient mathematical tool called 4-dimensional spacetime. Basically, we live in a world with three dimensions of space and one dimension of time, making a 4-dimensional thing called spacetime. Without knowing it, you live and work in 4 dimensions, because there is nothing about you that does not ‘move’ in time as well as space from second to second. All physical processes take place in 4 dimensions, so all theories of physics and how things work are necessarily statements about 4-dimensional things.

It is common to refer to gravity as a curvature in the geometry of this spacetime ‘fabric’, but we can just as easily talk about the curvature of worldlines defining gravity and not even bother with the idea of spacetime at all! Remember, when you look at an object, you are ‘just’ looking back through its history revealed by the network of photons of light.

So we have used a mathematical tool, namely spacetime, to make the curvature of worldlines easier to visualize and describe, but we then make the mistake of thinking that spacetime is real because we have used the mathematical tool to represent the object itself. This is similar to what we did with the idea of Feynman Diagrams in the previous blog! As Lee Smolin says, “When we imagine we are seeing into an infinite three-dimensional space, we are actually falling for a fallacy in which we substitute what we actually see [a history of events] for an intellectual construct [space]. This is not only a mystical vision, it is wrong.”

But what about infinity?

In my next essay I will discuss why infinity is probably not a real concept in the physical world.

 

Check back here on Friday, December 16 for the next installment!

Physicist Lee Smolin’s book ‘Three Roads to Quantum Gravity’ discusses many of these ideas in more detail.