Category Archives: Physics

Glueballs anyone?

Today, physicists are both excited and disturbed by how well the Standard Model is behaving, even at the enormous energies provided by the CERN Large Hadron Collider. There seems to be no sign of the expected supersymmetry property that would show the way to the next-generation version of the Standard Model: Call it V2.0. But there is another ‘back door’ way to uncover its deficiencies. You see, even the tests for how the Standard Model itself works are incomplete, even after the dramatic 2012 discovery of the Higgs Boson! To see how this backdoor test works, we need a bit of history.

Glueballs found in a quark-soup (Credit: Alex Dzierba, Curtis Meyer and Eric Swanson)

Over fifty years ago in 1964, physicists Murray Gell-Mann at Caltech and George Zweig at CERN came up with the idea of the quark as a response to the bewildering number of elementary particles that were being discovered at the huge “atom smasher” labs sprouting up all over the world. Basically, you only needed three kinds of elementary quarks, called “up,” “down” and “strange.” Combining these in threes, you get the heavy particles called baryons, such as the proton and neutron. Combining them in twos, with one quark and one anti-quark, you get the medium-weight particles called the mesons. In my previous blog, I discussed how things are going with testing the quark model and identifying all of the ‘missing’ particles that this model predicts.

In addition to quarks, the Standard Model details how the strong nuclear force is created to hold these quarks together inside the particles of matter we actually see, such as protons and neutrons. To do this, quarks must exchange force-carrying particles called gluons, which ‘glue’ the quarks together into groups of twos and threes. Gluons are second-cousins to the photons that transmit the electromagnetic force, but they have several important differences. Like photons, they carry no mass; however, unlike photons, which carry no electric charge, gluons carry what physicists call color charge. Quarks can be ‘red’, ‘blue’ or ‘green’, as well as anti-red, anti-green and anti-blue, and the gluons they exchange carry paired color charges like (red, anti-blue). Because gluons carry color charge, they can interact with each other very strongly through these complicated color charges, unlike photons, which do not interact with each other at all. The end result is that, under some circumstances, you can have a ball of gluons that resembles a temporarily-stable particle before it dissipates. Physicists call these glueballs…of course!
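To see why gluon self-interaction is even possible, it helps to count the color combinations. Here is a minimal Python sketch (my own naming, not standard physics notation) that enumerates the color/anti-color pairs a gluon can carry:

```python
from itertools import product

colors = ["red", "green", "blue"]
anti_colors = ["anti-" + c for c in colors]

# Every gluon carries one color and one anti-color charge.
gluon_states = list(product(colors, anti_colors))
print(len(gluon_states))  # 9 naive combinations

# The group theory of the strong force (SU(3)) removes one fully
# color-neutral combination, leaving 8 physically distinct gluons.
physical_gluons = len(gluon_states) - 1
print(physical_gluons)  # 8
```

Because gluons themselves carry these charges, two gluons can exchange a third gluon, which is exactly the self-interaction that photons (electrically neutral) cannot have.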

Searching for Glueballs.

Glueballs are one of the most novel and key predictions of the Standard Model, so not surprisingly there has been a decades-long search for these waifs among the trillions of other particles that are routinely created in modern particle accelerator labs around the world.

Example of glueball decay into pi mesons.

Glueballs are not expected to live very long, and because they carry no electrical charge they are perfectly neutral particles. When these pseudo-particles decay, they do so in a spray of other particles called mesons. Because glueballs consist of gluons combined so that their color charges cancel, they have no net color charge. From various theoretical considerations, there are 15 basic glueball types that differ in what physicists term parity and angular momentum. Other electrically neutral bosons of the same general type include gravitons and Higgs bosons, but these are easily distinguished from glueball states by their masses (glueballs should be between 1 and 5 GeV) and other fundamental properties. The most promising glueball candidates are as follows:

Scalar candidates: f0(600), f0(980), f0(1370), f0(1500), f0(1710), f0(1790)
Pseudoscalar candidates: η(1405), X(1835), X(2120), X(2370), X(2500)
Tensor candidates: fJ(2220), f2(2340)

By 2015, the f0(1500) and f0(1710) had become the prime glueball candidates. The properties of glueball states can be calculated from the Standard Model, although this is a complex undertaking because glueballs interact very strongly with nearby quarks and other free gluons, and all of these factors have to be considered.

On October 15, 2015 there was a much-ballyhooed announcement that physicists had at last discovered the glueball particle. The articles cited Professor Anton Rebhan and Frederic Brünner from TU Wien (Vienna) as having completed these calculations, concluding that the f0(1710) was the best candidate consistent with experimental measurements and its predicted mass. More rigorous experimental work to define the properties and exact decays of this particle is, even now, going on at the CERN Large Hadron Collider and elsewhere.

So, between the missing particles I described in my previous blog, and glueballs, there are many things about the Standard Model that still need to be tested. But even with these predictions confirmed, physicists are still not ‘happy campers’ when it comes to this grand theory of matter and forces. Beyond these missing particles, we still need to have a deeper understanding of why some things are the way they are, and not something different.

Check back here on Wednesday, April 5 for my next topic!

Crowdsourcing Gravity

The proliferation of smartphones with internal sensors has led to some interesting opportunities to make large-scale measurements of a variety of physical phenomena.

The iOS app ‘Gravity Meter’ and its android equivalent have been used to make measurements of the local surface acceleration, which is nominally 9.8 meters/sec2. The apps typically report the local acceleration to 0.01 (iOS) or even 0.001 (android) meters/sec2 accuracy, which leads to two interesting questions: 1) How reliable are these measurements at the displayed decimal limit, and 2) Can smartphones be used to measure expected departures from the nominal surface acceleration due to Earth’s rotation? Here is a map showing the magnitude of this (centrifugal) rotation effect provided by The Physics Forum.

As Earth rotates, any object on its surface feels a centrifugal force directed outward from the center of Earth, generally in the direction of the local zenith. This causes Earth to be slightly bulged-out at the equator compared to the poles, which you can see from the difference between its equatorial radius of 6,378.14 km and its polar radius of 6,356.75 km: a polar flattening difference of 21.4 kilometers. This centrifugal force also reduces the local surface acceleration slightly at the equator compared to the poles. At the equator, one would measure a value for ‘g’ of about 9.78 m/sec2, while at the poles it is about 9.83 m/sec2. Once again, and this is important to avoid any misconceptions: the total acceleration, defined as gravity plus the centrifugal term, is reduced, but gravity itself is not changed, because from Newton’s Law of Universal Gravitation, gravity is due to mass, not rotation.
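The size of the rotation effect can be estimated directly. Here is a sketch of the calculation, using the centrifugal term ω²R·cos²(latitude) along the local vertical (the flattening of Earth contributes the rest of the pole-to-equator difference):

```python
import math

omega = 2 * math.pi / 86164.1   # Earth's rotation rate (sidereal day), rad/s
R_eq = 6.37814e6                # equatorial radius, m

def centrifugal_reduction(latitude_deg):
    """Outward centrifugal acceleration along the local vertical, m/s^2."""
    lat = math.radians(latitude_deg)
    return omega**2 * R_eq * math.cos(lat)**2

print(round(centrifugal_reduction(0), 4))   # ~0.0339 m/s^2 at the equator
print(round(centrifugal_reduction(90), 4))  # ~0.0 at the poles
```

Rotation alone accounts for about 0.034 m/sec2 of the observed 0.052 m/sec2 equator-to-pole difference; the equatorial bulge, which puts you farther from Earth’s center at the equator, supplies roughly the remainder.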

Assuming that the smartphone accelerometers are sensitive enough, they may be able to detect this equator-to-pole difference by comparing the surface acceleration measurements from observers at different latitudes.

 

Experiment 1 – How reliable are ‘gravity’ measurements at the same location?

To check this, I looked at the data from several participating classrooms at different latitudes, and selected the more numerous iOS measurements with the ‘Gravity Meter’ app. These data were kindly provided by Ms. Melissa Montoya’s class in Hawaii (+19.9N), George Griffith’s class in Arapahoe, Nebraska (+40.3N), Ms. Sue Lamdin’s class in Brunswick, Maine (+43.9N), and Elizabeth Bianchi’s class in Waldoboro, Maine (+44.1N).

All four classrooms’ measurements, irrespective of latitude (19.9N, 40.3N, 43.9N or 44.1N), showed distinct ‘peaks’, but also displayed long and complicated ‘tails’, so these distributions are not Gaussian as might be expected for random errors. This suggests that under classroom conditions there may be systematic effects introduced by the specific ways in which students make the measurements, adding complicated, apparently non-random, student-dependent corrections to the data.

In a further study using the iPad data from Elizabeth Bianchi’s class, I discovered that, at least for iPads using the Gravity Sensor app, there was a definite correlation between the measured value and the time at which the measurement was made during a 1.5-hour period. This resembles a heating effect, suggesting that the longer you leave the technology on before making the measurement, the larger the measured value will be. I will look into this at a later time.

The non-Gaussian behavior in the current data does not make it possible to assign a normal average and standard-deviation to the data.

 

Experiment 2 – Can the rotation of Earth be detected?

Although the 4-classroom data hinted at a nominal centrifugal effect of about the correct order-of-magnitude, we wanted a larger sample of individual observers spanning a wide latitude range, also using the iOS platform and the same ‘Gravity Meter’ app. Including the median values from the four classrooms in Experiment 1, we had a total of 41 participants: Elizabeth Abrahams, Jennifer Arsenau, Dorene Brisendine, Allen Clermont, Hillarie Davis, Thom Denholm, Heather Doyle, Steve Dryer, Diedra Falkner, Mickie Flores, Dennis Gallagher, Robert Gallagher, Rachael Gerhard, Robert Herrick, Harry Keller, Samuel Kemos, Anna Leci, Alexia Silva Mascarenhas, Alfredo Medina, Heather McHale, Patrick Morton, Stacia Odenwald, John-Paul Rattner, Pat Reiff, Ghanjah Skanby, Staley Tracy, Ravensara Travillian, and Darlene Woodman.

The scatter plot of these individual measurements is shown here:

The red squares are the individual measurements. The blue circles are the android phone values. The red dashed line shows the linear regression line for only the iOS data points assuming each point is equally-weighted. The solid line is the predicted change in the local acceleration with latitude according to the model:

g = 9.806 – 0.5*(9.832 – 9.780)*cos(2*latitude)   m/sec2

where 9.806 m/sec2 is the average of the polar acceleration (9.832 m/sec2) and the equatorial acceleration (9.780 m/sec2). Note: No correction for lunar and solar tidal effects has been made, since these are entirely undetectable with this technology.
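The model curve can be coded directly; a minimal sketch:

```python
import math

def predicted_g(latitude_deg):
    """Model surface acceleration (m/s^2) vs latitude: the mean of the polar
    and equatorial values plus a cos(2*latitude) modulation."""
    lat = math.radians(latitude_deg)
    return 9.806 - 0.5 * (9.832 - 9.780) * math.cos(2 * lat)

print(round(predicted_g(0), 3))    # ~9.78 at the equator
print(round(predicted_g(90), 3))   # ~9.832 at the pole
print(round(predicted_g(45), 3))   # ~9.806 at mid-latitude
```

At 45 degrees the cosine term vanishes, so the mid-latitude prediction is just the 9.806 m/sec2 average.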

Each individual point has a nominal variation of +/-0.01 m/sec2 based on the minimum and maximum value recorded during a fixed interval of time. It is noteworthy that this measurement RMS is significantly smaller than the classroom variance seen in Experiment 1 due to the apparently non-Gaussian shape of the classroom sampling. When we partition the iOS smartphone data into 10-degree latitude bins and take the median value in each bin we get the following plot, which is a bit cleaner:

The solid blue line is the predicted acceleration. The dashed black line is the linear regression for the equally-weighted individual measurements. The median values of the classroom points are added to show their distribution. It is of interest that the linear regression line is parallel to, and nearly coincident with, the predicted line, which again suggests that Earth’s rotation effect may have been detected in this median-sampled data set provided by a total of 37 individuals.
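The binning-and-median step described above can be sketched in a few lines of Python (the sample data here are purely illustrative, not the actual participant measurements):

```python
from statistics import median
from collections import defaultdict

# (latitude_deg, measured_g) pairs -- illustrative values only
measurements = [(19.9, 9.786), (21.5, 9.789), (40.3, 9.801),
                (43.9, 9.805), (44.1, 9.803), (47.0, 9.809)]

bins = defaultdict(list)
for lat, g in measurements:
    bins[int(lat // 10) * 10].append(g)   # 10-degree latitude bins

binned = {b: round(median(vals), 3) for b, vals in sorted(bins.items())}
print(binned)  # {10: 9.786, 20: 9.789, 40: 9.804}
```

Medians are used rather than means because, as Experiment 1 showed, the measurement distributions have long non-Gaussian tails that would bias an ordinary average.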

The classroom points clustering near +44N represent a total of 36 measurements behind the plotted median values, which is statistically significant. Taken at face value, the classroom data alone would support the hypothesis that the rotation effect was detected, though they are consistently 0.005 m/sec2 below the predicted value at mid-latitudes. The intrinsic variation of the data, represented by the consistent +/-0.01 m/sec2 high-vs-low range of all of the individual samples, suggests that this is probably a reasonable measure of the instrumental accuracy of the smartphones. Error bars (thin vertical black lines) have been added to the plotted median points to indicate this accuracy.

The bottom line seems to be that it may be marginally possible to detect the Earth rotation effect, but precise measurements at the 0.01 m/sec2 level are required against what appears to be a significant non-Gaussian measurement background. Once again, some of the variation seen at each latitude may be due to how warm the smartphones were at the time of the measurement. The android and iOS measurements do seem to be discrepant, with the android measurements showing the larger variation.

Check back here on Wednesday, March 29 for the next topic!

Fifty Years of Quarks!

Today, physicists are both excited and disturbed by how well the Standard Model is behaving, even at the enormous energies provided by the CERN Large Hadron Collider. There seems to be no sign of the expected supersymmetry property that would show the way to the next-generation version of the Standard Model: Call it V2.0. But there is another ‘back door’ way to uncover its deficiencies. You see, even the tests for how the Standard Model itself works are incomplete, even after the dramatic 2012 discovery of the Higgs Boson! To see how this backdoor test works, we need a bit of history.

Over fifty years ago in 1964, physicists Murray Gell-Mann at Caltech and George Zweig at CERN came up with the idea of the quark as a response to the bewildering number of elementary particles that were being discovered at the huge “atom smasher” labs sprouting up all over the world. Basically, you only needed three kinds of elementary quarks, called “up,” “down” and “strange.” Combining these in threes, you get the heavy particles called baryons, such as the proton and neutron. Combining them in twos, with one quark and one anti-quark, you get the medium-weight particles called the mesons.

This early idea was extended to include three more types of quarks, dubbed “charmed,” “top” and “bottom” (or, on the other side of the pond, “charmed,” “truth” and “beauty”) as they were discovered in the 1970s. These six quarks form three generations — (U, D), (C, S), (T, B) — in the Standard Model.

Particle tracks at CERN/CMS experiment (credit: CERN/CMS)

Early Predictions

At first the quark model easily accounted for the then-known particles. A proton would consist of two up quarks and one down quark (U, U, D), and a neutron would be (D, D, U). A pi-plus meson would be (U, anti-D), a pi-minus meson would be (D, anti-U), and so on. It’s a bit confusing to combine quarks and anti-quarks in all the possible combinations. It’s kind of like working out all the ways that a coin flipped three times gives you a pattern like (T,T,H) or (H,T,H), but when you do this in twos and threes for U, D and S quarks, you get the entire family of the nine known mesons, which forms the geometric pattern in the figure below, called the Meson Nonet.

The basic Meson Nonet (credit: Wikimedia Commons)
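The counting behind the Nonet works out just as advertised; a short sketch enumerating quark/anti-quark pairs for the three light quarks:

```python
from itertools import product

quarks = ["u", "d", "s"]
anti_quarks = ["anti-" + q for q in quarks]

# A meson is one quark bound to one anti-quark.
mesons = [(q, aq) for q, aq in product(quarks, anti_quarks)]
print(len(mesons))  # 9 -- the Meson Nonet
```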

If you take the three quarks U, D and S and combine them in all possible unique threes, you get two patterns of particles shown below, called the Baryon Octet (left) and the Baryon Decuplet (right).

Normal baryons made from three-quark triplets
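The three-quark counting can be checked the same way, by enumerating the unique (unordered) flavor triples:

```python
from itertools import combinations_with_replacement

quarks = ["u", "d", "s"]

# Unordered three-quark combinations (flavor content only)
triples = list(combinations_with_replacement(quarks, 3))
print(len(triples))  # 10 flavor combinations -- the Baryon Decuplet pattern
```

This flavor-only count matches the ten slots of the Decuplet; the eight slots of the Octet arise from mixed flavor-spin symmetry among the same quarks, which this simple enumeration does not capture.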

The problem was that there was a single missing particle in the simple 3-quark baryon pattern. The Omega-minus (S,S,S) at the apex of the Baryon Decuplet was nowhere to be found. This slot was empty until Brookhaven National Laboratory discovered it in early 1964. It was the first indication that the quark model was on the right track and could predict a new particle that no one had ever seen before. Once the other three quarks (C, T and B) were discovered in the 1970s, it was clear that there were many more slots to fill in the geometric patterns that emerged from a six-quark system.

The first particles predicted, and then discovered, in these patterns were the J/Psi “charmonium” meson (C, anti-C) in 1974, and the Upsilon “bottomonium” meson (B, anti-B) in 1977. Apparently there are no possible top mesons (T, anti-T) because the top quark decays so quickly it is gone before it can bind together with an anti-top quark to make even the lightest stable toponium meson!

The number of possible particles that result from simply combining the six quarks and six anti-quarks in patterns of twos (mesons) is exactly 39 mesons. Of these, only 26 had been detected as of 2017. These particles have masses between 4 and 11 times that of a single proton!

For the still-heavier three-quark baryons, the quark patterns predict 75 baryons containing combinations of all six quarks. Of these, the proton and neutron are the least massive! But there are 31 of these predicted baryons that have not been detected yet. These include the lightest missing particles, the double-charmed Xi (U,C,C) and the bottom Sigma (U, D, B), and the most massive ones, called the charmed double-bottom Omega (C, B, B) and the triple-bottom Omega (B,B,B). In 2014, CERN/LHC announced the discovery of two of these missing particles, called the bottom Xi baryons (B, S, D), with masses near 5.8 GeV.
To make life even more interesting for the Standard Model, other combinations of more than three quarks are also possible.

Exotic Baryons
A pentaquark baryon particle can contain four quarks and one anti-quark. The first of these, called the Theta-plus baryon, was predicted in 1997 and consists of (U, U, D, D, anti-S). This kind of quark package seems to be pretty rare and hard to create. There have been several claims for a detection of such a particle near 1.5 GeV, but experimental verification remains controversial. Two other possibilities called the Phi double-minus (D, D, S, S, anti-U) and the charmed neutral Theta (U, U, D, D, anti-C) have been searched for but not found.

Comparing normal and exotic baryons (credit: Quantum Diaries)

There are also tetraquark mesons, which consist of four quarks. The Z-meson (C, D, anti-C, anti-U) was discovered by the Japanese Belle experiment in 2007 and confirmed in 2014 by the Large Hadron Collider at 4.43 GeV, hence the proper name Z(4430). The Y(4140) was discovered at Fermilab in 2009 and confirmed at the LHC in 2012, and has a mass 4.4 times the proton’s mass. It could be a combination of charmed quarks and charmed anti-quarks (C, anti-C, C, anti-C). The X(3830) particle was also discovered by the Japanese Belle experiment and confirmed by other investigators, and could be yet another tetraquark combination consisting of a pair of quarks and anti-quarks (q, anti-q, q, anti-q).

So the Standard Model, and the six-quark model it contains, makes specific predictions for new baryon and meson states to be discovered. All told, there are 44 ordinary baryons and mesons that remain to be discovered! As for the ‘exotics’, they open up a whole other universe of possibilities. In theory, heptaquarks (5 quarks, 2 antiquarks), nonaquarks (6 quarks, 3 antiquarks), etc. could also exist.

At the current pace of a few particles per year or so, we may finally wrap up all the predictions of the quark model in the next few decades. Then we really get to wonder what lies beyond the Standard Model once all the predicted particle slots have been filled. It is actually a win-win situation, because we either completely verify the quark model, which is very cool, or we discover anomalous particles that the quark model can’t explain, which may show us the ‘backdoor’ way to the Standard Model v.2.0 that the current supersymmetry searches seem not to be providing us just yet.

Check back here on Wednesday, March 22 for the next topic!

The Mystery of Gravity

In grade school we learned that gravity is an always-attractive force that acts between particles of matter. Later on, we learn that it has an infinite range through space, weakens as the inverse-square of the distance between bodies, and travels exactly at the speed of light.
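The inverse-square behavior is easy to make concrete. A minimal sketch of Newton’s law of universal gravitation (the masses and distance below are rough Earth-Moon figures, used only for illustration):

```python
G = 6.674e-11  # gravitational constant, N m^2 / kg^2

def gravity_force(m1_kg, m2_kg, r_m):
    """Attractive force between two point masses, in newtons."""
    return G * m1_kg * m2_kg / r_m**2

# Doubling the separation cuts the force to one quarter
f1 = gravity_force(5.97e24, 7.35e22, 3.84e8)   # roughly Earth-Moon
f2 = gravity_force(5.97e24, 7.35e22, 7.68e8)
print(f2 / f1)  # 0.25
```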

But wait….there’s more!

 

It doesn’t take a rocket scientist to remind you that humans have always known about gravity! Its first mathematical description as a ‘universal’ force was by Sir Isaac Newton, published in his Principia in 1687. Newton’s description remained unchanged until Albert Einstein published his General Theory of Relativity in 1915. A century later, physicists such as Edward Witten, Stephen Hawking, Brian Greene and Lee Smolin, among others, are finding ways to improve our description of ‘GR’ to accommodate the strange rules of quantum mechanics. Ironically, although gravity is produced by matter, General Relativity does not really describe matter in any detail – certainly not with the detail of the modern quantum theory of atomic structure. In the mathematics, all of the details of a planet or a star are hidden in a single variable, m, representing its total mass.

 

The most amazing thing about gravity is that it is a force like no other known in Nature. It is a property of the curvature of space-time and of how particles react to this distorted space. Even more bizarrely, space and time are described by the mathematics of GR as qualities of the gravitational field of the cosmos that have no independent existence. Gravity does not exist like the frosting on a cake, embedded in some larger arena of space and time. Instead, the ‘frosting’ is everything, and matter is embedded in it, intimately and indivisibly connected to it. If you could turn off gravity, it is mathematically predicted that space and time would also vanish! You can turn off electromagnetic forces by neutralizing the charges on material particles, but you cannot neutralize gravity without eliminating spacetime itself. Its geometric relationship to space and time is the single most challenging aspect of gravity, one that has prevented generations of physicists from mathematically describing it in the same way we do the other three forces in the Standard Model.

Einstein’s General Relativity, published in 1915, is our most detailed mathematical theory for how gravity works. With it, astronomers and physicists have explored the origin and evolution of the universe, its future destiny, and the mysterious landscape of black holes and neutron stars. General Relativity has survived many different tests, and it has made many predictions that have been confirmed. So far, after a century of detailed study, no error has yet been discovered in Einstein’s original, simple theory.

Currently, physicists have explored two of its most fundamental and exotic predictions: The first is that gravity waves exist and behave as the theory predicts. The second is that a phenomenon called ‘frame-dragging’ exists around rotating massive objects.

Theoretically, gravity waves must exist in order for Einstein’s theory to be correct. They are distortions in the curvature of spacetime caused by accelerating matter, just as electromagnetic waves are distortions in the electromagnetic field of a charged particle produced by its acceleration. Gravity waves carry energy and travel at light-speed. At first they were detected indirectly: by 2004, astronomical systems such as the Hulse-Taylor binary pulsar were found to be losing energy through gravity-wave emission at exactly the predicted rates. Then in 2016, the twin LIGO gravity-wave detectors detected the unmistakable and nearly simultaneous pulses of geometry distortion created by colliding black holes billions of light years away.

Astronomers had also detected, by 1997, the ‘frame-dragging’ phenomenon in X-ray studies of distant black holes. As a black hole (or any other body) rotates, it actually ‘drags’ space around with it. This means that orbits around a rotating body slowly precess, something totally unexpected in Newton’s theory of gravity. The Gravity Probe-B satellite orbiting Earth also confirmed this exotic spacetime effect in 2011, at precisely the magnitude expected by the theory for the rotating Earth.

Gravity also doesn’t care if you have matter or anti-matter; both will behave identically as they fall and move under gravity’s influence. This quantum-scale phenomenon was searched for at the Large Hadron Collider ALPHA experiment, and in 2013 researchers placed the first limits on how matter and antimatter ‘fall’ in Earth’s gravity. Future experiments will place even more stringent limits on just how gravitationally similar matter and antimatter are. Well, at least we know that antimatter doesn’t ‘fall up’!

There is only one possible problem with our understanding of gravity known at this time.

Applying general relativity, and even Newton’s Universal Gravitation, to large systems like galaxies and the universe leads to the discovery of a new ingredient called Dark Matter. There do not seem to be any verified elementary particles that account for this gravitating substance. Lacking a particle, some physicists have proposed modifying Newtonian gravity and general relativity themselves to account for this phenomenon without introducing a new form of matter. But none of the proposed theories leaves the other verified predictions of general relativity experimentally intact. So is Dark Matter a figment of an incomplete theory of gravity, or is it a heretofore undiscovered fundamental particle of nature? It took 50 years for physicists to discover the linchpin particle called the Higgs boson. This is definitely a story we will hear more about in the decades to come!

There is much that we now know about gravity, yet as we strive to unify it with the other elementary forces and particles in nature, it still remains an enigma. But then, even the briefest glance across the landscape of the quantum world fills you with a sense of awe and wonderment at the improbability of it all. At its root, our physical world is filled with improbable and logic-twisting phenomena, and it is simply amazing that they have lent themselves to human logic to the extent that they have!

 

Return here on Monday, March 13 for my next blog!

Death By Vacuum

As an astrophysicist, this has GOT to be one of my favorite ‘fringe’ topics in physics. There’s a long preamble story behind it, though!

The discovery of the Higgs Boson with a mass of 126 GeV, about 130 times more massive than a proton, was an exciting event back in 2012. By that time we had a reasonably complete Standard Model of how particles and fields operate in Nature to create everything from a uranium atom and a rainbow, to lighting the interior of our sun. A key ingredient was a brand-new fundamental field in nature and its associated particle, the Higgs boson. The Standard Model says that all fundamental particles in Nature have absolutely no mass, but they all interact with the Higgs field. Depending on how strong this interaction is (like swimming through a container of molasses), they gain different amounts of mass. But the existence of this Higgs field has led to some deep concerns about our world that go well beyond how this particle creates the physical property we call mass.

In a nutshell, according to the Standard Model, all particles interact with the ever-present Higgs field, which permeates all space. For example, the W-particles interact very strongly with the Higgs field and gain the most mass, while photons do not interact with it at all and remain massless.

The Higgs particles come from the Higgs field, which, as I said, is present in every cubic centimeter of space in our universe. That’s why electrons in the farthest galaxy have the same mass as those here on Earth. But Higgs particles can also interact with each other. This produces a very interesting effect, like the tension in a stretched spring. A cubic centimeter of space anywhere in the universe is not perfectly empty, and actually has a potential energy ‘stress’ associated with it. This potential energy is related to just how massive the Higgs boson is. You can draw a curve like the one below that shows the vacuum energy and how it changes with the Higgs particle mass:

Now the Higgs mass actually changes as the universe expands and cools. When the universe was very hot, the curve looked like the one on the right, and the mass of the Higgs was zero at the bottom of the curve. As the universe expanded and cooled, this Higgs interaction curve turned into the one on the left, which shows that the mass of the Higgs is now X0 or 126 GeV. Note, the Higgs mass represented by the red ball used to be zero, but ‘rolled down’ into the lower-energy pit as the universe cooled.
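The ‘rolling down’ behavior can be sketched with a simple toy symmetry-breaking potential (the coefficients below are illustrative only, not the measured Standard Model values):

```python
# Toy symmetry-breaking potential V(h) = -a*h^2 + b*h^4.
# Illustrative coefficients only -- not the measured Higgs parameters.
a, b = 1.0, 0.25

def V(h):
    return -a * h**2 + b * h**4

# At h = 0 (the hot early universe) the field sits at a local maximum;
# the true minimum, where the red ball rolls to, is at h = sqrt(a / (2*b)).
h_min = (a / (2 * b)) ** 0.5
print(round(h_min, 3))                     # 1.414: the 'rolled down' value
print(round(V(0), 6), round(V(h_min), 6))  # 0.0 sits above the minimum -1.0
```

Whether the field is trapped at the central hump or has rolled into the lower-energy valley is exactly the distinction between the hot early universe and the cold one we live in today.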

The Higgs energy curve shows a very stable situation for ‘empty’ space at its lowest energy (green balls) because there is a big energy wall between where the field is today, and where it used to be (red ball). That means that if you pumped a bit of energy into empty space by colliding two particles there, it would not suddenly turn space into the roaring hot house of the Higgs field at the top of this curve.

We don’t actually know exactly what the Higgs curve looks like, but physicists have been able to make models of many alternative versions of the above curve to test how stable the vacuum is. What they found is something very interesting.

The many different kinds of Higgs vacua can be defined by using two masses: the Higgs mass and the mass of the top quark. Mathematically, you can then vary the values for the Higgs boson and the top quark and see what happens to the stability of the vacuum. The results are summarized in the plot below.

The big surprise is that the observed masses of the Higgs boson and the top quark, shown in the small box, place our space inside a very narrow zone of what is called meta-stability. We do not seem to be living in a universe where we can expect space to be perfectly stable. What does THAT mean? It does sound rather ominous that empty space can be unstable!

What it means is that, at least in principle, if you collided particles with enough energy that they literally blow-torched a small region of space, this could change the Higgs mass enough that the results could be catastrophic. Even though the collision region is smaller than an atom, once created, it could expand at the speed of light like an inflating bubble. The interior would be a region of space with new physics, and new masses for all of the fundamental particles and forces. The surface of this bubble would be a maelstrom of high-energy collisions leaking out of empty space! You wouldn’t see the wall of this bubble coming. The walls can contain a huge amount of energy, so you would be incinerated as the bubble wall ploughed through you.

Of course the world is not that simple. These are all calculations based on the Standard Model, which may be incomplete. Also, we know that cosmic rays collide with Earth’s atmosphere at energies far beyond anything we will ever achieve…and we are still here.

So sit back and relax and try not to worry too much about Death By Vacuum.

Then again…

 

Return here on Wednesday, February 22 for my next blog!

The Particle Desert

For the last 100 years physicists have built exotic “atom smashers” to probe the innermost constituents of matter. Along the way they created a breathtakingly elegant mathematical theory called the Standard Model that seems to explain all of the physics we see at the atomic scale. It describes how a collection of twelve fundamental matter particles (electrons, quarks, neutrinos, etc.) generates three fundamental forces in Nature, and how these forces are related to twelve other particles called the gauge bosons. A final, 25th particle, the Higgs boson, rounds out the ensemble and imbues some of the 24 particles with that mystical property we call mass.

During all this time, teams of physicists working in the “data dumps” of billion-dollar colliders have sifted through terabytes of information to refine the accuracy of the Standard Model and compare its predictions with the real world. The predictions always seemed to match reality and push the testing of the Standard Model to still higher energies. But at the Large Hadron Collider at CERN, among the trillions of interactions studied up to energies of 13,000 GeV, no new physics has been seen in the furthest decimal points of the Standard Model predictions since 2012; not so much as a hint that something else has to be added to bring it back in line with actual data. It is a theory that appears not to be broken at energies over 1000 times higher than it was designed for!

In the history of all previous colliders beginning in the 1950s, something new has always been found to move the development of physics' explanatory capabilities forward. In the 1970s it was the discovery of quarks. In the 1980s it was the W and Z0 particles, and even recently, in 2012, it was the Higgs boson. All these particles were found below an energy of 200 GeV. But now, as the LHC has spent the last year running at 13,000 GeV, a desperate mood has set in. No new particles or forces have been discovered in this new energy landscape.

Nada. Nothing. Zippo.

Most of the theories that go beyond the Standard Model provide ways to unify the strong force with the electromagnetic and weak forces. At some very high energy, they say, all three forces have the same strength, unlike their present circumstance where the strong nuclear force is 100 times as strong as the electromagnetic force. The predicted energy where this unification happens is about 1000 trillion GeV — the so-called grand unification theory (GUT) energy.

According to popular supersymmetry theory calculations, there should be a large population of new particles above an energy of 1000 GeV, each a partner to one of the known 25 particles but far more massive. Some of these particles, such as the neutralino, are even candidates for dark matter! Above the masses of these new supersymmetry particles, however, there ought to be no new particles to discover from perhaps 100,000 GeV all the way up to the GUT energy of 1000 trillion GeV. Without this energy desert, the theories say, any particles inhabiting it would cause the proton to decay much faster than current experimental limits allow.

So although it is a frustrating prospect that no new particles may exist in this desert, this is a vital feature of our physical world that literally prevents all matter (protons) from disintegrating! But unlike the Sahara Desert, where we can at least drive through it to get to a different world beyond, there are apparently no easily reachable oases of new particles along the way to which physicists can target new generations of expensive colliders.

It remains to be seen whether the results from the Large Hadron Collider after 2016 will confirm our greatest hopes or validate our worst fears. Either way, stay tuned for some exciting news headlines!

 

Return here on Thursday, February 9 for my next blog!

 

Running on Empty

I don’t think that many people realize it, but right now, a few miles north of Geneva, Switzerland and a few hundred meters below the sleepy, bucolic countryside, a leviathan machine 27 km in circumference works round-the-clock to recreate a glimpse of the Big Bang.

Called the Large Hadron Collider (LHC), it is the world’s largest particle collider, built between 1998 and 2008 by a collaboration of over 10,000 scientists and engineers from over 100 countries. It cost $13 billion, and consumes about 1 billion kilowatt-hours of electricity each year; enough to run a city of 300,000 homes!
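That “300,000 homes” figure is easy to sanity-check with a back-of-envelope calculation. The per-household number below is my own assumption (a typical European home uses very roughly 3,000-3,500 kWh per year; American homes use about three times more), so treat this as an illustration rather than an official CERN figure:

```python
# Back-of-envelope check of the LHC's electricity appetite.
# The per-home figure is an ASSUMPTION, not a CERN number:
# European households average very roughly 3,300 kWh/year.
lhc_kwh_per_year = 1.0e9      # ~1 billion kWh/year (from the text)
kwh_per_home = 3_300          # assumed annual usage of one European home

homes = lhc_kwh_per_year / kwh_per_home
print(f"Enough electricity for roughly {homes:,.0f} homes")
```

With that assumption the answer comes out very close to 300,000, so the quoted comparison holds together.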


The aim of the LHC is to allow physicists to test the predictions of different theories of particle physics, including measuring the properties of the Higgs boson and searching for the large family of new particles predicted by supersymmetric theories, as well as other unsolved questions of physics.

Since it began full-power operation in 2009, it has made a series of major discoveries that have rocked the physics world. The first of these, some 50 years after its prediction, was the discovery in 2012 of the Higgs Boson; a missing particle in the Standard Model that imbues most particles with their masses. Two years later, two new heavy sub-atomic particles were discovered called the Xi-prime and the Xi-star. Each consists of a bottom-quark, a down-quark and a strange-quark. Then a four-quark particle called the Z(4430) was found in 2014, along with signs of a five-quark combination that decayed into other, more stable particles. By 2016, LHC physicists had detected the light from antimatter hydrogen atoms (an anti-proton orbited by a positron). The findings were that this light was exactly the same as from a normal-matter hydrogen atom, so you really can’t tell the difference between matter and anti-matter from the light it emits!

But there was one other search going on that hasn’t been quite as successful, and with serious consequences.
Since 1978, physicists have explored the mathematical wonders of a new symmetry in nature called supersymmetry. This one shows how the particles that carry forces in the Standard Model (photons, gluons, W bosons) are related to the matter particles they interact with (electrons, neutrinos, quarks). This symmetry unifies all of the known particles in the Standard Model, and also predicts that all normal particles will have supersymmetry partners with far-higher masses. The electron has a partner called the selectron. The quark has a partner called the squark, and so on. The simplest change in the Standard Model that includes this new mathematical symmetry (called the Minimal Supersymmetric Standard Model, or MSSM) adds 12 new matter particles and 12 new force particles beginning at masses that are 3 times greater than the mass of the normal proton. Amazingly, one of these new partner particles, the neutralino, would exactly fit the bill for the dark matter that astronomers have detected across the universe! MSSM predicts the eventual unification of all three forces: electromagnetism, strong and weak. MSSM would also let us accurately predict what happened before the universe was a microsecond old after the Big Bang!

Unfortunately, the search for any signs of supersymmetry particles is not going so well after seven years of sifting through data.

In one strategy, physicists look for any signs that the predictions made by the Standard Model are not accurate as a sign of New Physics. In another strategy, physicists look for the specific signs of the decays of these new, massive particles into the familiar matter particles.

On 8 November 2012, physicists reported on an experiment seen as a “golden” test of supersymmetry: measuring the very rare decay of the Bs meson into two muons. Disconcertingly, the results matched the predictions of the Standard Model rather than those of the many alternate versions of the Standard Model with supersymmetry included. New data, with the collider now operating at higher energies than ever before (13 trillion electron volts, or 13 TeV), show no traces of superpartners.
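The logic of that “golden test” is simple enough to sketch: compare the measured branching fraction against the Standard Model prediction and ask how many combined standard deviations separate them. The numbers below are approximate published values (SM prediction near 3.7 per billion decays; the early LHCb measurement near 3.2 per billion, with its asymmetric error bar symmetrized), quoted only to illustrate the comparison, so don't treat them as the definitive figures:

```python
from math import sqrt

# Illustrative compatibility check for Bs -> mu+ mu-.
# Values are APPROXIMATE published numbers, shown only to
# demonstrate the logic of the test.
bf_sm,   err_sm   = 3.66e-9, 0.23e-9   # Standard Model prediction
bf_meas, err_meas = 3.2e-9,  1.4e-9    # early LHCb result (symmetrized error)

# "Pull": separation in units of the combined uncertainty.
pull = (bf_meas - bf_sm) / sqrt(err_sm**2 + err_meas**2)
print(f"pull = {pull:+.2f} sigma")
```

The pull comes out well under one sigma, which is exactly the disconcerting point: the measurement sits squarely on the Standard Model, leaving no room for the enhancement many supersymmetric models predicted.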

As of the start of 2017, it really does look as though the LHC has given us a complete and well-tested Standard Model including the Higgs Boson, but has refused to hand over any signs of new particles existing between 190 GeV and 2,500 GeV (see the predicted particle plot below). This represents a distressing, barren desert which, had it looked as supersymmetry had predicted, should have been filled with dozens of new particles never before seen.

Many physicists are adopting an attitude of “keep calm and carry on.” “I am not yet particularly worried,” says theoretical physicist Carlos Wagner of the University of Chicago. “I think it’s too early. We just started this process.” The LHC has delivered only 1 percent of the data it will collect over its lifetime. But if no firm evidence is found for supersymmetry with the LHC, that also spells the end of another marvelous, mathematical theory that proposes to also unify gravity with the Standard Model. You know this theory by its popular name String Theory.

So a lot is still to come with the LHC in the next 5 years, and in no way is the current machine ‘running on empty’! Even its failure to find new physics will be revolutionary in its own right, and force an exciting re-boot of the entire theoretical approach in physics during the 2020s and beyond.

Check back here on Thursday, January 26 for the next blog!

Cancer and Cosmology

For the treatment of my particular cancer, small B-cell follicular non-Hodgkin lymphoma, I will soon be starting a 6-month course of infusions of Rituximab and Bendamustine. The biology of these miracle drugs seems to be very solid and logically sound. This one-two chemical punch to my lymphatic system will use targeted antibodies to bind with the CD20 receptor on the cancerous B-cells. This will set in motion several cellular mechanisms that will kill the cells. First, the antibody bound to the CD20 receptor attracts T-cells in the immune system to treat the cancerous B-cell as an invader. Thus begins my immune system’s process of killing the invader. The antibody also triggers a reaction in the cell to commit suicide, called apoptosis. Even better, Rituximab does not set in motion the process to kill normal B-cells!

The promise is that my many enlarged lymph nodes, chock-a-block with the cancerous B-cells, will be dramatically reduced in size to near-normal levels as they are depopulated of the cancerous cells. So why don’t all patients show the same dramatic reductions? About 70% respond to this therapy to various degrees, while 10% do not respond at all. Why, given the impeccable logic of the process, aren’t the response rates closer to 100%?

Meanwhile, in high-energy physics, supersymmetry is a deeply beautiful and lynch-pin mathematical principle upon which the next generations of theories about matter and gravity are based. By adding a teaspoon of it to the Standard Model, which currently accounts in great mathematical detail for all known particles and forces, supersymmetry provides an elegant way to explore an even larger universe that includes dark matter, unifying all natural forces, and explaining many of the existing mysteries not answered by the Standard Model.
The simplest such extension is called the Minimal Supersymmetric Standard Model (MSSM). Nature consistently rewards the simplest explanations for physical phenomena, so why has there been absolutely no sign of supersymmetry at the energies predicted by the MSSM, and now being explored by the CERN Large Hadron Collider?

In both cases, I have a huge personal interest in these logically compelling strategies and ideas: One to literally save my life, and the other to save the intellectual integrity of the physical world I have so deeply explored as an astronomer during my entire 40-year career. In each case, the logic seems to be flawless, and it is hard to see how Nature would not avail itself of these simple and elegant solutions with high fidelity. But for some reason it chooses not to do so. Rituximab works only imperfectly, while supersymmetry seems an untapped logical property of the world.

So what’s going on here?

In physics, we deal with dumb matter locked into simple systems controlled by forces that can be specified with high mathematical accuracy. The fly in the ointment is that, although huge collections of matter on the astronomical scale follow one set of well-known laws first discovered by Sir Isaac Newton and others, at the atomic scale we have another set of laws that operate on individual elementary particles like electrons and photons. This is still not actually a problem, and thanks to some intense mathematical reasoning and remarkable experiments carried out between 1920 and 1980, our Standard Model is a huge success. One of the last hold-outs in this model was the discovery of the Higgs Boson in 2012, some 50 years after its existence was predicted! But as good as the Standard Model is, there seem to be many loose ends that are like red flags to the inquiring human mind.

One major loose end is that astronomers have discovered what is popularly called ‘dark matter’, and there is no known particle or force in the Standard Model to account for it. Supersymmetry answers the question, why does nature have two families of particles when one would be even simpler? Amazingly, and elegantly, supersymmetry answers this question by showing how electrons, and quarks, which are elementary matter particles, are related to photons and gluons, which are elementary force-carrying particles. But in beautifully unifying the particles and forces, it also offers up a new family of particles, the lightest of which would fit the bill as missing dark matter particles!

This is why physicists are desperately trying to verify supersymmetry, not only to simplify physics, but to explain dark matter on the cosmological scale. As an astronomer, I am rooting for supersymmetry because I do not like the idea that 80% of the gravitating stuff in the universe is not stars and dust, but inscrutable dark matter. Nature seems not to want to offer us this simple option that dark matter is produced by ‘supersymmetric neutralinos’. But apparently Nature may have another solution in mind that we have yet to stumble upon. Time will tell, but it will not be for my generation to discover.

On the cancer-side of the equation, biological systems are gears-within-gears in a plethora of processes and influences. A logically simple idea like the Rituximab treatment looks compelling if you do not look too closely at what the rest of the cancerous B-cells are doing, or how well they like being glommed onto by a monoclonal antibody like Rituximab. No two individuals apparently have the same B-cell surfaces, or the same lymphatic ecology in a nearly-infinite set of genetic permutations, so a direct chemical hit by a Rituximab antibody to one cancerous B-cell may be only a glancing blow to another. This is why I am also rooting for my upcoming Rituximab treatments to be a whopping success. Like supersymmetry, it sure would simplify my life!

The bottom line seems to be that, although our mathematical and logical ideas seem elegant, they are never complete. It is this incompleteness that defeats us, sometimes by literally killing us and sometimes by making our entire careers run through dark forests for decades before stumbling into the light.

 

Check back here on Wednesday, December 28 for the next installment!

Rainbow image credit: Daily Mail: UK
http://www.dailymail.co.uk/news/article-1354580/UK-weather-Rainbow-dominates-skyline-winter-storms.html

Oops…One more thing!

After writing thirteen essays about space, I completely forgot to wrap up the whole discussion with some thoughts about the Big Picture! If you follow the links in this essay you will come to the essay where I explained the idea in more detail!

Why did I start these essays with so much talk about brain research? Well, it is the brain, after all, that tries to create ideas about what you are seeing based on what the senses are telling it. The crazy thing is that what the brain does with sensory information is pretty bizarre when you follow the stimuli all the way to consciousness. In fact, when you look at all the synaptic connections in the brain, only a small number have anything to do with sensory inputs. It’s as though you could literally pluck the brain out of the body and it would hardly realize it needed sensory information to keep it happy. It spends most of its time ‘talking’ to itself.

The whole idea of space really seems to be a means of representing the world to the brain to help it sort out the rules it needs to survive and reproduce. The most important rule is that of cause-and-effect or ‘If A happens then B will follow’. This also forms the hardcore basis of logic and mathematical reasoning!
But scientifically, we know that space and time are not just some illusion, because objectively they seem to be the very hard currency through which the universe represents sensory stimuli to us. How we place ourselves in space and time is an interesting issue in itself. We can use our logic and observations to work out the many rules that the universe runs by that involve the free parameters of time and space. But when we take a deep dive into how our brains work and interface with the world outside our synapses, we come across something amazing.

The brain needs to keep track of what is inside the body, called the Self, and what is outside the body. If it can’t do this infallibly, it cannot keep track of what factors are controlling its survival, and what factors are solely related to its internal world of thoughts, feelings, and imaginary scenarios. This cannot be just a feature of human brains, but has to also be something that many other creatures also have at some rudimentary level so that they too can function in the external world with its many hazards. In our case, this brain feature is present as an actual physical area in the cerebral cortex. When it is active and stimulated, we have a clear and distinct perception of our body and its relation to space. We can use this to control our muscles, orient ourselves properly in space, walk and perform many other skills that require a keen perception of this outside world. Amazingly, when you remove the activity in this area through drugs or meditation, you can no longer locate yourself in space and this leads to the feeling that your body is ‘one’ with the world, your Self has vanished, and in other cases you experience the complete dislocation of the Self from the body, which you experience as Out of Body travel.

What does this have to do with space in the real world? Well, over millions of years of evolution, we have made up many rules about space and how to operate within it, but then Einstein gave us relativity, and this showed that space and time are much more plastic than any of the rules we internalized over the millennia. But it is the rules and concepts of relativity that make up our external world, not the approximate ‘common sense’ ideas we all carry around with us. Our internal rules about space and time were never designed to give us an accurate internal portrayal of moving near the speed of light, or functioning in regions of the outside world close to large masses that distort space.

But now that we have a scientific way of coming up with even more rules about space and time, we discover that our own logical reasoning wants to paint an even larger picture of what is going on and is happy to do so without bothering too much with actual (sensory) data. We have developed for other reasons a sense of artistry, beauty and aesthetics that, when applied to mathematics and physics, has taken us into the realm of unifying our rules about the outside world so that there are fewer and fewer of them. This passion for simplification and unification has led to many discoveries about the outside world that, miraculously, can be verified to be actual objective facts of this world.

Along this road to simplifying physics, even the foundations of space and time become players in the scenery rather than aloof partners on a stage. This is what we are struggling with today in physics. If you make space and time players in the play, the stage itself vanishes and has to somehow be re-created through the actions of the actors themselves. THAT is what quantum gravity hopes to do, whether you call the mathematics Loop Quantum Gravity or String Theory. This also leads to one of the most challenging concepts in all of physics…and philosophy.

What are we to make of the ingredients that come together to create our sense of space and time in the first place? Are these ingredients, themselves, beyond space and time, just as the parts of a chain mail vest are vastly different than the vest that they create through their linkages? And what is the arena in which these parts connect together to create space and time?

These questions are the ones I have spent my entire adult life trying to comprehend and share with non-scientists, and they lead straight into the arms of the concept of emergent structures: The idea that elements of nature come together in ways that create new objects that have no resemblance to the ingredients, such as evolution emerging from chemistry, or mind emerging from elementary synaptic discharges. Apparently, time and space may emerge from ingredients still more primitive, that may have nothing to do with either time or space!

You have to admit, these ideas certainly make for interesting stories at the campfire!

Check back here on Monday, December 26 for the start of a new series of blogs on diverse topics!

Quantum Gravity…Oh my!

So here’s the big problem.

Right now, physicists have a detailed mathematical model for how the fundamental forces in nature work: electromagnetism, and the strong and weak nuclear forces. Added to this is a detailed list of the fundamental particles in nature like the electron, the quarks, photons, neutrinos and others. Called the Standard Model, it has been extensively verified and found to be an amazingly accurate way to describe nearly everything we see in the physical world. It explains why some particles have mass and others do not. It describes exactly how forces are generated by particles and transmitted across space. Experimenters at the CERN Large Hadron Collider are literally pulling out their hair to find errors or deficiencies in the Standard Model that go against the calculated predictions, but have been unable to turn up anything yet. They call this the search for New Physics.

Alongside this accurate model for the physical forces and particles in our universe, we have general relativity and its description of gravitational fields and spacetime. GR provides no quantum-level explanation for how this field is generated by matter and energy. It also provides no description of the quantum structure of matter and forces in the Standard Model. GR and the Standard Model speak two very different languages, and describe two very different physical arenas. For decades, physicists have tried to find a way to bring these two great theories together, and the results have been promising but untestable. This description of gravitational fields that involves the same principles as the Standard Model has come to be called Quantum Gravity.

The many ideas that have been proposed for Quantum Gravity are all deeply mathematical, and only touch upon our experimental world very lightly. You may have tried to read books on this subject written by the practitioners, but like me you will have become frustrated by the math and language this community has developed over the years to describe what they have discovered.

The problem faced by Quantum Gravity is that gravitational fields only seem to display their quantum features at the so-called Planck Scale of 10^-33 centimeters and 10^-43 seconds. I can’t write this blog using scientific notation, so I am using the shorthand that 10^3 means 1000 and 10^8 means 100 million. Similarly, 10^-3 means 0.001, and so on. Anyway, the Planck scale also corresponds to an energy of 10^19 GeV, or 10 billion billion GeV, which is an energy 1000 trillion times higher than current particle accelerators can reach.
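If you're curious where those particular numbers come from, they aren't arbitrary: the Planck length, time, and energy are the unique combinations of Planck's constant, Newton's gravitational constant, and the speed of light that have those dimensions. A few lines of arithmetic with the standard CODATA values reproduce every figure quoted above:

```python
from math import sqrt

# Planck units built from the three fundamental constants (CODATA values).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
J_PER_GEV = 1.602176634e-10   # joules in one GeV

l_planck = sqrt(hbar * G / c**3)              # ~1.6e-35 m, i.e. ~10^-33 cm
t_planck = l_planck / c                       # ~5.4e-44 s, i.e. ~10^-43 s
E_planck = sqrt(hbar * c**5 / G) / J_PER_GEV  # ~1.2e19 GeV

lhc_energy = 1.3e4                            # LHC collision energy, GeV
print(f"length: {l_planck:.2e} m, time: {t_planck:.2e} s")
print(f"energy: {E_planck:.2e} GeV, or {E_planck/lhc_energy:.0e} x the LHC")
```

The last ratio lands near 10^15, which is where the “1000 trillion times higher than current particle accelerators” comparison comes from.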

There is no known technology that can reach the scales where these effects can be measured in order to test these theories. Even the concept of measurement itself breaks down! This happens because the very particles (photons) you try to use to study physics at the Planck scale carry so much energy that they turn into quantum black holes, and are unable to tell you what they saw or detected!

One approach to QG is called Loop Quantum Gravity.  Like relativity, it assumes that the gravitational field is all there is, and that space and time become grainy or ‘quantized’ near the Planck Scale. The space and time we know and can experience in-the-large is formed from individual pieces that come together in huge numbers to form the appearance of a nearly-continuous and smooth gravitational field.

The problem is that you cannot visualize what is going on at this scale because it is represented in the mathematics, not by nuggets of space and time, but by more abstract mathematical objects called loops and spin networks. The artist rendition above is just that.

So here, as for Feynman Diagrams, we have a mathematical picture that represents a process, but the picture is symbolic and not photographic. The biggest problem, however, is that although it is a quantum theory for gravity that works, Loop Quantum Gravity does not include any of the Standard Model particles. It represents a quantum theory for a gravitational field (a universe of space and time) with no matter in it!

In other words, it describes the cake but not the frosting.

The second approach is string theory. This theory assumes there is already some kind of background space and time through which another mathematical construct, called a string, moves. Strings that form closed loops can vibrate, and each pattern of vibrations represents a different type of fundamental particle. To make string theory work, the strings have to exist in 10 dimensions, and most of these are wrapped up into closed balls of geometry called Calabi-Yau spaces. Each of these spaces has its own geometry within which the strings vibrate. This means there can be millions of different ‘solutions’ to the string theory equations: each a separate universe with its own specific type of Calabi-Yau subspace that leads to a specific set of fundamental particles and forces. The problem is that string theory violates general relativity by requiring a background space!

In other words, it describes the frosting but not the cake!

One solution proposed by physicist Lee Smolin is that Loop Quantum Gravity is the foundation for creating the strings in string theory. If you looked at one of these strings at high magnification, its macaroni-like surface would turn into a bunch of loops knitted together, perhaps like a Medieval chainmail suit of armor. The problem is that Loop Quantum Gravity does not require a gravitational field with more than four dimensions (3 of space and one of time), while strings require ten or even eleven. Something is still not right, and right now, no one really knows how to fix this. Lacking actual hard data, we don’t even know if either of these theories is closer to reality!

What this hybrid solution tries to do is find aspects of the cake that can be re-interpreted as particles in the frosting!

This work is still going on, but there are a few things that have been learned along the way about the nature of space itself. At our scale, it looks like a continuous gravitational field criss-crossed by the worldlines of atoms, stars and galaxies. This is how it looks even at the atomic scale, because now you get to add in the worldlines of innumerable ‘virtual particles’ that make up the various forces in the Standard Model. But as we zoom down to the Planck Scale, space and spacetime stop being smooth like a piece of paper, and start to break up into something else, which we think reveals the grainy nature of gravity as a field composed of innumerable gravitons buzzing about.

But what these fragmentary elements of space and time ‘look’ like is impossible to say. All we have are mathematical tools to describe them, and like our attempts at describing the electron, they lead to a world of pure abstraction that cannot be directly observed.

If you want to learn a bit more about the nature of space, consider reading my short booklet ‘Exploring Quantum Space’, available at amazon.com. It describes the amazing history of our learning about space from ancient Greek ‘common sense’ ideas, to the highlights of mind-numbing modern quantum theory.

Check back here on Thursday, December 22 for the last blog in this series!