“A gamma-ray burst from a star a few thousand light-years away would have a threefold impact on Earth. First it would tear up nitrogen and oxygen molecules in the atmosphere and they would combine to form nitrous oxides. These would eat up the ozone layer causing a flood of ultraviolet radiation to reach Earth. The smoggy oxides would darken the atmosphere, and cool the Earth. The nitrous oxides would rain as nitric acid, devastating vegetation. Did I say nitrous oxide? Does that mean we would die laughing?”

Late light reveals what space is made of
BY Anil Ananthaswamy / 12 August 2009

ON THE night of 30 June 2005, the sky high above La Palma in Spain’s Canary Islands crackled with streaks of blue light too faint for humans to see. Atop the Roque de los Muchachos, the highest point of the island, though, a powerful magic eye was waiting and watching. MAGIC – the Major Atmospheric Gamma-ray Imaging Cherenkov Telescope – scans the sky each night for high-energy photons from the distant cosmos. Most nights, nothing remarkable comes. But every now and again, a brief flash of energetic light bears witness to the violent convulsions of a faraway galaxy. What MAGIC saw on that balmy June night came like a bolt from the blue. That is because something truly astounding may have been encoded in that fleeting Atlantic glow: evidence that the fabric of space-time is not silky smooth as Einstein and many others have presumed, but rough, turbulent and fundamentally grainy stuff.

It is an audacious claim that, if verified, would put us squarely on the road to a quantum theory of gravity and on towards the long-elusive “theory of everything”. If it were based on a single chunk of MAGIC data, it might easily be dismissed as a midsummer night’s dream. But it is not. Since that first sighting, other telescopes have started to see similar patterns. Is this a physics revolution through the barrel of a telescope? Such incendiary thoughts were far away from Robert Wagner’s mind when the MAGIC data filtered through to the Max Planck Institute of Physics in Munich, Germany, the morning after. He and his fellow collaborators were enjoying a barbecue. Not for long. “We put our beers aside and started downloading the full data set,” says Wagner.

It was easy to pinpoint the source of the data blip – a 20-minute burst of hugely energetic gamma rays from a galaxy some 500 million light years away known as Markarian 501. Its occasional tempestuous outbursts had already made it familiar to gamma-ray telescopes worldwide. This burst was different. As Wagner and his colleagues analysed the data in the weeks and months that followed, an odd pattern emerged. Lower-energy photons from Markarian 501 had outpaced their higher-energy counterparts, arriving up to 4 minutes earlier (Physics Letters B, vol 668, p 253).

This should not happen. If an object is 500 million light years away, light from it always takes 500 million years to get to us, no more, no less. Whatever their energy, photons always travel at the same speed, the implacable cosmic speed limit: the speed of light. Perhaps the anomaly has a mundane explanation. We do not really understand the processes within objects such as Markarian 501 that accelerate particles to phenomenal energies and catapult them towards us. They are thought ultimately to have something to do with the convulsions of supermassive black holes at the objects’ hearts. It could be that these mechanisms naturally spew out low-energy particles before high-energy ones. Or they might not. “The more fascinating explanation would be that this delay is not intrinsic to the source, but that it happens along the way from the source to us,” says Wagner.

Quantum signature
What piqued the interest of Wagner and his colleagues was that the MAGIC observations were showing just the sort of effect that quite a few models of quantum gravity predict. Physicists have been on the lookout for experimental signposts to the right theory for the best part of a century. “All approaches to quantum gravity, in their own very different ways, agree that empty space is not so empty after all,” says theorist Giovanni Amelino-Camelia of Sapienza University of Rome in Italy. Many models based on string theory suggest that space-time is a foamy froth of particles, and even microscopic black holes, that spark up out of nothing and disappear again with equal abandon. The alternative approach favoured by Amelino-Camelia, loop quantum gravity, posits that space-time comes in indivisible chunks of about 10^-35 metres, a size known as the Planck length.

Last year, it was suggested that the signature of just such a quantum space-time had popped up in unexplained noise plaguing a gravitational-wave detector in northern Germany (New Scientist, 17 January 2009, p 24). But that interpretation is far from a done deal, and most experts agree that a more substantive sighting could only come from observing the possible interactions of space-time with particles passing through it. According to many string theory models, particles of different energies should speed up or slow down by different amounts as they interact with a foamy space-time. A minimum size for space-time grains, as predicted by loop quantum gravity, could violate the cherished principle of special relativity known as Lorentz invariance, which states that the maximum speed of all particles, regardless of their energy, is the speed of light in a vacuum.

The trouble is that these effects would be observable only with particles far more energetic than even the beefiest terrestrial particle accelerators can produce. Even if we could make these particles, the tiny interactions between them and the fabric of space-time would not add up to a hill of beans, even over many laps of the Large Hadron Collider’s 27-kilometre-long loop at CERN, near Geneva, Switzerland. But summed over hundreds of millions or billions of light years, such interactions could account for the MAGIC travel-time anomaly. It looks like nature might have provided us with particle accelerators – distant galaxies – whose products could, for the first time, allow us to test predictions of quantum gravity against hard experimental evidence.

As yet, we have only seen a handful of gamma-ray bursts of the energy and intensity needed to see whether the delay effect is a consistent feature. In July 2006, the High Energy Stereoscopic System (HESS), an array of gamma-ray telescopes in the desert of Namibia, saw a high-energy flare erupt from an active galaxy nearly four times as far away as Markarian 501. The burst contained marginal evidence for a time-lag of around half a minute for the most energetic photons, which were considerably less energetic than those in the flare spotted by MAGIC. The uncertainties in the data resulting from the detection process, however, made a definitive statement impossible (Physical Review Letters, vol 101, p 170402).

It is recent results from NASA’s Fermi Gamma-ray Space Telescope, launched last year, that provide the most tantalising glimpse yet of something extraordinary going on out there. Last September, it spied a burst of gamma rays from a source nearly 12 billion light years away. According to an analysis by Amelino-Camelia and Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, the zippiest low-energy photons beat some of the high-energy stragglers to Earth by anything up to 20 minutes. Two much closer bursts seem to contain much smaller delays.

The individual observations are pretty consistent with each other, too, says theorist John Ellis at CERN. He and colleagues have taken data from the MAGIC and HESS bursts to calibrate a theoretical model inspired by string theory that assumes the delay effect increases linearly with distance and photon energy. Using it to estimate the delay that the highest-energy photon in the Fermi space telescope’s September burst should have experienced, they came up with a figure of 25 seconds, plus or minus 11 seconds. What Fermi had measured for that particular photon was 16.5 seconds – within the model prediction’s admittedly large margin of error.
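A linear model of this kind lends itself to a back-of-the-envelope sketch. The Python snippet below assumes the simplest possible form, a delay growing linearly with photon energy and light-travel distance, and uses an illustrative quantum-gravity energy scale near the Planck energy; it is not the parameterisation Ellis and colleagues actually fitted, which also accounts for cosmological expansion.

```python
# Back-of-the-envelope sketch of a linear dispersion model:
# delay = (photon energy / quantum-gravity scale) * light travel time.
# E_QG below is an illustrative value near the Planck energy, NOT the
# scale fitted by Ellis and colleagues; cosmological expansion ignored.

SECONDS_PER_YEAR = 3.156e7   # light takes one year per light year
E_QG_GEV = 1.2e19            # assumed quantum-gravity energy scale (GeV)

def delay_seconds(photon_energy_gev: float, distance_ly: float) -> float:
    """Arrival delay of a high-energy photon relative to a low-energy
    one, under delta_t = (E / E_QG) * (D / c)."""
    travel_time_s = distance_ly * SECONDS_PER_YEAR  # D/c in seconds
    return (photon_energy_gev / E_QG_GEV) * travel_time_s

# A ~10 TeV photon from a source 500 million light years away,
# roughly the Markarian 501 geometry:
print(delay_seconds(1e4, 5e8))
```

Doubling either the photon energy or the distance doubles the predicted delay, which is why comparing bursts at many different distances, as Ellis suggests, can separate propagation effects from effects at the source.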

The only way to find out conclusively whether the delays are a consistent signature of a quantised or foam-like space-time, says Ellis, is to get more data – ideally from sources at many different distances. “Then we’ll be able to see whether we can distinguish between effects at the source and effects in the propagation,” he says.

Worldwide cover
We also need to observe the same burst with more than one instrument. Each telescope is sensitive to a different energy range, owing to its altitude and detector set-up. Combining different data sets will provide a wider spread of energies from which to tease out any energy-dependent effect, and also help us get round a persistent irritant to consistent astronomical observations: Earth’s rotation. Not only does our planet’s spin mean that multitudes of photons from the sun overwhelm any cosmic source for a large proportion of the day, but it also makes observing a highly directed beam of gamma rays from one specific direction tricky, even at night: as you train your telescope on your target, the Earth moves beneath your feet and eventually the source slips out of sight.

That means MAGIC can observe any burst for a maximum of only 6 hours on any given night, assuming it is pointing in the right direction when a new burst arrives. That period could be doubled by using it in conjunction with a similar instrument – the Very Energetic Radiation Imaging Telescope Array System (VERITAS) – that sits atop Mount Hopkins in southern Arizona.

A further gamma-ray telescope, the Major Atmospheric Cherenkov Telescope Experiment (MACE), 4500 metres up on the Tibetan plateau in the remote region of Ladakh, India, will open that observational window still further. When completed in 2011, MACE will be the highest-altitude gamma-ray telescope in the world, capable of observing gamma rays with a wide range of energies. “Then we will have another observatory 5 to 6 hours in front of MAGIC,” says Wagner. “That could lead the way to a continuous, 24-hour observation of certain objects.”

What with that and the new high-accuracy data from the Fermi space telescope, gamma-ray telescopes could well uncover quantum space-time within the next few years. Even so, they still might be beaten to the line. The definitive answer might come from a very different source, and a very different quarter of Earth’s surface – the South Pole.

That is because a cubic kilometre of ice under the South Pole will soon be home to the IceCube Neutrino Observatory, whose strings of detectors will watch for faint flashes of blue light emitted when neutrinos from cosmic sources smash into the Antarctic ice. Neutrinos are ghostly particles thought to be produced in the same violent events that produce high-energy gamma rays. As yet, we have not seen any neutrinos from outside our galaxy, barring some that burst on us from a supernova in a neighbouring galaxy, the Large Magellanic Cloud, in 1987. The neutrinos we do see are lower-energy ones that come from nuclear reactions in the sun and particle interactions in Earth’s atmosphere. IceCube aims to change that.

And it could see something big. Because the quantum-mechanical wavelengths associated with neutrinos of the very highest energies are even smaller than those of high-energy photons, they could be more susceptible to disruption through interactions with a space-time that is grainy on very small scales. Francis Halzen of the University of Wisconsin, Madison, who leads the IceCube experiment, has calculated together with his colleagues that in one favoured model of quantum space-time such interactions could dramatically speed up higher-energy neutrinos (Physical Review D, vol 72, p 065019). “It’s a beautiful signal that could not be explained by conventional astrophysics,” he says.

Humble constructions
That’s not the only attractive property of neutrinos when it comes to testing the idea of a frothy space-time, says Dan Hooper of Fermilab in Batavia, Illinois. Neutrinos come in three distinct “flavours”, named after the chunkier particles they are associated with – the electron, the muon and the tau. They tend to morph back and forth between these different states as they travel, a phenomenon known as neutrino oscillation. If a distant source is emitting only electron neutrinos, theory tells us how many should have changed flavours by the time they reach us.

If neutrinos were interacting with the quantum foam, though, they would forget their original flavour along the way, leading to equal numbers of all flavours by the time they arrive here. “That effect would be hard to explain with normal astrophysics,” says Hooper. He suggests a possible, albeit disputed, source of electron neutrinos in the Cygnus region of the Milky Way that could be ripe for investigation (Physics Letters B, vol 609, p 206).
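The flavour argument can be illustrated with a toy calculation. The sketch below contrasts standard averaged oscillations with full quantum-foam decoherence, using the idealised “tri-bimaximal” mixing matrix as a textbook stand-in for the measured neutrino mixing parameters; the real PMNS matrix differs somewhat, but the qualitative contrast is the same.

```python
import numpy as np

# Tri-bimaximal mixing matrix: an idealised stand-in for the measured
# PMNS neutrino mixing matrix (a common textbook approximation).
U = np.array([
    [np.sqrt(2/3),  np.sqrt(1/3),  0.0],
    [-np.sqrt(1/6), np.sqrt(1/3), -np.sqrt(1/2)],
    [-np.sqrt(1/6), np.sqrt(1/3),  np.sqrt(1/2)],
])

def arrival_flavours(source, decohered: bool):
    """Flavour ratios at Earth from source ratios (e, mu, tau).
    Standard averaged oscillations: P(a->b) = sum_i |U_ai|^2 |U_bi|^2.
    Full quantum-foam decoherence: everything relaxes to 1/3 each."""
    source = np.asarray(source, dtype=float)
    source = source / source.sum()
    if decohered:
        return np.full(3, 1/3)
    A = np.abs(U) ** 2
    P = A @ A.T  # distance-averaged transition probabilities
    return P.T @ source

print(arrival_flavours([1, 0, 0], decohered=False))  # standard oscillations
print(arrival_flavours([1, 0, 0], decohered=True))   # foam-induced decoherence
```

For a pure electron-neutrino source, ordinary averaged oscillations leave an excess of electron flavour at Earth, whereas decoherence washes the ratios out to exactly 1:1:1 – the “hard to explain with normal astrophysics” signature Hooper describes.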

Uncertainties in models of neutrino oscillations make exact calculations of the expected extent of the flavour-equalising effect difficult, as Hooper himself points out. And even if we do strike it lucky and find indisputable signs that either neutrinos or gamma rays are being affected by the structure of space-time, it will be a long, hard slog to convert that evidence into a viable theory of quantum gravity. Amelino-Camelia likens the situation now to that of a century ago, when anomalous observations – such as the spectrum of black-body radiation, or the photoelectric effect – that could not be explained by classical means set physics on the decades-long path towards a fully fledged quantum theory. It did not come easy.

And so it will be for quantum gravity. “We have to build, humbly, very humbly, from what we know,” says Amelino-Camelia. “Construct simple theories, which are very far from being a theory of everything, but intelligible enough that they can guide us to the next spark.” Whether on Atlantic islands, in the Himalayas, deep in the Antarctic ice or high above Earth’s atmosphere, watchful eyes are waiting for signs from the universe’s quantum fabric.

Quantum gravity: why we care
On the scale of profound things in physics, quantum gravity scores an easy 10 out of 10. Currently, three of the four fundamental forces of nature can be explained by the exchange of force-carrying particles that follow the rules of quantum theory. Gravity cannot. According to Einstein’s general theory of relativity, the force arises from the smooth warping of space-time by massive objects. As such, it remains resolutely outside the purview of quantum physics.

That must change, physicists agree. Without a quantum theory of gravity, we not only lack an overarching theory of the workings of the world, but we are also never going to be able to probe back to the first tiny fractions of a second after the big bang – a crucial and eventful period in the evolution of the universe.

The trouble is, there is no agreement on how to get to that theory. String theory, the avenue preferred by most physicists, melds gravity and quantum mechanics by arguing that everything in nature arises from the vibration of tiny strings in 10-dimensional space-time. It has been roundly criticised, though, for failing to come up with any prediction that experiments might verify. A rival approach, called loop quantum gravity, shows mathematically that space-time is woven out of loops of gravitational field lines. In the evidence stakes, it has fared no better.

“For many decades, research on quantum gravity was being monopolised by the idea that we needed to get a perfect theory, with geniuses producing perfect mathematics, and with no guidance from experiments,” says Giovanni Amelino-Camelia of Sapienza University of Rome in Italy. The geniuses desperately need something to tether their models to reality. For that, they could do with a touch of MAGIC.

Probing quantum gravity with gamma ray bursters : Researchers use gamma ray bursters to show the influence of quantum gravity on the refractive index of the vacuum
BY Chris Lee / August 23, 2007

With quantum mechanics the undisputed king of the small, and general relativity governing the very large, physics has developed fantastic descriptive power. With one exception: where quantum mechanics meets general relativity, neither theory is very good. This meeting point occurs when there is a lot of energy in a very small space. Even though the tiny distances suggest that quantum mechanics should rule, the energy tells us that gravity is just as influential. However, our understanding of gravity is based on a smooth space-time, while quantum mechanics tells us that, at some scale, everything is discrete. Applying general relativity to a discrete space-time yields absurd results. So, for the last 40-odd years, physicists have been searching for a way to unify general relativity and quantum mechanics.

Despite the fact that there is no unified theory, we know a lot about the features it must possess. This gives experimental physicists and astronomers something to do while the theorists sit around waiting for their muse to turn up. One common feature of most quantum gravity theories is the prediction that the vacuum will be dispersive for very high-energy photons (e.g., 150 GeV or more). That implies that higher-energy photons experience a slightly higher refractive index than lower-energy photons, which means the high-energy photons travel more slowly through the vacuum and arrive later than the low-energy photons.

In research submitted to Physical Review Letters, a large collaboration of astronomers is hoping to publish the first observational evidence for quantum gravity. By observing the burst characteristics of two gamma-ray bursters, the team was able to conclude that the vacuum is indeed dispersive. This is not quite as simple as it sounds: getting the data was only the first part. To show that the high-energy photons did indeed travel more slowly, they also had to reconstruct the emission profile of the source. That is, using only the data they collected, they had to figure out when the photons were emitted relative to each other.

Various source-reconstruction methods were used. For instance, dispersion can only lengthen the apparent duration of a gamma-ray burster’s flare, so the dispersion can be estimated by “undoing” trial amounts of it until as much energy as possible is concentrated in the most active part of the flare. In a related method, the total duration of the flare is minimized, giving another value for the dispersion. They found that both methods give overlapping values for the time delay and for a related quantum gravity mass (a scaling parameter used to obtain the dispersion). As befits a groundbreaking observation, the error bars are rather large, coming in at around +70 percent and -25 percent, and they really needed that tighter lower bound.
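The “undo the dispersion” idea can be demonstrated on simulated data. In the sketch below a known linear delay is injected into invented photon arrival times, and a scan over trial coefficients recovers it by making the de-dispersed flare as compact as possible. All the numbers are made up for illustration; they are not taken from the paper.

```python
import random

# Demonstration of dispersion recovery by flare compactness:
# inject a known linear delay (seconds per GeV) into simulated photon
# arrival times, then scan trial coefficients and keep the one that
# minimises the spread of the de-dispersed flare. Invented numbers.

random.seed(1)
TRUE_K = 0.020  # injected dispersion, seconds of delay per GeV

# Photons: (emission time within a 1-second flare, energy in GeV)
photons = [(random.uniform(0.0, 1.0), random.uniform(50.0, 500.0))
           for _ in range(2000)]
arrivals = [(t + TRUE_K * e, e) for t, e in photons]  # dispersed at Earth

def dedispersed_spread(k: float) -> float:
    """Variance of arrival times after subtracting a trial delay k*E."""
    times = [t - k * e for t, e in arrivals]
    mean = sum(times) / len(times)
    return sum((x - mean) ** 2 for x in times) / len(times)

# Scan trial values of k; the minimum-spread k estimates the dispersion.
candidates = [i * 0.001 for i in range(0, 41)]
best_k = min(candidates, key=dedispersed_spread)
print(best_k)
```

Because emission times and energies are uncorrelated at the source, any residual dispersion smears the flare out, so the variance is minimised almost exactly at the injected coefficient. The real analysis faces the harder problem that the true emission profile is unknown, which is why the collaboration had to validate the method on computer-generated profiles.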

Their results, a time difference of 3 to 4 seconds after traversing huge distances, rely heavily on the reconstruction of the source’s emission profile, so they also checked that using computer models. Essentially, they made up a bunch of sources that gave photons arbitrary emission times and tested their ability to calculate the vacuum dispersion on those. More specifically, they generated source profiles by computer, then passed them through a simulated vacuum that was either dispersive or non-dispersive. In all cases they recovered the original source profile to within the uncertainty of the experimental observations. This tells us that the source profile is probably the leading cause of uncertainty in these observations.

Has quantum gravity made a sudden leap forward? Probably not, but this is the first real data against which such a theory can be tested, which means that theorists will suddenly have to start paying attention to experimental results again and modify their theories appropriately.


Robert Marcus Wagner
email : rwagner [at] mppmu.mpg [dot] de / rw [at] rwagner [dot] de

Giovanni Amelino-Camelia
email : amelino [at] roma1.infn [dot] it

Francis Halzen
email : halzen [at] icecube.wisc [dot] edu

Dan Hooper
email : dhooper [at] fnal [dot] gov

John Ellis
email : john.ellis [at] cern [dot] ch

Lee Smolin
email : lsmolin [at] perimeterinstitute [dot] ca


This is an idea that was originally proposed by the physicist John Wheeler back in the early 1960s to describe what space-time ‘looks like’ at scales of 10^-33 centimeters. The basic idea is that gravity is a field with many of the same fundamental properties as the other fundamental ‘force’ fields in Nature. This means that the state of this field is, at some level, uncertain and described by quantum mechanics. Since Einstein’s general theory of relativity requires that gravitational fields and space-time be one and the same mathematical objects, this means that space-time itself is also subject to the kinds of uncertainty required by quantum systems. This indeterminacy means that you cannot know with infinite precision BOTH the geometry of space-time, and the rate of change of the space-time geometry, in direct analogy with Heisenberg’s Uncertainty Principle for quantum systems.

Wheeler imagined that this indeterminacy for space-time required that at the so-called Planck Scale of 10^-33 centimeters and 10^-43 seconds, space-time has a foaminess to it with sudden changes in its geometry into a wealth of complex shapes and textures. You would have quantum black holes appear at 10^-33 centimeters, then evaporate in 10^-43 seconds. Wormholes would form and dissolve, and later theorists even postulated ‘baby universe’ production could happen under these conditions. The problem is that we have no evidence that 1) gravity is a quantum field and 2) that space-time has this type of structure at these scales.

John Hagelin
email : JHagelin [at] malawpc [dot] com

Is the fabric of the Universe a seething mass of black holes and wormholes?
BY Michael Brooks Lewes / New Scientist / 19 June 1999

On your kitchen table are the following implements: a chainsaw, a wooden mallet and a pair of boxing gloves. Your mission, should you choose to accept it, is to use one of these tools to split an atom. It is, of course, a ridiculous assignment, but it would sound like child’s play to researchers studying quantum gravity. They believe that the very fabric of space-time is a seething foam of wormholes and tiny black holes a hundred billion billion times smaller than a proton. But the experimental tools available to test this idea are absurdly clumsy: the best particle accelerators can barely examine scales a million billion times larger.

“Many people have said it’s going to be impossible to test quantum gravity, so there’s no use even thinking about it,” says John Ellis, a theorist at CERN, the Geneva-based European centre for particle physics. But, he says, it’s too important to ignore. Quantum gravity is needed to describe the first instants of creation, when quantum fluctuations ruled the Universe, and it could even lead us to a full understanding of how our Universe works: the elusive Theory of Everything that will tie all the forces of nature together. “This is the grand theoretical challenge the 20th century has left physics to solve in the 21st century,” says Ellis. “Even if it looks hopeless you should nevertheless think about it.”

Astonishingly, it doesn’t look hopeless any more. Since the beginning of this year, physicists have proposed a handful of foam-probing experiments that could shed light on quantum gravity. Against all the odds, they can now embark on a journey down to the lowest level of reality, where quantum mechanics and gravity meet. Quantum mechanics describes how particles interact with each other to generate all but one of the forces in nature. So most physicists believe it must work for gravity, too. But how? The best description of gravity we have is Einstein’s theory of general relativity, which says that what we feel as gravity is actually the effect of curved space-time. General relativity works beautifully for gravitational forces in the Universe, successfully predicting the existence of such outlandish objects as black holes.

But problems are looming, Ellis says. “We know there are inconsistencies in these theories. It’s just a question of when the inconsistencies are going to show up in the data.” The best solution would be to find the underlying theory from which relativity and quantum mechanics can be inferred. There’s no telling what insights such a theory would yield. Physicists struggling to marry Einstein with quantum mechanics have already made one startling discovery. In 1971, Russian physicist Yakov Zel’dovich guessed that black holes aren’t truly black, but instead combine with quantum-mechanical fluctuations to emit photons and other particles. Stephen Hawking proved the idea three years later, and these emissions are now called Hawking radiation.

All fledgling theories of quantum gravity also make a more general and even weirder prediction: the structure of space and time is very different from the gentle curves predicted by general relativity. The American physicist John Wheeler realised in the 1950s that if you look at things on a scale of about 10^-35 metres, quantum fluctuations become powerful enough to play tricks with the geometry of the Universe. Space and time break down into “fuzziness” or “foaminess”. A spaceship that size could find itself negotiating virtual black holes, or getting sucked into one wormhole after another and tossed back and forth in time and space.

If you think this idea of a space-time foam sounds horribly vague, you’re in good company. So do the researchers. “It’s a very vague thing,” says Chris Isham, a theoretician at Imperial College, London. “General relativity is about space-time, and quantum theory tends to involve quantum fluctuations in things. Therefore, if you talk about quantum gravity, there might be some sort of fluctuation in something to do with space-time. It’s that sort of level of argument.”

In the race to create a more substantial theory of quantum gravity, there are two main contenders. Abhay Ashtekar of Pennsylvania State University contends that space and time aren’t fundamental properties of the Universe. Instead, they are supposed to emerge from a purely mathematical theory (“Beyond space and time”, New Scientist, 17 May 1997, p 38). But impressive as the mathematical framework is, no one is sure how to pull physical realities, like space, time and gravity, from it.

Cat’s cradle
The other idea is based on superstrings: minuscule loops or strings about 10^-35 metres long, floating through space-time. Matter arises from specific kinds of vibration in these strings, just as notes are the result of certain vibrations of a violin string. There are a huge number of variants of the strings idea, but researchers believe that they are merely different versions of a single, all-encompassing structure called M-theory (“Into the eleventh dimension”, New Scientist, 18 January 1997, p 32). This is physicists’ favourite Theory of Everything, with the potential to unite all the forces of nature and explain the properties of every subatomic particle. But it is still in its infancy, and so far has little to say about how quantum gravity manifests itself in the Universe.

Giovanni Amelino-Camelia of the University of Neuchâtel in Switzerland decided not to wait around for the theorists to agree on what exactly is going on. Earlier this year, he published some calculations in Nature which imply that quantum gravity is accessible to experiments after all. If space-time is a frothing mess, he reasoned, the distance between two objects should always have some random fluctuations as the bubbles constantly form and burst. And by measuring the amounts of fluctuation, we might be able to rule out some of the theories, or even discover some real quantum foam.

So rather than the usual tool of fundamental physics, a superpowerful particle accelerator, what he needed was a good tape measure. The California Institute of Technology has just such a device. Its interferometer splits a laser beam in two, and bounces the resulting beams off two mirrors, each 40 metres away but in different directions (see Diagram). The reflected beams are then recombined, producing an interference pattern that reveals tiny changes in the paths they took to reach the mirrors. If the path lengths fluctuate, the interference pattern will fluctuate too: it will be “noisy”.

Amelino-Camelia compared the noise levels in the Caltech interferometer with the noise that quantum gravity theories predict. So far, he reckons this experiment has seen off at least one approach to quantum gravity. Theories based on “deformed Poincaré symmetry” say that quantum mechanics distorts certain symmetries of space-time: its immunity to rotation, inversion and other similar changes. But it turns out that that would produce bigger random fluctuations than the Caltech system’s noise limit, so Amelino-Camelia politely suggests that this approach is almost certainly wrong. This is no mean feat, as the fluctuations he’s talking about are equivalent to a change of 1 metre in the diameter of the Universe.

That still leaves superstrings and the Ashtekar approach undamaged. But finally, quantum gravity theories are tethered on an experimental leash, and there are other plans in the making to help pin down this fuzzy foaminess. Last year, working with Amelino-Camelia and researchers from the University of Athens, Houston Advanced Research Center and Texas A&M University, Ellis suggested using gamma-ray bursts. These flashes of high-energy photons arrive at Earth from the other side of the cosmos, and if they have travelled through a space-time that is fuzzy, says Ellis, they should have become distorted. Roughly speaking, the shorter wavelength photons in the burst should arrive at Earth later than their long wavelength companions, because they fall down the microscopic holes in space-time more easily. Using today’s gamma-ray detectors, it should be possible to see this effect. Unfortunately, the researchers are still working out exactly what a quantum gravity signature would look like.

Decay and transformation
Ellis has helped to develop yet another plan for unveiling quantum gravity, one first suggested in 1995. The delicate physics of neutral kaons, subatomic particles that exist for less than a millionth of a second, could be affected by quantum fluctuations in space-time. Kaons and their antiparticles (antikaons) decay and transform into each other, but they do it at very slightly different rates. Ellis believes that quantum gravity may affect, in a very small way, these decay and transformation rates. As with the gamma-ray bursts, predicting the effect precisely is still beyond the theorists, but it might be possible to isolate it in future particle accelerator experiments.

While we wait for these experiments to mature, a new generation of interferometers could eliminate a few more theories. These interferometers are designed to search for another peculiar gravitational phenomenon: gravity waves. Although gravity waves have nothing to do with quantum gravity directly, they could still have a big impact on its theory-makers. When massive objects such as stars move very suddenly, general relativity says that they should send space-time ripples out across the Universe. Astrophysicists hope to see these gravity waves emitted by supernova explosions, or by black holes orbiting one another or even colliding.

The biggest new gravity-wave detector, the Laser Interferometer Gravitational-Wave Observatory (LIGO), is being built at Hanford in Washington State, and Livingston, Louisiana (two versions are needed to rule out the effects of seismic waves). As in the Caltech interferometer, laser light from a single source is split and sent down two perpendicular arms, and reflected by mirrors suspended at the end of each. But LIGO’s arms are 4 kilometres long, and two more mirrors at the junction of the arms send the light back along the same path so the beams can bounce back and forth many times before recombining. A gravitational wave passing through this apparatus would change the lengths of the two arms by different amounts, and so change the interference pattern caused when the two light beams recombine.

When it is fully operational by 2002, LIGO will be the world’s largest precision optical instrument. The device is so sensitive that, despite its massive scale, it should detect movements in the mirrors as small as 10⁻¹⁸ metres, or a thousandth of the diameter of a proton. VIRGO, a slightly smaller European interferometer, will have about the same sensitivity.

Amelino-Camelia says LIGO’s noise levels will set new limits on quantum gravity. Mark Coles, head of the LIGO Livingston observatory, is unsure. “We don’t have any operational experience as yet, so all the predictions of noise performance are simply extrapolations from the Caltech interferometer.” But even if that is true, there is a grander scheme to look forward to. LISA, the Laser Interferometer Space Antenna project, will consist of six spacecraft arranged in pairs at the corners of an equilateral triangle orbiting the Sun, an interferometer stretching over millions of kilometres. LISA is due for completion in 2015.

In the meantime, atom interferometry could provide yet another avenue for quantum gravity research. Ian Percival, a theoretical physicist at London University’s Queen Mary and Westfield College, believes that atom interferometers, which replace laser light with a beam of atoms, should be able to detect fluctuations in the time element of the foam.

It’s not just space that is beaten to a froth: time is also stretched and squashed, fluctuating by around 10⁻⁴⁴ seconds as the bubbles appear and disappear. Small, but possibly detectable, Percival says. According to quantum mechanics, atoms have a wave-like nature, so a single atom can be split into two separate waves and sent along two different paths. When the two atomic waves recombine, any difference in their “internal clocks” due to the effects of quantum gravity should destroy the atomic wave interference pattern.

Steven Chu of Stanford University and Mark Kasevich of Yale University have managed to separate atomic wave packets by 1 centimetre before recombining them. They saw an interference pattern. According to Percival, that could be interpreted in two ways. Either space-time fluctuations don’t exist, in which case quantum gravity theories are in real trouble, or both paths experienced the same fluctuations. He favours the latter: the fluctuations could be “correlated” over these distances, he says. They might even spread from one place to another. As yet, however, no one really knows.
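
Percival’s two interpretations can be illustrated with a toy simulation. The numbers here are made up for visibility (the jitter is nothing like 10⁻⁴⁴ seconds): each arm’s internal clock picks up a random offset, and the fringe contrast is the average of the cosine of the resulting phase difference. Identical jitter on both arms leaves perfect fringes; independent jitter washes them out.

```python
import math
import random

# Toy model of Percival's argument, with made-up numbers.
random.seed(1)
omega = 1.0e10      # internal clock frequency (arbitrary units)
sigma_t = 1.0e-10   # clock jitter per shot (arbitrary, chosen for effect)

def contrast(correlated, shots=20000):
    """Average fringe visibility: 1.0 means perfect fringes, 0 means none."""
    total = 0.0
    for _ in range(shots):
        t1 = random.gauss(0.0, sigma_t)
        t2 = t1 if correlated else random.gauss(0.0, sigma_t)
        # Phase difference between the two arms for this shot.
        total += math.cos(omega * (t1 - t2))
    return total / shots

print(f"correlated jitter:  {contrast(True):.2f}")   # fringes survive
print(f"independent jitter: {contrast(False):.2f}")  # fringes fade
```

Since Chu and Kasevich did see an interference pattern, Percival’s preferred reading corresponds to the first case: whatever the foam does, it does the same thing to both paths.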

Few people believe that a satisfactory theory of quantum gravity is just around the corner. “It may be that the actual theory is so different from anything we know about that we are hundreds of years away from it,” Ellis says. But now that experiments are becoming possible, things are looking up. Eventually we should narrow in on one true description of the fabric of the Universe. The apple, one might say, has fallen from the tree.

You are made of space-time
BY Davide Castelvecchi & Valerie Jamieson / 12 August 2006

Lee Smolin is no magician. Yet he and his colleagues have pulled off one of the greatest tricks imaginable. Starting from nothing more than Einstein’s general theory of relativity, they have conjured up the universe. Everything from the fabric of space to the matter that makes up wands and rabbits emerges as if out of an empty hat. It is an impressive feat. Not only does it tell us about the origins of space and matter, it might help us understand where the laws of the universe come from. Not surprisingly, Smolin, who is a theoretical physicist at the Perimeter Institute in Waterloo, Ontario, is very excited. “I’ve been jumping up and down about these ideas,” he says. This promising approach to understanding the cosmos is based on a collection of theories called loop quantum gravity, an attempt to merge general relativity and quantum mechanics into a single consistent theory.

The origins of loop quantum gravity can be traced back to the 1980s, when Abhay Ashtekar, now at Pennsylvania State University in University Park, rewrote Einstein’s equations of general relativity in a quantum framework. Smolin and Carlo Rovelli of the University of the Mediterranean in Marseille, France, later developed Ashtekar’s ideas and discovered that in the new framework, space is not smooth and continuous but instead comprises indivisible chunks just 10⁻³⁵ metres in diameter. Loop quantum gravity then defines space-time as a network of abstract links that connect these volumes of space, rather like nodes linked on an airline route map. From the start, physicists noticed that these links could wrap around one another to form braid-like structures. Curious as these braids were, however, no one understood their meaning. “We knew about braiding in 1987,” says Smolin, “but we didn’t know if it corresponded to anything physical.”
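
The “airline route map” picture lends itself to a simple data structure. The sketch below is only the connectivity part of that picture; real spin networks also label their links with spin quantum numbers, which this toy omits, and the node names are arbitrary.

```python
from collections import defaultdict

class ToySpinNetwork:
    """Toy connectivity model: nodes are quanta of volume, edges are
    the abstract links between them. Purely illustrative."""

    def __init__(self):
        self.links = defaultdict(set)

    def link(self, a, b):
        """Connect two quanta of space with an undirected link."""
        self.links[a].add(b)
        self.links[b].add(a)

    def neighbours(self, node):
        """Quanta directly linked to the given one."""
        return sorted(self.links[node])

# Build a tiny patch of "space": a triangle with one dangling quantum.
net = ToySpinNetwork()
for a, b in [(0, 1), (1, 2), (2, 0), (2, 3)]:
    net.link(a, b)

print(net.neighbours(2))  # -> [0, 1, 3]
```

In this picture, the geometry of space is nothing but the pattern of who is linked to whom; there are no coordinates anywhere in the structure.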

Enter Sundance Bilson-Thompson, a theoretical particle physicist at the University of Adelaide in South Australia. He knew little about quantum gravity when, in 2004, he began studying an old problem from particle physics. Bilson-Thompson was trying to understand the true nature of what physicists think of as the elementary particles – those with no known sub-components. He was perplexed by the plethora of these particles in the standard model, and began wondering just how elementary they really were. As a first step towards answering this question, he dusted off some models developed in the 1970s that postulated the existence of more fundamental entities called preons.

Just as the nuclei of different elements are built from protons and neutrons, these preon models suggest that electrons, quarks, neutrinos and the like are built from smaller, hypothetical particles that carry electric charge and interact with each other. The models eventually ran into trouble, however, because they predicted that preons would have vastly more energy than the particles they were supposed to be part of. This fatal flaw saw the models abandoned, although not entirely forgotten. Bilson-Thompson took a different tack. Instead of thinking of preons as particles that join together like Lego bricks, he concentrated on how they interact. After all, what we call a particle’s properties are really nothing more than shorthand for the way it interacts with everything around it. Perhaps, he thought, he could work out how preons interact, and from that work out what they are.

To do this, Bilson-Thompson abandoned the idea that preons are point-like particles and theorised that they in fact possess length and width, like ribbons that could somehow interact by wrapping around each other. He supposed that these ribbons could cross over and under each other to form a braid when three preons come together to make a particle. Individual ribbons can also twist clockwise or anticlockwise along their length. Each twist, he imagined, would endow the preon with a charge equivalent to one-third of the charge on an electron, and the sign of the charge depends on the direction of the twist.

The simplest braid possible in Bilson-Thompson’s model looks like a deformed pretzel and corresponds to an electron neutrino (see Graphic). Flip it over in a mirror and you have its antimatter counterpart, the electron anti-neutrino. Add three clockwise twists and you have something that behaves just like an electron; three anticlockwise twists and you have a positron. Bilson-Thompson’s model also produces photons and the W and Z bosons, the particles that carry the electromagnetic and weak forces. In fact, these braided ribbons seem to map out the entire zoo of particles in the standard model. Bilson-Thompson published his work online last year. Despite its achievements, however, he still didn’t know what the preons were, or what his braids were really made from. “I toyed with the idea of them being micro-wormholes, which wrapped round each other. Or some other extreme distortions in the structure of space-time,” he recalls.
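
The charge bookkeeping in that scheme is simple enough to write down. In the sketch below, each ribbon carries an integer twist count and each twist contributes a third of a unit of charge; the sign convention (negative for clockwise) is our own choice, and the up-quark example is our extrapolation from the counting rule rather than something stated above.

```python
from fractions import Fraction

def charge(twists):
    """Electric charge of a three-ribbon braid, in units of e.
    Each twist contributes one third of a unit; sign follows the
    twist direction (convention assumed here: clockwise = -1)."""
    assert len(twists) == 3, "a particle is a braid of three ribbons"
    return Fraction(sum(twists), 3)

neutrino = (0, 0, 0)      # the untwisted braid: charge 0
electron = (-1, -1, -1)   # three twists of one handedness: charge -1
positron = (+1, +1, +1)   # the mirror image: charge +1
up_quark = (+1, +1, 0)    # two twists give +2/3 (our extrapolation)

print(charge(neutrino))   # 0
print(charge(electron))   # -1
print(charge(positron))   # 1
print(charge(up_quark))   # 2/3
```

Using exact fractions rather than floating point keeps the thirds of a charge exact, which is the whole point of the counting rule.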

It was at this point that Smolin stumbled across Bilson-Thompson’s paper. “When we saw this, we got very excited because we had been looking for anything that might explain braiding,” says Smolin. Were the two types of braids one and the same? Are particles nothing more than tangled plaits in space-time?

Smolin invited Bilson-Thompson to Waterloo to help him find out. He also enlisted the help of Fotini Markopoulou at the institute, who had long suspected that the braids in space might be the source of matter and energy. Yet she was also aware that this idea sits uneasily with loop quantum gravity. At every instant, quantum fluctuations rumple the network of space-time links, crinkling it into a jumble of humps and bumps. These structures are so ephemeral that they last for around 10⁻⁴⁴ seconds before morphing into a new configuration. “If the network changes everywhere all the time, how come anything survives?” asks Markopoulou. “Even at the quantum level, I know that a photon or an electron lives for much longer than 10⁻⁴⁴ seconds.”

Markopoulou had already found an answer in a radical variant of loop quantum gravity she had been developing together with David Kribs, an expert in quantum computing at the University of Guelph in Ontario. While traditional computers store information in bits that can take the values 0 or 1, quantum computers use “qubits” that, in principle at least, can be 0 and 1 at the same time, which is what makes quantum computing such a powerful idea. Individual qubits’ delicate duality is always at risk of being lost as a result of interactions with the outside world, but calculations have shown that collections of qubits are far more robust than one might expect, and that the data stored on them can survive all kinds of disturbance.

In Markopoulou and Kribs’s version of loop quantum gravity, they considered the universe as a giant quantum computer, where each quantum of space is replaced by a bit of quantum information. Their calculations showed that the qubits’ resilience would preserve the quantum braids in space-time, explaining how particles could be so long-lived amid the quantum turbulence. Smolin, Markopoulou and Bilson-Thompson have now confirmed that the braiding of this quantum space-time can produce the lightest particles in the standard model – the electron, the “up” and “down” quarks, the electron neutrino and their antimatter partners.

All from nothing at all
So far the new theory reproduces only a few of the features of the standard model, such as the charge of the particles and their “handedness”, a quantity that describes how a particle’s quantum-mechanical spin relates to its direction of travel in space. Even so, Smolin is thrilled with the progress. “After 20 years, it is wonderful to finally make some connection to particle physics that isn’t put in by hand,” he says. The correspondence between braids and particles suggests that more properties may be waiting to be derived from the theory. The most substantial achievement, Smolin says, would be to calculate the masses of the elementary particles from first principles. It is a hugely ambitious goal: predicting the masses and other fundamental constants of nature was something string theorists set out to do more than 20 years ago – and have now all but given up on. As with string theory, devising experiments to test for the new theory will also be difficult. This is a problem that plagues loop quantum gravity in all its guises, because no conceivable experiment can probe space down to 10⁻³⁵ metres.

Ironically, the best arena in which to look for experimental proof might be the largest scales in the universe, not the smallest. “The closest anyone is getting to making predictions is in the area of cosmology,” says John Baez, a mathematician and expert on quantum gravity at the University of California, Riverside. Markopoulou is now trying to think of ways of testing the braid model using the fossil radiation left over from the big bang, the so-called cosmic microwave background that permeates the universe. Physicists believe that the patterns we see today in that radiation may have originated from quantum fluctuations during the earliest moments of the big bang, when all of the matter in the universe was crammed into a space small enough for quantum effects to be significant.

Meanwhile, Markopoulou’s vision of the universe as a giant quantum computer might be more than a useful analogy: it might be true, according to some theorists. If so, there is one startling consequence: space itself might not exist. By replacing loop quantum gravity’s chunks of space with qubits, what used to be a frame of reference – space itself – becomes just a web of information. If the notion of space ceases to have meaning at the smallest scale, Markopoulou says, some of the consequences of that could have been magnified by the expansion that followed the big bang. “My guess is that the non-existence of space has effects that are measurable, if you can only see it right.” It is pretty hard, she adds, to wrap your mind around what it means for there to be no space. Hard indeed, but worth the effort. If this version of loop quantum gravity can reproduce all of the features of the standard model of particle physics and be borne out in experimental tests, we could be onto the best idea since Einstein. “It’s a beautiful idea. It’s a brave, strange idea,” says Rovelli. “And it might just work.”

Of course, most physicists are reserving judgement. Joe Polchinski, a string theorist at the University of California, Santa Barbara, believes that Smolin and his colleagues still have a lot of work to do to show that their braids capture all of the details of the full standard model. “This is in a very preliminary stage. One has to play with it and see where it goes,” Polchinski says. If the new loop quantum gravity does go the distance, though, it could give us a new sense of our place in the universe. If electrons and quarks – and thus atoms and people – are a consequence of the way space-time tangles up on itself, we could be nothing more than a bundle of stubborn dreadlocks in space. Tangled up as we are, we could at least take comfort in knowing at last that we truly are at one with the universe.

Supersizing quantum gravity
For loop quantum gravity to succeed as a fundamental theory of gravity, it should at the very least predict that apples fall to Earth. In other words, Newton’s law of gravity should naturally arise from it. It is a tall order for a theory that generates space and time from scratch to describe what happens in the everyday world, but Carlo Rovelli at the University of the Mediterranean in Marseille, France, and his team have succeeded in doing just that. “Essentially we have calculated Newton’s law starting from a world with no space and no time,” he says.

Newton’s law of gravity describes the attractive force between two masses separated by a given distance. However, it is not so simple to measure this separation when space has a complex quantum architecture of the sort in loop quantum gravity, where it is not even clear what is meant by distance. This has been the biggest obstacle to showing how Newton’s law can emerge from quantised space.

The naive way to measure length in quantised space is to hop from one quantum to another, counting how many steps it takes to reach the final destination. According to loop quantum gravity, however, the fabric of space seethes with quantum fluctuations, so the distance between two points is forever changing, and can even take several values at the same time.
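
The hop-counting idea is, on a fixed graph, just a breadth-first search; the article’s point is that the real graph is never fixed. The sketch below is a toy with an arbitrary made-up graph, useful only to pin down what “counting steps between quanta” would mean if the network held still.

```python
from collections import deque

def hop_distance(links, start, goal):
    """Fewest links to cross from start to goal (breadth-first search).
    Returns None if the two quanta are not connected at all."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for nxt in links[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# An arbitrary toy patch of quantised space, as an adjacency list.
links = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(hop_distance(links, 0, 4))  # -> 3
```

In loop quantum gravity the adjacency list itself fluctuates from instant to instant, so this distance would come out different, or even several values at once, each time it was measured.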

Working with Eugenio Bianchi of the University of Pisa, Leonardo Modesto of the University of Bologna and Simone Speziale of the Perimeter Institute in Waterloo, Ontario, Rovelli circumvented the problem. The team found a mathematical way of isolating regions of space for long enough to measure the separation between two points. When they zoomed out and used this mathematics to look at space-time on much larger scales, they found that Newton’s law popped out of their theory.

The calculation by Rovelli’s team does not yet reproduce the full complexity of Einstein’s general relativity, which also describes masses large enough to curve space appreciably. Their result does point in the right direction, however. Lee Smolin of the Perimeter Institute calls it a major step forward. “Their work shows that loop quantum gravity definitely has gravity in it,” he says. “It’s no longer just pie in the sky.”


Darwin could not have anticipated, for example, the work of physicist Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario. Smolin has utilized Darwinian concepts to shape a theory of the universe that he calls “cosmological natural selection.” He developed a theory that posits the existence of a vast number of unseen universes, each generated by the collapse of a black hole. The conditions of those collapses bestow each universe with its own set of fundamental parameters, such as the masses of its various subatomic particles. Just as life diversified on Earth, the “multiverse” in Smolin’s theory evolved from simple beginnings into a complex and varied assemblage of universes, each exhibiting a distinctive set of traits.

Cosmological natural selection could help to solve one of the main conundrums in physics: the seemingly arbitrary values of the fundamental constants in our universe. Why is a neutron, for example, more massive than a proton rather than the other way around? If a wealth of universes with unique parameters exists, Smolin says, then our own case does not seem so special or so unlikely. In fact, cosmological natural selection specifically favors universes—like ours—in which massive stars can form and give rise to new black holes. “By using Darwinian methodology, I was able to get an explanation for the improbable complexity of our universe,” Smolin says.

BY Lee Smolin

What is space and what is time? This is what the problem of quantum gravity is about. In general relativity, Einstein gave us not only a theory of gravity but a theory of what space and time are — a theory that overthrew the previous Newtonian conception of space and time. The problem of quantum gravity is how to combine the understanding of space and time we have from relativity theory with the quantum theory, which also tells us something essential and deep about nature. If we can do this, we’ll discover a single unified theory of physics that will apply to all phenomena, from the very smallest scales to the universe itself. This theory will, we’re quite sure, require us to conceive of space and time in new ways that take us beyond even what relativity theory has taught us.

But, beyond even this, a quantum theory of gravity must be a theory of cosmology. As such, it must also tell us how to describe the whole universe from the point of view of observers who live in it — for by definition there are no observers outside the universe. This leads directly to the main issues we’re now struggling with, because it seems very difficult to understand how quantum theory could be extended from a description of atoms and molecules to a theory of the whole universe. As Bohr and Heisenberg taught us, quantum theory seems to make sense only when it’s understood to be the description of something small and isolated from its observer — the observer is outside of it. For this reason, the merging of quantum theory and relativity into a single theory must also affect our understanding of the quantum theory. More generally, to solve the problem of quantum gravity we’ll have to invent a good answer to the question: How can we, as observers who live inside the universe, construct a complete and objective description of it?

Most of my work as a scientist has been directed to the problem of quantum gravity. I like working on this problem a great deal, especially as it’s the only area of physics I know of where one is daily confronted by deep philosophical problems while engaged in the usual craft of a theoretical physicist, which is to make calculations to try to extract predictions about nature from our theoretical pictures. Also, I like the fact that one needs to know a lot of different things to think about this problem. For example, it’s likely that quantum gravity may be relevant for understanding the observational data from astronomy, and it’s also likely that the new theory we’re trying to construct will make use of new mathematical ideas and structures that are only now being discovered. So although I’ve worked almost solely on this problem for almost twenty years, I’ve never been bored.

I have days in which I spend the morning working on a calculation, to check an idea I had the night before, and then I’ll go to a lunch seminar, where I hear astronomers discuss the latest evidence for some crucial question, like how much dark matter there is. Then I spend the afternoon studying the paper of a friend who’s a pure mathematician, after which I meet a philosopher for dinner and continue an argument we’re having on the nature of time. And what’s wonderful is the way that these different subjects, which until recently were disconnected from one another, often seem to illuminate one another. Of course, sometimes it’s not so ideal; teaching and bureaucracy take up a lot of time — although in reasonable doses, I must say. I love teaching also. But there are really many days when I feel very fortunate and can’t imagine that I’m being paid to live like this.

For the last eight years or so — really, it doesn’t seem so long! — I’ve been working with several friends on a new approach to combining relativity and quantum theory. We call this approach “nonperturbative quantum gravity.” It’s enabling us to investigate the implications of combining general relativity and quantum theory more deeply and thoroughly than was possible before. We aren’t yet finished, but we’re making progress steadily, and recently we’ve got the theory well enough in hand that we’ve been able to extract some experimental predictions from it. Unfortunately, the predictions we’ve been able to make so far can’t be tested, because they’re about the geometry of space at scales twenty orders of magnitude smaller than an atomic nucleus. But this is further toward a solution to the problem than anyone has gotten before — and, I must say, further than I sometimes expected we’d be able to go in my lifetime.

In this work, we’ve been combining a very beautiful formulation of Einstein’s general theory of relativity discovered by my friend Abhay Ashtekar with some ideas about how to construct a quantum theory of the geometry of space and time in which everything is described in terms of loops. That is, rather than describing the world by saying where each particle is, we describe it in terms of how loops are knotted and linked with one another. This approach to quantum theory was invented by another friend — Carlo Rovelli — and myself, and also by the very interesting Uruguayan physicist Rodolfo Gambini.

The main result of this work is that at the Planck scale, which is twenty powers of ten smaller than an atomic nucleus, space looks like a network or weave of discrete loops. In fact, these loops are something like the atoms out of which space is built. We’re able to predict that — just as the possible energies an atom can have come in discrete units — when one probes the structure of space at this Planck scale, one finds that the possible values the area of a surface or the volume of some region can have also come in discrete units. What seems to be the smooth geometry of space at our scale is just the result of an enormous number of these elementary loops joined and woven together, as an apparently smooth piece of cloth is really made out of many individual threads.

Furthermore, what’s wonderful about the loop picture is that it’s entirely a picture in terms of relations. There’s no preexisting geometry for space, no fixed reference points; everything is dynamic and relational. This is the way Einstein taught us we have to understand the geometry of space and time — as something relational and dynamic, not fixed or given a priori. Using this loop picture, we’ve been able to translate this idea into the quantum theory.

Indeed, for me the most important idea behind the developments of twentieth-century physics and cosmology is that things don’t have intrinsic properties at the fundamental level; all properties are about relations between things. This idea is the basic idea behind Einstein’s general theory of relativity, but it has a longer history; it goes back at least to the seventeenth-century philosopher Leibniz, who opposed Newton’s ideas of space and time because Newton took space and time to exist absolutely, while Leibniz wanted to understand them as arising only as aspects of the relations among things. For me, this fight between those who want the world to be made out of absolute entities and those who want it to be made only out of relations is a key theme in the story of the development of modern physics. Moreover, I’m partial. I think Leibniz and the relationalists were right, and that what’s happening now in science can be understood as their triumph.

Indeed, in the last few years, I’ve also realized that the relational point of view can inspire ideas about other problems in physics and astronomy. These include the basic problem in elementary particle physics, which is accounting for all the masses and charges of the fundamental particles. I’ve come to believe that this problem is connected as well to two other basic questions that people have been wondering about for many years. The first of these is: Why are the laws of physics and the conditions of the universe special in ways that make the universe hospitable for the existence of living things? Closely related to this is the second question: Why, so long after it was formed, is the universe so full of structures? Beyond even the question of life, it’s a remarkable fact that our universe seems, rather than having come to a uniform and boring state of thermal equilibrium, to have evolved to a state in which it’s full of structure and complexity on virtually every scale, from the subnuclear to the cosmological.

The picture that emerges from both relativity and quantum theory is of a world conceived as a network of relations. Newton’s hierarchical picture, in which atoms with fixed and absolute properties move against a fixed background of absolute space and time, is quite dead. This doesn’t mean that atomism or reductionism are wrong, but it means that they must be understood in a more subtle and beautiful way than before. Quantum gravity, as far as we can tell, goes even further in this direction, as our description of the geometry of spacetime as woven together from loops and knots is a beautiful mathematical expression of the idea that the properties of any one part of the world are determined by its relationships and entanglement with the rest of the world.

As we began to develop this picture, I also began to wonder whether the basic philosophy behind it might extend to other aspects of nature, beyond just the description of space and time. More precisely, I began to wonder whether the world as a whole might be understood in a way that was more interrelated and relational than in the usual picture, in which everything is determined by fixed laws of nature. We usually imagine that the laws of nature are fixed, once and for all, by some absolute mathematical principle, and that they govern what goes on by acting at the level of the smallest and most fundamental particles. There are good reasons why we believe that the fundamental forces should act only on the elementary particles. But in particle physics we have been making another assumption as well: that there are mechanisms or principles that pick out which laws are actually expressed in nature, and that these mechanisms or principles also work only at enormously tiny scales, much smaller than the atomic nucleus; an example of such a mechanism is something called “spontaneous symmetry breaking.” Given that the choice of laws makes a great difference for the universe as a whole, it began to seem strange to me that the mechanisms that choose the laws should not somehow be influenced by the overall history or structure of the universe at very large scales. But, for me, the real blow to the idea that the choice of which laws govern nature is determined only by mechanisms acting at the smallest scales came from the dramatic failure of string theory.

Like many of the young people trained in elementary-particle physics in the 1970s and ’80s, I had great hopes for string theory, since it seemed to have the best possible chance of providing a fundamental unified theory. Indeed, I still think there are ideas in string theory that may be right, and its exploration has led to the uncovering of some beautiful and deep mathematics. But as a theory of the elementary particles, it has certainly so far failed, for while it initially seemed that there was only one possible consistent string theory, we now know there are a great many such theories, each apparently as consistent as the others and all leading to different universes. Thus, string theory hasn’t solved the problem of how the world chooses to have the particular collection of particles and forces it does. And whatever the theory’s future, I’ve come to doubt that it ever will.

This crisis led me to wonder whether the search for the principles that determine which laws of nature govern our world could succeed, if we continue to look only at mechanisms that act on very small scales. Instead, I began to ask myself whether there might be mechanisms that could in some way couple the properties of the elementary particles to the properties of the universe created by their interactions — perhaps even on astronomical and cosmological scales. By this I mean nothing mystical. Since the universe has a history, and did apparently pass through a stage when it was very small, there might be some mechanism that coupled the properties of things on the largest scales to the properties of things on the smallest scales. Thus, about five years ago I began to wonder whether there might be some way in which the properties of the elementary particles are chosen by the universe itself, during its evolution. Wondering about this made me notice and take seriously what many people had pointed out previously — that the properties of the elementary particles and the conditions of the universe seem very well chosen for the universe to develop structure and life. It does seem that this is true — that if almost any other set of forces and particles had been chosen, the universe would not only not contain life, it would be much less rich in structure and variety of phenomena than our world is.

Many of the people who’ve noticed this have become advocates of the anthropic principle. This is the idea that the properties of the world have somehow been chosen because of — or at least are explained by — the fact that with this choice intelligent life like us can exist. I’d always resisted this idea, and I still do. The anthropic principle is said to come in two forms, a weak form and a strong form. In its weak form, I think it’s just the observation that the world in which we find ourselves is very special. This doesn’t explain anything, it only points out the need for an explanation of how the world got to be special — an explanation that must be made in terms of some mechanism acting in its past. The strong form — that the laws of physics are somehow chosen in order that life can exist — is, to me, really more religion than science. Indeed, I’m not surprised to find that several advocates of the strong form of the anthropic principle are writing books and papers connecting their belief in the anthropic principle with Christian theology. This is fine, for religion, but it isn’t science. Instead, when I realized that people like Martin Rees and Bernard Carr were right — that the world is very special in ways that seem a priori extremely unlikely — I began to wonder whether there might be some real mechanism, something taking place earlier in the history of the universe, that might explain how the properties of the elementary particles have been selected so that the world has the enormous amount of structure and variety it does.

At this time, I was reading a lot of biology: Richard Dawkins on evolution, Harold Morowitz on self-organization, and James Lovelock and Lynn Margulis on the Gaia idea. And I remember wondering whether, if the earth can be understood as a self-organized system, maybe the same thing was true for larger systems, such as a galaxy or the universe as a whole. This was also summertime, and I was sailing a lot, and I spent a lot of time letting the boat drift and wondering what kind of mechanisms of self-organization might have acted early in the history of the universe to select the properties of the elementary particles and forces in nature. It seemed to me that the only principle powerful enough to explain the high degree of organization of our universe — compared to a universe with the particles and forces chosen randomly — was natural selection itself. The question then became: Could there be any mechanism by which natural selection could work on the scale of the whole universe?

Once I asked the question, an answer appeared very quickly: the properties of the particles and the forces are selected to maximize the number of black holes the universe produces. This idea came right away, because of two ideas I was familiar with from my work on quantum gravity. The first is that inside a black hole, quantum effects remove the singularity that general relativity says is there — and that we know is there from the theorems of Penrose and Hawking — and a new region of the universe begins to expand as if from a big bang, there inside the black hole. I remember Bryce DeWitt, who is one of the great pioneers of quantum gravity, telling me about this idea shortly after I began to work for him, on my first postdoc. The second idea — which comes from John A. Wheeler, another great pioneer of the field — is that at such events the properties of the elementary particles and forces might change randomly. All I then needed to make a mechanism for natural selection was to assume that these changes are small, because reading Dawkins had taught me the importance for natural selection of incremental change by the accumulation of small changes in the gene. Then, with the universes as animals and the properties of the elementary particles as genes, I had a mechanism by which natural selection would act to produce universes with whatever choices of parameters would lead to the greatest production of black holes, since a black hole is the means by which a universe reproduces — that is, spawns another.
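The mechanism described in this paragraph, universes spawning offspring through their black holes while the inherited parameters drift by small random amounts, has the shape of an evolutionary algorithm, and a toy simulation makes the logic concrete. The sketch below is purely illustrative, not the actual proposal: `black_hole_yield` is an arbitrary stand-in for a universe's black-hole production as a function of a single invented parameter `x`, chosen to peak at x = 0, whereas the real quantity would depend on many particle masses and coupling strengths.

```python
import random

random.seed(0)

# Hypothetical stand-in for a universe's black-hole production rate as a
# function of one dimensionless parameter x. Peaked at x = 0 by construction.
def black_hole_yield(x):
    return max(0.0, 1.0 - x * x)

# One generation: each universe spawns offspring in proportion to its
# black-hole yield (fitness-proportional sampling), and each offspring
# inherits its parent's parameter with a small random change, per the
# incremental-mutation picture borrowed from Dawkins.
def step(population, mutation=0.05):
    weights = [black_hole_yield(x) for x in population]
    parents = random.choices(population, weights=weights, k=len(population))
    return [x + random.gauss(0.0, mutation) for x in parents]

# Start with parameters chosen at random, then let selection act.
population = [random.uniform(-1.0, 1.0) for _ in range(200)]
initial_spread = sum(abs(x) for x in population) / len(population)
for _ in range(30):
    population = step(population)
final_spread = sum(abs(x) for x in population) / len(population)
print(round(initial_spread, 2), round(final_spread, 2))
```

After a few dozen generations the population clusters near the peak of the yield curve. That clustering is what motivates the testable consequence drawn in the next paragraphs: a typical surviving universe sits near a maximum, where changing any parameter can only decrease, or at best leave unchanged, the number of black holes it makes.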

This was in 1989. I still don’t know if the idea is right. But what I’m very proud of is that the idea is testable. Most of the ideas proposed in the past few years about why the elementary particles have the properties they do aren’t testable. This is the main reason the field is in such a crisis. But this idea leads to a prediction, which is that if I could change any of the properties of the elementary particles the result should be either to decrease or to leave alone the number of black holes the universe makes. This is because the idea implies that almost every universe, and therefore most likely our own, has parameters that maximize the number of black holes it can make.

When this idea first came to me, I didn’t take its prospects very seriously, and I imagine neither did most of my colleagues. I also didn’t know much astrophysics, and I imagined that it would be an easy matter to test what would happen to the rate of production of black holes if you changed, for example, the mass of one or another sort of elementary particle, or the strength of one of the forces. So to test the idea, I started to learn some astronomy and astrophysics. So far, I haven’t found a way to change the properties of the particles and forces to make a universe that makes more black holes, and I have found several changes that decrease their number. I’ve also brought the question to a number of astrophysicists, who know the field much better than I do. I’ve been very pleased that these people, some of whom I admire very much, were interested enough to spend the time to examine such an unusual idea. They made some interesting suggestions, and although no one was able to propose a change of parameters that clearly leads to the production of more black holes, several interesting possibilities, which I’m studying now, did emerge from these conversations. Certainly, if the idea’s wrong, I’ll be grateful if someone proposes a test that would kill it. I believe more in the general idea that there must be mechanisms of self-organization involved in the selection of the parameters of the laws of nature than I do in this particular mechanism, which is only the first one I was able to invent. But it seems that the situation at present is that there’s much more testing that needs to be done, and lately I’ve been spending more time on this. Perhaps what’s most amazing to me is that after five years this rather improbable idea is still not dead.

Whether it dies or not, I’ve learned enough astronomy to discover something that’s completely changed my view of cosmology. This is that the idea that there are principles of self-organization acting on astronomical scales seems really to be true. During the last ten years or so, people who study galaxies have discovered evidence that feedback effects and mechanisms of self-organization are indeed happening at the level of the galaxies; they are, in fact, essential for galaxies to form stars. They’re also necessary to the existence of spiral galaxies. The idea that a galaxy is a self-organized system — more an ecology than a nonliving clump of stars and gas — has become common among astronomers and physicists who study galaxies.

Thus, it seems to me quite likely that the concepts of self-organization and complexity will play a greater and greater role in astronomy and cosmology. I suspect that as astronomers become more familiar with these ideas, and as those who study complexity take time to think seriously about such cosmological puzzles as galaxy structure and formation, a new kind of astrophysical theory will develop, in which the universe will be seen as a network of self-organized systems.
