“Gravity waves, mysterious waves that ripple unseen throughout the atmosphere, may be a major source of airplane turbulence, a new study suggests. The new findings, presented Tuesday (Dec. 4) here at the annual meeting of the American Geophysical Union, may help explain why planes get shaky in apparently clear skies. Forecasting those waves may allow planes to reroute around them. “Just like waves on the ocean, as they approach a beach, they can amplify and break,” said study researcher Robert Sharman. “Gravity waves in the atmosphere can amplify and break, and we’re finding now that’s a major contributor to turbulence in the atmosphere that affects aircraft.”

Gravity waves form when air traveling up and down in the atmosphere meets resistance. For instance, clouds rising in the troposphere, the lower level of the atmosphere where air mixes freely, will bump up against the boundary of the much more stable stratosphere, forming ripples in the process. These waves can travel up to 180 miles (300 kilometers) before breaking, said Robert Sharman, a meteorologist at the National Center for Atmospheric Research (NCAR), who conducted the study. “They’re waves running around in the atmosphere all the time,” Sharman told LiveScience.

Sharman and his colleagues wanted to understand when and where these waves occur. They collected data from commercial aircraft flight recorders, which record the location, duration and intensity of turbulence. Then they recreated these turbulent events using a computer simulation that models the atmosphere. They found that gravity waves “break” on the surfaces of planes, just like ocean waves breaking on the beach, causing much of the turbulence that occurs out of the blue in clear air. In the past, pilots thought airplanes moving up and down in the jet stream caused such turbulence.

Many of the waves were formed in storm clouds that tracked the jet stream, but traveled miles away and broke in areas where airplanes were flying. Big mountains like the Colorado Rockies often form gravity waves as air flows over the mountains and then overshoots as it reaches the other side. Luckily, gravity waves don’t span a large height in the atmosphere, so it’s pretty easy for airplanes to avoid such waves, Sharman said. “They could either climb over it or go beneath it,” he said. The team is now using their simulations to forecast gravity waves throughout the world. While the forecasts can predict the waves’ occurrence most of the time, they would need to reach about 85 percent accuracy before pilots would use such predictions to avoid choppy air, he said. “Anytime they change course, it costs the airlines fuel. They have to be pretty certain that that forecast is right before they’ll make any deviation,” he said.”

Gravitational Wave Detector
Goddard physicist Babak Saif is part of a team from Stanford University and AOSense, Inc., a Sunnyvale, Calif.-based company, that has received NASA funding to use atom optics to detect theoretically predicted gravitational waves.{photo: NASA | Pat Izzo}

by Jeremy Hsu  /  October 22 2012

“Albert Einstein predicted the existence of gravitational waves that ripple outward from moving celestial objects such as stars or black holes — but such waves are so weak by the time they reach Earth that the planet quivers by less than an atom’s width in response. NASA wants to harness the spooky quantum behavior of atoms to help detect the gravitational waves. The U.S. space agency has funded the possible solution, called atom interferometry, so that it might someday enable a mission consisting of three identical spacecraft flying in a triangle formation, separated by anywhere from 310 miles (500 kilometers) to 3,107 miles (5,000 kilometers). If a gravitational wave swept through the area, the spacecraft interferometers would sense the tiny disturbances. “The NASA funding is basically for a preliminary design study for what a gravitational wave detector would look like,” said Mark Kasevich, a physicist at Stanford University.

The technology would enable scientists to detect gravitational waves related to events such as a black hole or two stars merging in a distant star system. It could also lead to more sensitive sensors for steering U.S. military submarines or aircraft — Kasevich’s Stanford lab has been working on gyroscopes, gravimeters, accelerometers and gravity gradiometers for the U.S. Department of Defense. But for NASA, a gravitational wave detector is “probably a decade away,” Kasevich told TechNewsDaily. An actual space mission would probably take even longer to launch. Normal interferometry — a 200-year-old technique — gets accurate measurements by comparing light that has been split into two equal halves by a beam splitter. Scientists shine one of the beams through something they want to measure, and compare it to the other untouched beam by bouncing both off mirrors to reflect back onto a detector or camera.
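The split-and-compare idea can be sketched numerically. This is a minimal illustration of two-beam interference, not any instrument's actual response; the 1064 nm laser wavelength is an assumed value chosen for the example.

```python
import math

def fringe_intensity(delta_L_m, wavelength_m=1064e-9):
    """Relative brightness at the detector of a two-beam interferometer,
    as a function of the one-way path-length difference between the arms.
    Illustrative only; the wavelength is an assumption, not a spec."""
    # The light makes a round trip, so the path difference is doubled;
    # each extra wavelength of path shifts the phase by 2*pi.
    phase = 2 * math.pi * (2 * delta_L_m) / wavelength_m
    return math.cos(phase / 2) ** 2

# Equal arms: the recombined beams are in phase, full brightness.
print(fringe_intensity(0.0))          # 1.0
# Stretch one arm by a quarter wavelength: the beams cancel.
print(fringe_intensity(1064e-9 / 4))  # ~0.0
```

Any effect that changes one arm's length shows up as a measurable change in brightness, which is why the technique is so sensitive.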

The atomic interferometry funded by NASA’s Innovative Advanced Concepts program takes advantage of quantum mechanics, the physics theory that describes how matter behaves at the tiniest scales. That effort is led by researchers at NASA’s Goddard Space Flight Center in Greenbelt, Md.; Stanford University in California; and AOSense Inc., in Sunnyvale, Calif. Researchers would first fire a laser to slow and cool the atoms down to a frigid temperature near absolute zero (minus 273.15 degrees Celsius), so that the atoms behave like waves rather than particles. Then they would fire more laser pulses that put the atoms into a “superposition of states,” which allows them to exist in multiple states simultaneously. The superposition means a single atom can split into different states that exist independently and go flying off on different trajectories like separate particles, before they recombine at a detector. If an atom’s path is altered even a bit by a passing gravitational wave, the atom interferometer can detect the difference. NASA’s funding does not cover the full spacecraft mission just yet. First, the researchers plan to test the atomic interferometer at a 33-foot drop tower in the basement of a Stanford University physics laboratory — firing lasers at a cloud of falling rubidium atoms to cool them and then put them into their “spooky” quantum states. Successful testing could establish the foundation for making the space version of the technology.”
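A rough sketch of why atom interferometers make such sensitive gravity probes: in the standard textbook treatment of a light-pulse atom interferometer used as a gravimeter, the phase difference accumulated between the two superposed atomic paths in a uniform field is dphi = k_eff * g * T^2. The 780 nm wavelength (the rubidium D2 line used for cooling) and the 0.1-second free-fall time below are assumed values for illustration, not figures from this project.

```python
import math

def atom_phase(g=9.81, T=0.1, wavelength=780e-9):
    """Phase difference between the two paths of a light-pulse atom
    interferometer after free-fall time T in gravitational field g.
    Assumes a two-photon Raman momentum transfer (k_eff = 2 * 2*pi/lambda)."""
    k_eff = 2 * (2 * math.pi / wavelength)  # effective wavevector, rad/m
    return k_eff * g * T ** 2               # dphi = k_eff * g * T^2

print(atom_phase())  # ~1.6 million radians for a 0.1 s drop
```

A phase of over a million radians from a tenth of a second of free fall is what makes tiny perturbations, such as a passing gravitational wave nudging the atoms' trajectories, potentially detectable.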

“This illustration shows the variations in the lunar gravity field as measured by the Gravity Recovery and Interior Laboratory (GRAIL). Red corresponds to mass excesses and blue corresponds to mass deficiencies.” 

GRAIL reveals a battered lunar history
Twin spacecraft create a highly detailed gravity map of the moon, finding an interior pulverized by early impacts.
by Jennifer Chu  /  December 5, 2012

“Beneath its heavily pockmarked surface, the moon’s interior bears remnants of the very early solar system. Unlike Earth, where plate tectonics has essentially erased any trace of the planet’s earliest composition, the moon’s interior has remained relatively undisturbed over billions of years, preserving a record in its rocks of processes that occurred in the solar system’s earliest days. Now scientists at MIT, NASA, the Jet Propulsion Laboratory and elsewhere have found evidence that, beneath its surface, the moon’s crust is almost completely pulverized. The finding suggests that, in its first billion years, the moon — and probably other planets like Earth — may have endured much more fracturing from massive impacts than previously thought. The startling observations come from data collected by NASA’s Gravity Recovery and Interior Laboratory (GRAIL) mission. Since March, the mission’s twin spacecraft, named Ebb and Flow, have been orbiting the moon and measuring its gravitational field.

From GRAIL’s measurements, planetary scientists have now stitched together a high-resolution map of the moon’s gravity — a force created by surface structures such as mountains and craters, as well as deeper structures below the surface. The resulting map reveals an interior gravitational field consistent with an incredibly fractured lunar crust. “It was known that planets were battered by impacts, but nobody had envisioned that the [moon’s] crust was so beaten up,” says MIT’s Maria Zuber, who leads the GRAIL mission and is the E.A. Griswold Professor of Geophysics in the Department of Earth, Atmospheric and Planetary Sciences. “This is a really big surprise, and is going to cause a lot of people to think about what this means for planetary evolution.” Zuber and her colleagues detail their findings from GRAIL in three papers published this week in Science. GRAIL’s lunar gravity map has also revealed numerous structures on the moon’s surface that were unresolved by previous gravity maps of any planet, including volcanic landforms, impact basin rings, and many simple, bowl-shaped craters. From GRAIL’s measurements, scientists have determined that the moon’s crust, ranging in thickness from 34 to 43 kilometers, is much thinner than planetary geologists had previously suspected. The crust beneath some major basins is nearly nonexistent, indicating that early impacts may have excavated the lunar mantle, providing a window into the interior.

Lifting a veil
To generate the gravity map, GRAIL’s two probes measure the changing distance between themselves as they orbit in tight formation around the moon. As one of the probes flies over a large mass, such as a mountain or dense, underground rock, the stronger local gravity will pull that probe ahead, widening the space between the two spacecraft. Scientists can translate this changing distance into a gravitational map, representing the gravity produced by both the surface structures and the interior. To find the gravitational field for the moon’s interior alone, Zuber’s team used topographic measurements from another of their instruments, a laser altimeter aboard the Lunar Reconnaissance Orbiter, a separate spacecraft in orbit around the moon. The scientists calculated the gravitational field expected to be produced by the moon’s topography — its surface structures alone — then subtracted that field from the field measured by GRAIL. “It’s essentially like removing a veil to reveal the gravity due to the inside of the planet,” Zuber says. “And when we saw those maps, we were just speechless.”
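The "veil removal" step is, at heart, a subtraction: forward-model the gravity expected from the surface topography alone, then subtract it from the measured field to leave the interior's contribution. A toy version, with invented numbers (real GRAIL processing works with spherical-harmonic gravity fields, not four-element arrays):

```python
import numpy as np

# Toy version of the GRAIL analysis: subtract the gravity signal predicted
# from surface topography (via altimetry) from the total measured signal,
# leaving the anomaly due to the interior alone. All values are made up.
measured = np.array([10.0, 12.0, 9.5, 11.0])  # total gravity anomaly (mGal)
from_topo = np.array([9.8, 11.9, 9.6, 10.7])  # forward-modeled from topography
interior = measured - from_topo               # "Bouguer-like" residual

print(interior)
# The residual varies far less than the total field, echoing the team's
# finding that the interior map looked extraordinarily smooth:
print(np.std(interior) < np.std(measured))
```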

Compared to the surface, the map of the interior looked extraordinarily smooth. In fact, the team found that most of the moon’s local gravity is due to surface features, such as crater rims and mountains. Except for the large impact basins, the moon’s upper crust largely lacks dense rock structures and is instead likely made of porous, pulverized material. The interior map did reveal long, linear structures of denser material, which Zuber and her team believe to be buried lunar dikes — formed from magma that seeped into large fractures in the crust, and then solidified into dense walls of rock. These dikes represent evidence for expansion of the moon in its earliest history. But overall, 98 percent of the lunar crust is fragmented — a clear remnant of very early, very massive impacts. “This is interesting for the moon,” Zuber says. “But what it also means is that every other planet was being bombarded like this.” The resulting fractures, she says, affect the way a planetary body loses heat and also provide a pathway for the transport of interior fluids. David Kring, a senior staff scientist at the Lunar and Planetary Institute in Houston, says knowing the extent of pulverization in the moon’s crust is an essential detail needed to determine the moon’s bulk composition. Such information would go a long way toward identifying the processes that formed the moon and other planets. “The staggering quality of the data reported by Professor Zuber and her colleagues is amazing,” says Kring, who was not involved in the research. “The data are exciting because they foretell far more insights than are captured in these initial three papers.”

Perfect times two
In addition to GRAIL’s discoveries, Zuber says another major accomplishment has been the performance of the spacecraft themselves. To achieve the mission’s science goals, the two probes, which can travel more than 200 kilometers apart, needed to be able to measure changes in the distance between them to within a few tenths of a micron per second. But GRAIL actually outperformed its measurement requirements by about a factor of five, resolving changes in spacecraft distance to several hundredths of a micron per second — one twenty-thousandth the velocity that a snail travels. “On this mission, with two spacecraft, everything had to go perfectly twice,” Zuber says, adding proudly: “Imagine you’re a parent raising twins, and your children sit down at the piano and play a duet perfectly. That’s how it feels.”
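As a back-of-the-envelope check on the snail comparison (assuming a garden snail moves at roughly 1 millimeter per second, which is an assumption, not a figure from the article):

```python
# GRAIL resolved relative velocity changes described as one twenty-
# thousandth of a snail's pace. With a ~1 mm/s snail, that works out to
# a few hundredths of a micron per second, matching the text.
snail_m_per_s = 1e-3                  # assumed snail speed, meters/second
resolution = snail_m_per_s / 20_000   # GRAIL's resolution, meters/second
print(resolution * 1e6)               # in microns/second
```

That comes out to 0.05 micron per second, consistent with the "several hundredths of a micron per second" quoted above.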

Gravitational wave detectors to get major upgrade
by David Shiga  /  02 April 2008

“The LIGO project to detect gravitational waves has been given the green light to begin a major upgrade of its detectors. When the upgrade is completed in 2014, the project may be sensitive enough to detect gravitational waves – which have yet to be observed – as often as once a week. Gravitational waves are ripples in the fabric of space predicted by Einstein’s general theory of relativity. They are triggered by the motion of massive objects. “[With the upgrade], either we’ll see a signal or Einstein’s general theory of relativity will be wrong,” says LIGO director Jay Marx of Caltech in Pasadena, US. The ability to listen to gravitational waves would also open up a completely new window for astronomers to observe the universe, allowing them to witness violent events like the collision of pairs of black holes or neutron stars, and even hear the primeval groaning of the universe as it expanded during its earliest moments. LIGO (the Laser Interferometer Gravitational-Wave Observatory) uses two gravitational wave detectors in the US – one in Livingston, Louisiana, and the other at the Hanford nuclear facility near Richland, Washington. Using lasers, the detectors look for slight changes in the length of tunnels several kilometres long that would occur with the passage of a gravitational wave.

Powerful lasers
Although LIGO is the world’s most sensitive gravitational wave project, scientists estimate that currently it has a chance of only a few percent per year of detecting a source for the waves. Scientists just have to wait and hope that a violent enough event will occur close enough to the Earth to be noticeable. To improve the situation, scientists have been planning a major upgrade called Advanced LIGO. Now, the project has been given approval to begin the upgrades, which should be finished in 2014. The US National Science Foundation’s governing board gave the go-ahead for the $205 million upgrade at a meeting on 27 March. The upgrade will involve replacing the existing 10-watt lasers with 180-watt versions, among other improvements. Put together, the improvements mean Advanced LIGO will be 10 times more sensitive in the frequency range it currently monitors. It will also be able to detect waves at much lower frequencies, down to 10 Hz, compared to its current lower limit of around 40 Hz. The improvements mean Advanced LIGO will be able to detect sources 10 times farther from Earth than it can now, increasing the volume of space it will probe by a factor of 1000. “With Advanced LIGO, we think we’ll be seeing gravitational waves from sources maybe once a week,” Marx told New Scientist. “Advanced LIGO really opens the door to a new form of astronomy.”
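The jump from a factor of 10 in sensitivity to a factor of 1000 in detections follows from simple geometry: a gravitational wave's amplitude falls off as 1/distance, so 10 times the sensitivity means 10 times the reach, and the volume of space surveyed grows as the cube of that reach.

```python
# Sensitivity-to-volume scaling for Advanced LIGO, as described above.
sensitivity_gain = 10
range_gain = sensitivity_gain   # strain amplitude falls as 1/distance
volume_gain = range_gain ** 3   # surveyed volume grows as distance cubed
print(volume_gain)              # 1000
```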


Astrophysicist Replaces Supercomputer with Eight PlayStation 3s
by Bryan Gardiner  /  10.17.07

“Suffering from its exorbitant price point and a dearth of titles, Sony’s PlayStation 3 isn’t exactly the most popular gaming platform on the block. But while the console flounders in the commercial space, the PS3 may be finding a new calling in the realm of science and research. Right now, a cluster of eight interlinked PS3s is busy solving a celestial mystery involving gravitational waves and what happens when a super-massive black hole, about a million times the mass of our own sun, swallows up a star. As the architect of this research, Dr. Gaurav Khanna is employing his so-called “gravity grid” of PS3s to help measure these theoretical gravity waves — ripples in space-time that travel at the speed of light — that Einstein’s Theory of Relativity predicted would emerge when such an event takes place.

It turns out that the PS3 is ideal for doing precisely the kind of heavy computational lifting Khanna requires for his project, and the fact that it’s a relatively open platform makes programming scientific applications feasible. “The interest in the PS3 really was for two main reasons,” explains Khanna, an assistant professor at the University of Massachusetts Dartmouth, who specializes in computational astrophysics. “One of those is that Sony did this remarkable thing of making the PS3 an open platform, so you can in fact run Linux on it and it doesn’t control what you do.” He also says that the console’s Cell processor, co-developed by Sony, IBM and Toshiba, can deliver massive amounts of power, comparable even to that of a supercomputer — if you know how to optimize code and have a few extra consoles lying around that you can string together. “The PS3/Linux combination offers a very attractive cost-performance solution whether the PS3s are distributed (like Sony and Stanford’s Folding@home initiative) or clustered together (like Khanna’s),” says Sony’s senior development manager of research and development, Noam Rimon.

According to Rimon, the Cell processor was designed as a parallel processing device, so he’s not all that surprised the research community has embraced it. “It has a general purpose processor, as well as eight additional processing cores, each of which has two processing pipelines and can process multiple numbers, all at the same time,” Rimon says. This is precisely what Khanna needed. Prior to obtaining his PS3s, Khanna relied on grants from the National Science Foundation (NSF) to use various supercomputing sites spread across the United States. “Typically I’d use a couple hundred processors — going up to 500 — to do these same types of things.” However, each of those supercomputer runs cost Khanna as much as $5,000 in grant money. Eight 60 GB PS3s would cost just $3,200, by contrast, but Khanna figured he would have a hard time convincing the NSF to give him a grant to buy game consoles, even if the overall price tag was lower. So after tweaking his code this past summer so that it could take advantage of the Cell’s unique architecture, Khanna set about petitioning Sony for some help in the form of free PS3s. “Once I was able to get to the point that I had this kind of performance from a single PS3, I think that’s when Sony started paying attention,” Khanna says of his optimized code. Khanna says that his gravity grid has been up and running for a little over a month now and that, crudely speaking, his eight consoles are equal to about 200 of the supercomputing nodes he used to rely on. “Basically, it’s almost like a replacement,” he says. “I don’t have to use that supercomputer anymore, which is a good thing.”

“For the same amount of money — well, I didn’t pay for it, but even if you look into the amount of funding that would go into buying something like eight PS3s — for the same amount of money I can do these runs indefinitely.” The point of the simulations Khanna and his team at UMass are running on the cluster is to see if gravitational waves, which have been postulated for almost 100 years but have never been observed, are strong enough that we could actually observe them one day. Indeed, with NASA and other agencies building some very big gravitational wave observatories with the sensitivity to be able to detect these waves, Khanna sees his work as complementary to such endeavors. Khanna expects to publish the results of his research in the next few months. So while PS3 owners continue to wait for a fuller range of PS3 titles and lower prices, at least they’ll have some reading material to pass the time.”

Gaurav Khanna, Ph.D.
email : gkhanna [at] umassd [dot] edu



“The Sony PlayStation 3 has a number of unique features that make it particularly suited for scientific computation. To start with, the PS3 is an open platform, which essentially means that one can run different system software on it, for example, PowerPC Linux. Next, it has a revolutionary processor called the Cell processor, which was developed by Sony, IBM and Toshiba. This processor has a main CPU, called the PPU, and several (six for the PS3) special compute engines, called SPUs, available for raw computation. Moreover, each SPU performs vector operations, meaning it can operate on multiple data elements in a single step. Finally, its incredibly low cost makes it very attractive as a scientific computing node as part of a cluster. In fact, it’s highly plausible that the raw computing power per dollar that the PS3 offers is significantly higher than anything else on the market today!

Thanks to a very generous partial donation by Sony, we have a sixteen-PS3 cluster in our department, which we call the PS3 Gravity Grid. Check out some pictures of the cluster here: 1) the PS3s arrive; 2) the rack arrives; 3) front view of the cluster; 4) side view of the cluster. We are using “stock” PS3s for this cluster, with no hardware modifications. They are networked together using an inexpensive Netgear gigabit switch. For Linux installation, there are several guides available on the internet. For YDL Linux, consider using the guide by Terrasoft Solutions. For Fedora Core 5/6, I found this guide particularly useful. For deploying a parallel job on this cluster, we use a code that implements a standard domain-decomposition approach, based on message passing (MPI). There are more details available on our code below. For compiling, we use GCC and also IBM’s XL compilers for the Cell, which are available as part of IBM’s Cell SDK from IBM’s alphaWorks site. The MPI distribution that we are using is the recently released OpenMPI distribution for PowerPC Linux.
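The domain-decomposition pattern used on the cluster can be illustrated without any actual MPI: split the computational domain into per-node chunks, and before each update let every chunk exchange its boundary values ("ghost" or "halo" cells) with its neighbors. In the sketch below, plain array chunks stand in for the sixteen PS3s and direct array reads stand in for the MPI messages; all sizes are invented.

```python
import numpy as np

# Serial sketch of domain decomposition with halo exchange. Each of the
# 16 "nodes" owns a contiguous chunk of a 1-D domain.
NODES, LOCAL = 16, 8
domain = np.arange(NODES * LOCAL, dtype=float)
chunks = [domain[i * LOCAL:(i + 1) * LOCAL].copy() for i in range(NODES)]

def halo_exchange(chunks):
    """Return per-chunk arrays padded with one ghost cell from each
    neighbor (zero at the outer boundaries). On a real cluster, these
    reads would be MPI send/receive pairs between nodes."""
    padded = []
    for i, c in enumerate(chunks):
        left = chunks[i - 1][-1] if i > 0 else 0.0
        right = chunks[i + 1][0] if i < len(chunks) - 1 else 0.0
        padded.append(np.concatenate(([left], c, [right])))
    return padded

padded = halo_exchange(chunks)
# Chunk 1 now sees the last cell of chunk 0 and the first cell of chunk 2:
print(padded[1][0], padded[1][-1])
```

With the halos in place, each node can apply a finite-difference stencil to its interior cells independently, which is what lets the whole cluster advance the computation in parallel.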

* Binary Black Hole Coalescence using Perturbation Theory (GK) This project broadly deals with estimating properties of the gravity waves produced by the merger of two black holes. Gravitational waves are “ripples” in space-time that travel at the speed of light. These were theoretically predicted by Einstein’s general relativity, but have never been directly observed. Currently, there is an extensive search being performed for these waves by the newly constructed NSF LIGO laboratory and various other such observatories in Europe and Asia. The ESA and NASA also have a mission planned for the near future, the LISA mission, which will attempt to detect these waves. To learn more about these waves and the recent attempts to observe them, please visit the LISA mission website.

The evolution code for the extreme-mass-ratio limit of this problem (referred to as EMRI) is essentially an inhomogeneous wave-equation solver that includes a very complicated source term. The source term describes how the smaller black hole (or star) affects the space-time of the larger one. Because of its computational complexity, the source term is often the most numerically intensive part of the whole evolution. On the PS3’s Cell processor, it is precisely this part of the computation that is farmed out to the six SPUs. This approach essentially eliminates the time spent on the source computation and yields a speed-up of more than a factor of five over a PPU-only computation. Note that this figure is for double-precision floating-point operations; in single precision, the speed-up is significantly higher. Overall, a single PS3 performs better than the highest-end desktops available and compares to as many as 25 nodes of an IBM Blue Gene supercomputer. And there is still tremendous scope left for extracting more performance through further optimization. Furthermore, we distribute the entire computational domain across the sixteen PS3s using MPI (message-passing) parallelization. This enables the entire cluster to run together, harmoniously, working on the computation in an efficient way. Each PS3 works on its part of the domain and communicates the appropriate data to the others, as needed.”
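The basic structure of an inhomogeneous wave-equation solver of this kind can be sketched in a few lines: a standard finite-difference update for the wave operator, plus a source term added at each step. The real EMRI code evolves a far more complicated equation with a far more expensive source; the grid size, toy source, and parameters below are invented purely for illustration.

```python
import numpy as np

# 1-D inhomogeneous wave equation u_tt = c^2 u_xx + S(x,t), evolved with
# a leapfrog finite-difference scheme. Boundaries are clamped at zero.
N, c, dx = 200, 1.0, 1.0
dt = 0.5 * dx / c            # respects the CFL stability limit (c*dt/dx <= 1)
u_prev = np.zeros(N)
u = np.zeros(N)

def source(step):
    """Toy localized, oscillating source term (the expensive part that the
    real code offloads to the SPUs)."""
    s = np.zeros(N)
    s[N // 2] = np.sin(0.1 * step)
    return s

for step in range(500):
    lap = np.zeros(N)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]        # discrete u_xx
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap + dt ** 2 * source(step)
    u_prev, u = u, u_next

print(float(np.max(np.abs(u))))  # amplitude of waves radiated by the source
```

In the cluster version described above, the source evaluation at each step is the piece farmed out to the SPUs, while the spatial domain itself is split across the sixteen consoles.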

“This illustration shows the gravitational waves thought to be produced by two orbiting white dwarf stars in a binary system called J0651, according to an August 2012 study.”

Black Holes Collide, and Gravity Quivers
by Kenneth Chang  /  May 2, 2006

“In the most precise effort yet to detect gravitational waves — the quiverings of space-time predicted by Einstein’s theory of general relativity — the National Science Foundation in the late 1990’s carved two large V’s, one in the barren landscape of central Washington State, the other among the pines outside Baton Rouge, La. The tunnels are part of the Laser Interferometer Gravitational-Wave Observatory, known as LIGO. If something astronomically violent, like a collision of two black holes, shakes the fabric of the universe within 300 million light-years of Earth, an expanse that encompasses several thousand galaxies, LIGO should see the resulting gravitational ripples. The observatory is sensitive enough to detect a change of less than one ten-quadrillionth of an inch, or about a thousandth of the diameter of a proton, in the length of the 2.5-mile-long tunnels. After several years of testing and fine-tuning — special dampers had to be installed at the Louisiana site to counteract vibrations generated when nearby loggers cut down trees, for instance — the observatory began full operation in November. The centers cost nearly $300 million to build and $30 million a year to operate.
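The quoted sensitivity figures are mutually consistent, as a quick unit conversion shows (the 1.7 femtometer proton diameter below is an assumed round value):

```python
# Check LIGO's quoted sensitivity: one ten-quadrillionth of an inch over a
# 2.5-mile arm, compared against the diameter of a proton.
inch = 0.0254               # meters per inch
arm = 2.5 * 1609.34         # 2.5 miles, in meters
delta_L = 1e-16 * inch      # one ten-quadrillionth of an inch, in meters
proton = 1.7e-15            # approximate proton diameter, meters (assumed)

print(delta_L / proton)     # roughly a thousandth of a proton's diameter
print(delta_L / arm)        # the dimensionless strain, around 6e-22
```

Both ratios line up with the article's description: the length change is about a thousandth of a proton's width, corresponding to a fractional strain of order 10^-22.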

The data so far, reported last week at a meeting of the American Physical Society in Dallas, contain nothing of interest. In fact, scientists would not be surprised if the initial run of the experiment over the next year or so found nothing at all. “I would still sleep well about general relativity,” said Peter R. Saulson, a physics professor at Syracuse and an observatory spokesman. Jay Marx, LIGO’s executive director, estimated that the chance of success was “25 percent, if nature’s kind.” General relativity, formulated 90 years ago by Einstein to explain the properties of space and time, fits well with measurements of gravity in and around the solar system. But predictions about what happens around black holes and other places where gravity is extremely strong remain largely untested. One of the predictions is that in such conditions, sizable gravitational waves will be produced. With new research, scientists have a better idea of what LIGO should look for. Researchers led by Joan M. Centrella, chief of the Gravitational Astrophysics Laboratory at NASA’s Goddard Space Flight Center, announced last month that they had succeeded in calculating the shape of the gravitational waves that should result when two black holes, orbiting one another, merge. “This is not something made up like in a science fiction movie,” Dr. Centrella said in a news conference announcing the findings. “Rather, we have confidence that these results are the real deal, that we have the true gravitational wave fingerprint predicted by Einstein for the black hole merger.”

The equations of general relativity can be easily written down but are notoriously hard to solve. Astrophysicists were able to simulate the head-on collision of two black holes three decades ago, but computing the paths of orbiting black holes and their violent merger proved much harder. “This has been a holy grail type of quest for the last 30 years,” Dr. Centrella said. Dr. Centrella’s simulations still contain some simplifications that do not reflect attributes of actual black hole pairs: the two black holes have the same mass, and neither is spinning. The calculations predicted, for example, that 4 percent of the mass of the black holes should be converted into gravitational waves. “That’s a very important number,” Dr. Saulson said. “That tells us that these gravitational waves are going to be about as strong as we hoped they could be.” He added, “And that’s got those of us working on the detectors very excited, making it seem more likely we’ll bump into something.” Einstein’s theory of general relativity changed the idea of gravity from a simple force dragging apples from a tree to a puzzle of geometry. Imagine a rubber sheet pulled taut horizontally; toss a bowling ball and a tennis ball onto it. The heavier bowling ball sinks deeper, and the tennis ball moves toward the bowling ball not because of a direct attraction between the two, but because the tennis ball rolls into the depression around the bowling ball. In this two-dimensional analogy of space-time, one can also imagine a sudden collision of objects creating ripples that skitter across the sheet. Those are the gravitational waves LIGO hopes to detect.

At each site, a laser beam generated at the base of the V is split in two and shot through tunnels buried along each 2.5-mile-long arm. The light bounces back and forth in the two tunnels. When a gravitational wave speeds past, it should stretch and shrink the distance that the laser beams travel, causing the laser light to flicker into a detector at the base of the V. Because the instruments are susceptible to tiny disturbances, only signals seen by both LIGO detectors, nearly 2,000 miles apart, would likely be convincing to scientists. The skepticism about whether LIGO will actually spot gravitational waves comes not from questions about general relativity — “People would be incredibly surprised if it wasn’t right,” Dr. Marx said — but uncertainty about how often events that create gravitational waves occur in the universe. Pairs of orbiting black holes should be the end result of star systems consisting of two massive stars. Over time, the black holes would spiral inward and eventually collide. Astronomers can see plenty of pairs of massive stars twirling in the sky, but they cannot be sure that they ultimately collapse into pairs of black holes. Because astrophysicists do not fully understand how stars age, “There are multiple factors of uncertainty,” said Vassiliki Kalogera, a professor of physics and astronomy at Northwestern University. “We don’t know that binary black holes exist.”

At the optimistic end, her calculations suggest that LIGO could detect up to 10 black hole mergers a year. But the calculations are still uncertain by a factor of 100, which means that at the pessimistic end, the rate of detectable black hole mergers may be just one every 50 years or so. A more common event is the merger of neutron stars, the dense, burned-out cores left over by some exploding stars. The most convincing evidence so far for gravitational waves was the observation in 1974 by two Princeton physicists, Joseph H. Taylor and his student Russell A. Hulse. They saw a pulsating neutron star, a pulsar, in a slowly decaying orbit around a companion neutron star. The amount of energy lost in the decaying orbit turned out to match the amount of energy expected to be emitted in gravitational waves. However, the gravitational waves produced by orbiting neutron stars are too weak to be detected by LIGO. And even when the neutron stars slam into each other, the cataclysm is not nearly as violent as the merger of black holes, so a neutron star collision would have to occur much closer in order for LIGO to see it. Dr. Kalogera’s calculations suggest that the observatory will see a neutron star merger once every seven or eight years, at best. For LIGO to detect gravitational waves routinely, the instruments will need a proposed $200 million upgrade, which includes more powerful lasers, to increase their sensitivity by a factor of 10, Dr. Marx said.

Astronomers hope that LIGO and its successors, as well as similar detectors in Europe and Japan, will become a new type of telescope. If the detection of gravitational waves becomes common, astronomers should be able to deduce many physical properties of black holes and neutron stars. They may also find that such objects are more common in certain types of galaxies. The upgraded observatory may also be able to detect gravitational waves produced by exploding stars or even reverberations of the Big Bang 13.7 billion years ago.

Sometime in the next decade, NASA and the European Space Agency hope to launch a space-based gravitational wave detector called the Laser Interferometer Space Antenna, or LISA. Consisting of three satellites flying around the sun in the formation of an equilateral triangle with sides 3.1 million miles long, LISA would be able to detect gravitational waves with much larger wavelengths, like those produced when mega-black holes at the centers of galaxies merge.

For now, the scientists await their first gravitational wave. “We are all hoping we are lucky,” said Gabriela González, a physics professor at Louisiana State University and a LIGO scientist. “Even if we are not, we will know more about nature.”

“The gravity waves of this story should not be confused with the gravitational waves of astrophysics. One is an ordinary wave of water or air; the other is a ripple in the fabric of spacetime itself.”

Gravity Waves Make Tornadoes  /  Mar 20, 2008

“Did you know that there’s a new breakfast food that helps meteorologists predict severe storms? Down South they call it “GrITs.” GrITs stands for Gravity wave Interactions with Tornadoes. “It’s a computer model I developed to study how atmospheric gravity waves interact with severe storms,” says research meteorologist Tim Coleman of the National Space Science and Technology Center in Huntsville, Alabama. According to Coleman, wave-storm interactions are very important. If a gravity wave hits a rotating thunderstorm, it can sometimes spin that storm up into a tornado.

What is an atmospheric gravity wave? Coleman explains: “They are similar to waves on the surface of the ocean, but they roll through the air instead of the water. Gravity is what keeps them going. If you push water up and then it plops back down, it creates waves. It’s the same with air.”

Coleman left his job as a TV weather anchor in Birmingham to work on his Ph.D. in atmospheric science at the University of Alabama in Huntsville. “I’m having fun,” he says, but his smile and enthusiasm already gave that away. “You can see gravity waves everywhere,” he continues. “When I drove in to work this morning, I saw some waves in the clouds. I even think about wave dynamics on the water when I go fishing now.”

Gravity waves get started when an impulse disturbs the atmosphere. An impulse could be, for instance, a wind shear, a thunderstorm updraft, or a sudden change in the jet stream. Gravity waves go billowing out from these disturbances like ripples around a rock thrown in a pond.

When a gravity wave bears down on a rotating thunderstorm, it compresses the storm. This, in turn, causes the storm to spin faster. To understand why, Coleman describes an ice skater spinning with her arms held straight out. “Her spin increases when she pulls her arms inward.” Ditto for spinning storms: When they are compressed by gravity waves, they spin faster to conserve angular momentum. “There is also wind shear in a gravity wave, and the storm can take that wind shear and tilt it and make even more spin. All of these factors may increase storm rotation, making it more powerful and more likely to produce a tornado.”

“We’ve also seen at least one case of a tornado already on the ground (in Birmingham, Alabama, on April 8, 1998) which may have become more intense as it interacted with a gravity wave.” Coleman also points out that gravity waves sometimes come in sets, and with each passing wave, sometimes the tornado or rotating storm will grow stronger.

Tim and his boss, Dr. Kevin Knupp, are beginning the process of training National Weather Service and TV meteorologists to look for gravity waves in real time, and to use the theories behind the GrITs model to modify forecasts accordingly. Who would have thought grits could predict bad weather? “Just us meteorologists in Alabama,” laughs Coleman. Seriously, though, Gravity wave Interactions with Tornadoes could be the next big thing in severe storm forecasting.”
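The skater analogy is ordinary conservation of angular momentum. Treating the storm's rotating core crudely as a uniform spinning column (an idealization for this sketch, not a claim about the GrITs model), squeezing a column of radius r1 down to r2 multiplies its angular speed by (r1/r2)²:

```python
def spin_after_compression(omega1, r1, r2):
    """Angular speed after a rotating column of radius r1 is squeezed to r2.
    Angular momentum I * omega is conserved, and for a uniform column
    the moment of inertia I scales as r**2, so omega2 = omega1 * (r1/r2)**2."""
    return omega1 * (r1 / r2) ** 2

# Example: compressing the rotating core to 80% of its radius
# boosts the spin rate by a factor of (1/0.8)**2 = 1.5625.
omega1 = 1.0  # arbitrary units
omega2 = spin_after_compression(omega1, r1=1.0, r2=0.8)
print(f"spin-up factor: {omega2 / omega1:.2f}")
```

Even a modest compression from a passing wave therefore gives a noticeable spin-up, consistent with Coleman's description of how wave passage can strengthen storm rotation.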

Sergei Kopeikin

MU Physicist Defends Einstein’s Theory And Speed Of Gravity Measurement  /  Oct 04, 2007

“Scientists have attempted to disprove Albert Einstein’s theory of general relativity for the better part of a century. After testing and confirming Einstein’s prediction in 2002 that gravity moves at the speed of light, a professor at the University of Missouri-Columbia has spent the past five years defending the result, as well as his own innovative experimental techniques for measuring the speed of propagation of the tiny ripples of space-time known as gravitational waves.

Sergei Kopeikin, associate professor of physics and astronomy in the College of Arts and Science, believes that his latest article, “Gravimagnetism, causality, and aberration of gravity in the gravitational light-ray deflection experiments,” published with Edward Fomalont of the National Radio Astronomy Observatory, arrives at a consensus in the continuing debate that has divided the scientific community. An experiment conducted by Fomalont and Kopeikin five years ago found that the gravity force of Jupiter and light travel at the same speed, validating Einstein’s suggestion that gravity and electromagnetic fields are governed by the same principle of special relativity, with a single fundamental speed. In observing the gravitational deflection of light caused by the motion of Jupiter in space, Kopeikin concluded that mass currents cause non-stationary gravimagnetic fields to form, in accordance with Einstein’s point of view. The research paper that discusses the gravimagnetic field appears in the October edition of the journal General Relativity and Gravitation.

Einstein believed that in order to measure any property of gravity, one has to use test particles. “By observing the motion of the particles under influence of the gravity force, one can then extract properties of the gravitational field,” Kopeikin said. “Particles without mass – such as photons – are particularly useful because they always propagate with the constant speed of light, irrespective of the reference frame used for observations.”

The property of gravity tested in the experiment with Jupiter is also called causality. Causality denotes the relationship between one event (the cause) and another event (the effect) that is the consequence of the first. In the case of the speed-of-gravity experiment, the cause is the gravitational perturbation of a photon by Jupiter, and the effect is the detection of this perturbation by an observer. The two events are separated by a certain interval of time, which can be measured as Jupiter moves and compared with an independently measured interval of time taken by the photon to propagate from Jupiter to the observer. The experiment found that the two intervals of time, for gravity and for light, coincide to within 20 percent. Therefore, the gravitational field cannot act faster than light propagates.

Other physicists argue that the Fomalont-Kopeikin experiment measured nothing but the speed of light. “This point of view stems from the belief that the time-dependent perturbation of the gravitational field of a uniformly moving Jupiter is too small to detect,” Kopeikin said. “However, our research article clearly demonstrates that this belief is based on insufficient mathematical exploration of the rich nature of the Einstein field equations and a misunderstanding of the physical laws of interaction of light and gravity in curved space-time.”
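The quoted 20 percent agreement between the two time intervals can be read as a bound on how much the speed of gravity could differ from the speed of light. The sketch below shows only that inference, not the experiment's actual analysis: since travel time is inversely proportional to speed, a fractional agreement f in the times brackets the speed ratio between 1/(1+f) and 1/(1−f).

```python
def speed_ratio_bounds(fractional_agreement):
    """Bracket (speed of gravity / speed of light) given that the two
    measured travel times agree to within the stated fraction.
    Travel time scales as 1/speed, so the bounds invert (1 +/- f)."""
    lo = 1.0 / (1.0 + fractional_agreement)
    hi = 1.0 / (1.0 - fractional_agreement)
    return lo, hi

lo, hi = speed_ratio_bounds(0.20)
print(f"c_gravity / c_light lies between {lo:.2f} and {hi:.2f}")
```

With the 20 percent figure from the article, the ratio is pinned between roughly 0.83 and 1.25, consistent with gravity propagating at the speed of light.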
