STARS HAVE WEATHER

http://www.windows.ucar.edu/spaceweather/
http://www.spaceweather.com/
http://www.spaceweathercenter.org/
http://www.agu.org/journals/spaceweather/
http://space.rice.edu/ISTP/
http://cdaweb.gsfc.nasa.gov/

CLOUDY with CHANCE of STORMS
http://space.newscientist.com/article.ns?id=dn12134&feedId=online-news_rss20
Weather observed on a star for the first time
by Jeff Hecht  /  25 June 2007

Weather – caused by the same forces as the weather on Earth – has been seen on a star for the first time, reveal observations of mercury clouds on a star called Alpha Andromedae. Previously, astronomers had thought that any structures on stars were caused by magnetic fields. Sunspots, for example, are relatively cool regions on the Sun where strong magnetic fields prevent energy from flowing outwards. But now, seven years of painstaking observations of Alpha Andromedae show that stars do not need magnetic fields to form clouds after all. Lying about 100 light years away, Alpha Andromedae is one of a class of stars unusually rich in mercury and manganese. Earlier observations of similar stars had revealed uneven distributions of mercury, but all of those stars had strong magnetic fields. These relatively massive stars do not mix gases in their atmospheres, as less massive stars like the Sun do. So the balance between the pull of gravity and the push of radiation pressure concentrates some heavy elements at certain atmospheric levels. Magnetic fields were then thought to continue the separation process, sequestering some chemicals in particular regions. But researchers led by Oleg Kochukhov of Uppsala University in Sweden have found that this last step is not necessary to create chemical clouds on a star.

Driven by the tides
They observed the mercury concentration in Alpha Andromedae – which does not have a detectable magnetic field – for seven years with 1.2- and 6-metre telescopes, detecting the mercury by its signature absorption line at the violet end of the spectrum. They resolved details on the spinning star’s surface by looking at how rapidly the clouds were turning towards or away from Earth. That revealed that the mercury concentration varies by as much as a factor of 10,000 across the surface, and that the pattern of concentration changes over time as well. The changes in the mercury distribution “look very convincing”, comments Gregg Wade of the Royal Military College of Canada in Kingston, who discovered in 2006 that the star lacked a magnetic field. But exactly what causes the clouds to change over time is unclear. Kochukhov and colleagues say the changes “may have the same underlying physics as the weather patterns on terrestrial and giant planets”. The mercury clouds are on the brighter and larger member of a close pair of stars that orbit each other every 97 days. “The second star may create tides on the surface of the main star, much like the Moon creates tides on Earth, which drives evolution of the mercury cloud cover,” Kochukhov told New Scientist. But he adds that other explanations are possible. So for now, the weather on stars, as on Earth, remains hard to fathom.

{Journal reference: Nature Physics (doi: 10.1038/nphys648) http://www.nature.com/nphys/index.html}
http://www.rmc.ca/academic/physics/wade/index_e.htm
http://www.astro.uu.se/~oleg/

SPACE PLASMA / SOLAR WIND
http://www.srl.caltech.edu/ACE/ace_mission.html
http://web.mit.edu/space/www/wind/wind.html

SOUNDS like BERLIN TECHNO
http://local.wasp.uwa.edu.au/~pbourke/other/pulsarsound/
http://www-pw.physics.uiowa.edu/~jrp/sounds/sounds.html
http://chandra.harvard.edu/press/03_releases/press_090903.html
http://www.eso.org/public/outreach/press-rel/pr-2002/pr-10-02.html

PATENT GRANTED – PERSONAL COSMIC RAY DETECTOR
http://www.newscientist.com/blog/technology/2008/03/do-we-need-cosmic-ray-alerts-for.html
Should every computer chip have a cosmic ray detector?  /  March 07, 2008

How can distant supernovae, black holes and other cosmic events cause a desktop computer to crash? The answer is that they produce cosmic rays, which produce high energy particles in the atmosphere that can occasionally hit RAM chips. The moving particles trail electrons, which can infiltrate chips’ circuits and cause errors. That’s why computer chip giant Intel was in December awarded a US patent for the idea of building cosmic ray detectors into every chip. When cosmic rays hit the Earth’s atmosphere, they collide with air molecules, producing an “air shower” of high energy protons and neutrons and other particles. It is these that Intel wants to look for. If they get near the wrong part of a chip, the electrons they trail can create a digital 1 or 0 out of nowhere, something called a “soft error”.

Computer giant IBM thoroughly investigated the problem in the mid 90s, testing nearly 1,000 memory devices at sea level, in mountains and in caves. They showed that at higher altitude, more soft errors occurred, while in the caves there were nearly none. That proved cosmic rays were to blame. As RAM chips became more dense, the problem was predicted to get worse. But better designs and error checking techniques have helped, with systems used in planes and spacecraft getting beefed-up error checking because they are at greater risk.
http://en.wikipedia.org/wiki/ECC_RAM#Error-correcting_memory
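The error-correcting memory linked above works by storing extra parity bits alongside each data word. A minimal Python sketch of the idea, using the classic Hamming(7,4) code rather than the wider SECDED codes real ECC DIMMs use, shows how the parity checks pinpoint and repair a single flipped bit:

def encode(d):                          # d: four data bits
    p1 = d[0] ^ d[1] ^ d[3]             # parity over codeword positions 3,5,7
    p2 = d[0] ^ d[2] ^ d[3]             # parity over positions 3,6,7
    p3 = d[1] ^ d[2] ^ d[3]             # parity over positions 5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def correct(c):                         # c: received 7-bit codeword
    s = 0
    for pos, bit in enumerate(c, start=1):
        if bit:
            s ^= pos                    # syndrome = XOR of set-bit positions
    if s:                               # non-zero syndrome = error position
        c[s - 1] ^= 1                   # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]     # recover the four data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                            # simulate a cosmic-ray bit flip
assert correct(word) == [1, 0, 1, 1]    # the flip is found and repaired

The XOR of the positions of all set bits (the syndrome) is zero for a valid codeword and equals the error position after a single flip, which is why one cosmic-ray upset is recoverable while a double flip needs the wider code.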

But Intel thinks we may still be living on borrowed time: “Cosmic ray induced computer crashes have occurred and are expected to increase with frequency as devices (for example, transistors) decrease in size in chips. This problem is projected to become a major limiter of computer reliability in the next decade.” Their patent suggests built-in cosmic ray detectors may be the best option. The detector would either spot cosmic ray hits on nearby circuits, or directly on the detector itself. When triggered, it could activate error-checking circuits that refresh the nearby memory, repeat the most recent actions, or ask for the last message from outside circuits to be sent again.

But if cosmic ray detectors make it into desktops, would we get to know when they find something? It would be fun to suddenly see a message pop up informing you that a cosmic ray had been detected. I haven’t seen any recent figures on how often such events happen, but back in 1996 IBM estimated you would see one a month for every 256MB of RAM. Perhaps the data could even be useful to astronomers, if everyone shared it, like this idea to use hard-drive wobbles to monitor earthquakes.
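Taking that 1996 IBM figure at face value (per-bit rates have changed with process generations, so this is only a back-of-envelope sketch):

ram_mb = 8192                        # a hypothetical 8 GB desktop
errors_per_month = ram_mb / 256      # IBM's 1996 rate: 1 per 256MB per month
print(f"~{errors_per_month:.0f} pop-ups a month, about one a day")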

COSMIC RAYS
http://www.rcnp.osaka-u.ac.jp/~annurep/2001/genkou/sec3/kobayashi.pdf
http://en.wikipedia.org/wiki/Soft_error

“Once the electronics industry had determined how to control package contaminants, it became clear that other causes were also at work. James F. Ziegler led a program of work at IBM which culminated in the publication of a number of papers (Ziegler and Lanford, 1979) demonstrating that cosmic rays also could cause soft errors. Indeed, in modern devices, cosmic rays are the predominant cause. Many different particles can be present in cosmic rays, but the main cause of soft errors seems to be neutrons. Neutrons are uncharged and cannot disturb electron distribution on their own, but can undergo neutron capture by the nucleus of an atom in a chip, producing an unstable isotope which then causes a soft error when it decays producing an alpha particle.

Cosmic ray flux depends on altitude. Burying a system in a cave reduces the rate of cosmic-ray-induced soft errors to a negligible level. In the lower levels of the atmosphere, the flux increases by a factor of about 2.2 for every 1000 m (1.3 for every 1000 ft) increase in altitude above sea level. Computers operated on top of mountains, or in aircraft, experience an order of magnitude higher rate of soft errors compared to sea level. This is in contrast to package decay induced soft errors, which do not change with location. It happens that one isotope of boron, Boron-10, captures neutrons and undergoes alpha decay very efficiently. It has a very high neutron collision cross section. Boron is used in BPSG, a glass used to cover silicon dies to protect them. In critical designs, depleted boron, consisting almost entirely of Boron-11, is used to avoid this effect and therefore to reduce the soft error rate. Boron-11 is a by-product of the nuclear industry.”
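The altitude scaling quoted above is easy to turn into numbers; a small sketch (the 2.2x-per-1000 m factor comes from the excerpt, the sample altitudes are illustrative):

def relative_flux(altitude_m):
    # flux rises ~2.2x per 1000 m of altitude, per the excerpt above
    return 2.2 ** (altitude_m / 1000.0)

print(f"Denver, ~1600 m:     {relative_flux(1600):.1f}x the sea-level rate")
print(f"mountaintop, 3000 m: {relative_flux(3000):.0f}x")  # the quoted 'order of magnitude'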

FULL PATENT
http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=7,309,866.PN.&OS=PN/7,309,866&RS=PN/7,309,866

ABSTRACT
A cosmic ray detector includes a cantilever with a first tip. The detector also includes a second tip and circuitry to provide a signal indicative of a distance between the first and second tips being such as would be caused by a cosmic ray interaction event.

Inventors: Hannah; Eric C. (Pebble Beach, CA)
Assignee: Intel Corporation (Santa Clara, CA)
Appl. No.: 10/882,917 / Filed: June 30, 2004

“The normal background radiation environment on the surface of the earth has ionizing components that sometimes affect the reliability of semiconductor integrated circuit chips, such as memory chips used in computers. If an intruding particle is near a p-n junction in the chip, it may induce a soft error, or single-event upset, which can cause signals to change voltage and, accordingly, bits of data to change value. Excess electron-hole pairs may be generated in the wake of the penetrating particle. The field in the neighborhood of the p-n junction, if sufficiently strong, separates these electrons and holes before they recombine, and sweeps the excess carriers of the appropriate sign to a nearby device contact. A random signal may be registered if this collected charge exceeds a critical threshold value.

Cosmic particles in the form of neutrons or protons can collide randomly with silicon nuclei in the chip and fragment some of them, producing alpha-particles and other secondary particles, including the recoiling nucleus. These can travel in all directions with energies which can be quite high (though of course less than the incoming nucleon energy). Alpha-particle tracks so produced can sometimes extend a hundred microns through the silicon. The track of an ionizing particle may extend a fraction of a micron to many microns through the chip volume of interest, generating in its wake electron-hole pairs at a rate of one pair per 3.6 eV (electronvolts) of energy lost. A typical track might represent a million electron-hole pairs.

Cosmic ray induced computer crashes have occurred and are expected to increase with frequency as devices (for example, transistors) decrease in size in chips. This problem is projected to become a major limiter of computer reliability in the next decade. Various approaches have been suggested to eliminate or reduce the number of soft errors due to cosmic ray interactions in chips. None of these approaches is completely successful, particularly as device size continues to decrease. Another approach is to accept that some soft errors will happen and to design memory and logic circuitry to include redundancy in all calculations. This approach involves more gates and enough spatial separation between contributing redundant elements to avoid mutual soft errors from the same cosmic ray. This approach is not practical for many chips.”
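The 3.6 eV-per-pair figure in the patent text above makes the threat easy to quantify; a rough sketch (the deposited energy and the DRAM cell capacitance and voltage are illustrative assumptions, not from the patent):

e = 1.602e-19                      # electron charge, coulombs
pairs = 3.6e6 / 3.6                # 3.6 MeV deposited / 3.6 eV per pair = 1e6
track_charge = pairs * e           # ~0.16 pC freed along the track
cell_charge = 30e-15 * 1.0         # Q = C*V for an assumed 30 fF, 1 V DRAM cell
print(track_charge / cell_charge)  # ~5x the stored charge, easily a bit flip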

‘SOFT ERROR’
http://findarticles.com/p/articles/mi_qn4158/is_19980728/ai_n14164062
When ‘soft errors’ hit the desktop
BY Brian Prangle  /  Jul 28, 1998

Your system crashes unexpectedly. You are the victim of an event light years away in a distant galaxy which hurled a shower of protons and neutrons randomly into space, only to collide with electrons in your system’s RAM. Or, more insidiously, the radiation does not crash your system but subtly alters data without your knowing. Blow away a few electrons and a bit gets altered from a zero to a one with unforeseen consequences. If this bit merely represents the colour of the pixel at co-ordinate 720,346 in your latest shoot ’em up game, then so what? If it represents the first digit on a six figure cheque, then you may want to correct it. This may not be as far-fetched as it sounds. These so-called ‘soft’ errors have been observed in integrated circuits for several decades. The new breed of PCs and servers with massive amounts of DRAM increases the possibility of such errors. IBM has long been vocal about soft errors: with a large population of mainframes processing huge quantities of critical data in massive memories, they have built ECC (Error Correcting Code) into memory to safeguard data. ECC memory detects bit errors and corrects them.

Mainframes cannot tolerate bit errors. The larger, more expensive workgroup Intel servers increasingly also come configured with ECC memory and motherboard chipsets. Soft errors at the desktop have barely entered the PC industry’s consciousness; DRAM manufacturers have concentrated on removing ‘hard’ errors produced by contaminants at the production stage. Some two years ago, IBM released the results of a survey they had conducted on the effect of cosmic rays on DRAM. Some 800 devices were tested in constant-read mode at sea level, in mountainous regions, and in caves. More soft errors were found at high altitudes, presumably because there are fewer air molecules at higher altitudes to absorb cosmic rays. But after three months, the underground DRAM tested at zero soft errors. IBM estimate that for every 256MB of memory you’ll get one soft error a month.

With Intel’s new chip sets allowing up to 4GB of DRAM per system, DRAM densities at the workgroup server level are approaching mainframe levels of yesteryear. IBM reckon that 1GB is the threshold beyond which robust error correction of mainframe class is necessary, and Toshiba reckon that the failure rate is directly proportional to the amount of DRAM in the system. As memory designs are set to change dramatically over the next year to increase memory density and throughput, the chances of soft error occurrence can only be set to rise. But convincing a sceptical industry that sub-atomic particles from outer space are a design problem is IBM’s biggest problem, closely followed by the ruthless price competition that makes purchasers and manufacturers reluctant to spend more on a soft error problem.

A more tangible threat from outer space is facing global communication carriers that rely on satellites. In November the earth is set to pass through a particularly dense stream of small meteoroids known as the Leonids, which are debris from the tail of a comet. Satellites and space debris are incompatible, with partial or catastrophic failure resulting from collision with even the smallest object. The regular visits of the space shuttle to repair satellites are witness to this phenomenon. We are becoming increasingly dependent on computer-driven communication, yet paradoxically each new performance boost makes us more vulnerable.

SEE ALSO
TSUNAMI HARD DISK DETECTOR
http://www.ninsight.at/tsunami/
the world’s first freely available P2P warning system

[ the solution ]
The Tsunami-Harddisk-Detector utilizes your existing computer hardware to detect earthquakes which can lead to tsunamis. It is a pure software solution, and can therefore be distributed free of charge. The computers (nodes) participating in the project connect to a P2P (Peer-to-Peer) network, thereby establishing a distributed computing platform with high reliability. A few of the participating computers act as supernodes, performing data analysis for their attached nodes. In an emergency, the supernodes inform their attached nodes instantly. Hence, if you decide to participate and install the client software, you will be automatically warned about potential tsunamis.

[ background ]
A tsunami can be generated by any disturbance that rapidly displaces a large mass of water, such as an earthquake, volcanic eruption, landslide or meteorite impact. However, the most common cause is an undersea earthquake. An earthquake which is too small to create a tsunami by itself may trigger an undersea landslide quite capable of generating a tsunami. Waves are formed as the displaced water mass moves under the influence of gravity to regain its equilibrium and radiates across the ocean like ripples on a pond.

If the initial event is sensed, a coupled system of partial differential equations can be used to simulate the propagation of tsunami waves and issue a tsunami warning, if necessary. The equations are known as the shallow water wave equations (Pelinovsky et al. 2001, Layton 2002); a standard form is given below. In them, u and v are the horizontal velocity components of the water surface, x and y are the spatial coordinates of the wave, t is elapsed time, g is the acceleration due to gravity, and h is the height of the wave above the ocean floor topography b. The critical problem is how to sense the initial earthquake; how the Tsunami-Harddisk-Detector copes with it is explained in the next section.
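The equations themselves were an image on the original page and did not survive; a standard form of the 2-D shallow water equations consistent with the variable definitions above (an assumed reconstruction, not necessarily the exact variant the project cites) is:

\begin{align}
\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} &= -g\,\frac{\partial (h+b)}{\partial x} \\
\frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} &= -g\,\frac{\partial (h+b)}{\partial y} \\
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} &= 0
\end{align}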

[ how it works ]
Tsunamis are generated by earthquakes, volcanic eruptions or large meteorite impacts. These initial events cause seismic waves which can be sensed by the fragile components of computer harddisks. Seismic waves travel at about 5000 m/s (18,000 km/h), while tsunamis travel much more slowly through water, at about 200 m/s (720 km/h, depending on the local depth). This difference gives time for a tsunami forecast to be made and warnings to be issued to threatened areas, if warranted.
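Those two speeds are what make a warning possible at all; a quick sketch of the available window (the 300 km distance is just an example value):

SEISMIC, TSUNAMI = 5000.0, 200.0   # wave speeds in m/s, from the paragraph above
distance = 300e3                   # example: a coast 300 km from the epicentre
window_s = distance / TSUNAMI - distance / SEISMIC
print(f"~{window_s / 60:.0f} minutes between detection and wave arrival")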

Software client
The Tsunami Harddisk Detector is a small software client (see Fig. 1) installed on your computer which continuously monitors the vibration of the internal components of your harddisk (see Fig. 2). Since they are extremely fragile, they react to any accelerations of the computer, including those that originate from earthquakes. Different technical strategies are currently under investigation to analyze seismic activity. For best performance, the computer with its harddisk should be fastened to the ground.

Network
One computer is not enough to identify the epicenter of an earthquake. However, a small number of networked computers can locate the epicenter, measure the intensity and estimate the risk of a tsunami. To this end, the computers are connected via a P2P network consisting of many nodes (which perform sensing) and a few supernodes (which perform signal processing). In particular, the supernodes perform two tasks (a location sketch follows the list):
*     locate the epicenter based on the time lag and intensity of the event
*     remove ‘signal noise’. Noise is generated by events that shake the harddisk but cannot cause a tsunami (e.g. a user kicks his computer). The supernode can detect this noise because it is improbable that many users kick their computers simultaneously.
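A minimal sketch of the first task, locating an epicentre from arrival times at several nodes (synthetic data and a brute-force grid search; the project's actual algorithm is not published in this excerpt):

import math

V = 5000.0                                     # assumed seismic wave speed, m/s
true_xy, true_t0 = (25e3, 18e3), 4.0           # hypothetical quake for the demo
stations = [(0, 0), (40e3, 0), (0, 30e3), (50e3, 40e3)]   # node positions, m
arrivals = [true_t0 + math.hypot(true_xy[0] - x, true_xy[1] - y) / V
            for x, y in stations]              # synthetic arrival times, s

def misfit(x, y, t0):                          # sum of squared timing residuals
    return sum((t - t0 - math.hypot(x - sx, y - sy) / V) ** 2
               for (sx, sy), t in zip(stations, arrivals))

best = min(((x, y, t0)                         # grid search over location & time
            for x in range(0, 60001, 1000)
            for y in range(0, 60001, 1000)
            for t0 in range(0, 10)),
           key=lambda p: misfit(*p))
print("estimated epicentre (m) and origin time (s):", best)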

In order to be able to locate the epicenter, each node must know exactly where it stands on earth. Therefore, the longitude and latitude as well as the orientation must be entered when the software is started for the first time. This data can be obtained from an attached GPS-mouse or from www.gpsvisualizer.com.

Known problems
Although the described method works in principle, it has inherent problems: the internals of harddisks operate at very high speeds, with frequencies of about 1 kHz, while earthquakes exhibit much lower frequencies. Also, the accelerations during normal harddisk operation are about 30 g, while an earthquake has a much smaller horizontal acceleration (for example, the Kobe earthquake in 1995 had 0.84 g horizontally). Hence, the vibrations caused by earthquakes must be separated from all the others.
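One plausible way to do that separation is a low-pass filter that keeps slow ground motion and discards the roughly kilohertz disk activity; a sketch (illustrative only, not the project's actual signal processing):

import math

def low_pass(samples, dt, cutoff_hz):
    # single-pole low-pass: keeps slow seismic motion, rejects ~kHz noise
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for s in samples:
        y += alpha * (s - y)       # exponential smoothing step
        out.append(y)
    return out

# e.g. vibration samples taken at 2 kHz, keeping only motion below ~5 Hz:
# smoothed = low_pass(raw_samples, dt=1/2000, cutoff_hz=5.0)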

[ usage ]
The application is developed in Java, so it runs on any computing platform that supports at least Java 1.6. Sensing of seismic activity is highly hardware-dependent and is currently supported only on Windows.

MEASURING EARTHQUAKE HARD DRIVE WOBBLES
http://technology.newscientist.com/channel/tech/dn10037-hard-drive-wobbles-track-earthquake-spread.html
Hard drive wobbles track earthquake spread
BY Tom Simonite  /  08 September 2006

Software that turns ordinary computer hard drives into makeshift quake sensors, and connects machines to form a quake-monitoring network, has been released for free on the web. Although experts say the technique is unlikely to replace standard scientific equipment, computer hard drives can provide rough estimates of the intensity and location of an earthquake and warn of an impending tsunami. Hard drives contain tiny vibration sensors that warn when a device is being shaken so that delicate components can be automatically locked in one place. A damaged drive led US technology consultant Michael Stadler to realise that these sensors might also be able to detect the seismic vibration caused by an earthquake. “A friend had their hard drive fail due to the vibration from nearby construction work,” Stadler told New Scientist. So he decided to write a piece of software, called the Tsunami Hard Disk Detector, that monitors hard drive vibration and links computers together in order to spot and map earthquakes.

Travelling wave
“A hard drive is not sensitive enough to detect as accurately as a seismometer,” says Stadler. “But in a peer-to-peer network you can use geographic patterns to identify the direction a seismic wave is travelling, and perhaps the epicentre.” Stadler’s software was made freely available online and has been downloaded around 2500 times so far, mostly by users in Asia. Users can enter the latitude and longitude of their machine and are automatically connected together to form a decentralised network.

If an earthquake occurs nearby, the network maps the strength and timing of vibrations to reveal the epicentre of the quake. The results of the analysis are then instantly available to any computer running the software. And irrelevant vibrations are discounted by only looking for signals that are picked up by several different machines. Information collected by a couple of hundred users successfully identified the earthquake that struck the south coast of Java in July 2006 and triggered a tsunami (see Java tsunami death toll over 300). But Stadler admits the system is imperfect. “The official strength was 7.7 on the Richter scale, the Hard Disk Detector said 5.9,” says Stadler. “There’s more work to do but it shows the concept can work.” He notes that the system has failed to detect several other earthquakes.

Global network
Suleyman Nalbant, a geophysicist at the University of Ulster in the UK, says seismometers dotted across the globe can easily pick up quakes like the one that hit Java. But if the hard drive system could detect smaller earthquakes “it could be very useful”, he says. “The global network is sometimes not perfect for smaller events.” In practice, however, problems with accuracy might prevent this, Nalbant says: “I’m not sure the sensors can really be that accurate”. Bruce Malamud, an expert on natural hazards at King’s College London, UK, is also sceptical. “It’s very novel,” he says. “But to make a real difference you’d probably need a very large number of computers using the software.” Stadler’s system was awarded a prize for innovative use of the internet at the Ars Electronica fair held in Linz, Austria, between 31 August and 5 September.

AND ALSO
http://www.gpsvisualizer.com/
http://geodesy.unr.edu/

GPS SHIELD / TSUNAMI EARLY WARNING SYSTEM
http://www.gitews.org/index.php?id=23&L=1

http://environment.newscientist.com/channel/earth/tsunami/mg19526215.700-gps-shield-will-mean-faster-tsunami-alerts.html
‘GPS shield’ will mean faster tsunami alerts  /  15 September 2007

A “GPS shield” that works in real time could save lives by quickly warning of potential tsunamis. The German-Indonesian Tsunami Early Warning System (GITEWS) is being developed by a team led by Jörn Lauterjung of the National Research Centre for Geosciences in Potsdam, Germany. Unlike a GPS method proposed last year, which detects seismic waves transmitted through the Earth’s crust to distant receivers, the new ground-based system takes real-time measurements of vertical ground motion – the type of fault movement more likely to produce tsunamis (Journal of Geophysical Research, DOI: 10.1029/2006JB004640). To protect the Indian Ocean region, the proposed shield would include an array of 18 GPS stations.

http://www.agu.org/pubs/crossref/2007/2006JB004640.shtml

ABSTRACT
“The 2004 catastrophic Indian Ocean tsunami has strongly emphasized the need for reliable tsunami early warning systems. Another giant tsunamigenic earthquake may occur west of Sumatra, close to the large city of Padang. We demonstrate that the presence of islands between the trench and the Sumatran coast makes earthquake-induced tsunamis especially sensitive to slip distribution on the rupture plane as wave heights at Padang may differ by more than a factor of 5 for earthquakes having the same seismic moment (magnitude) and rupture zone geometry but different slip distribution.

Hence reliable prediction of tsunami wave heights for Padang cannot be provided using traditional, earthquake-magnitude-based methods. We show, however, that such a prediction can be issued within 10 minutes of an earthquake by incorporating special types of near-field GPS arrays (“GPS-Shield”). These arrays measure both vertical and horizontal displacements and can resolve higher order features of the slip distribution on the fault than the seismic moment if placed above the rupture zone or less than 100 km away from it. Stations in the arrays are located as close as possible to the trench and are aligned perpendicular to the trench, i.e., parallel to the expected gradient of surface coseismic displacement. In the case of Sumatra and Java, the GPS-Shield arrays should be placed at the Mentawai Islands, located between the trench and Sumatra, and directly at the Sumatra and Java western coasts. We demonstrate that the “GPS-Shield” can also be applied to northern Chile, where giant earthquakes may also occur in the near future. Moreover, this concept may be applied globally to many other tsunamigenic active margins where the land is located above or close to seismogenic zones.”

http://www.earth.northwestern.edu/people/seth/research/
http://www1.uea.ac.uk/cm/home/schools/sci/env/research/seismic

MAGIC NUMBER > 8.5  =  TSUNAMI
http://technology.newscientist.com/article/dn9456
GPS can help give early warning of tsunamis
BY Tom Simonite  /  30 June 2006

Using GPS (global positioning system) data to measure how points on land move following an undersea earthquake could help geologists decide if the tremor will cause an ocean-wide tsunami. Combined with existing tsunami warning systems, the data could speed up decisions about whether to issue an alert and avoid false alarms, say US researchers. Tsunami warning systems use a combination of seismometers to measure tremors and ocean buoys to spot pressure waves. But it is difficult to pinpoint the exact strength of an undersea quake. This can cause problems for those deciding whether to issue a warning, says Seth Stein, a geophysicist at Northwestern University in Illinois, US. “The hardest job is to distinguish quakes that are big from those that are dangerously big,” he told New Scientist. “Richter scale 8 is quite a big earthquake, but about 8.5 is the magic number. Above that, ocean-wide tsunamis start to happen.” GPS measurements of points around a quake could determine more quickly than current methods whether this threshold has been exceeded, he says.
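The 8.5 threshold is really a statement about seismic moment, which the ground displacements GPS measures help constrain; a sketch using the standard moment magnitude formula (the fault dimensions and slip here are illustrative assumptions, not figures from the article):

import math

mu = 3e10                          # crustal rigidity, Pa (typical value)
area = 400e3 * 150e3               # rupture area, m^2 (a Sumatra-scale example)
slip = 15.0                        # average slip, m, as constrained by GPS offsets
m0 = mu * area * slip              # seismic moment M0 = mu * A * D, in N*m
mw = (2 / 3) * (math.log10(m0) - 9.1)   # standard moment magnitude definition
print(f"Mw = {mw:.1f}; ocean-wide tsunami risk: {mw > 8.5}")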

Vague measurement
The 2004 Sumatran quake – which caused the Asian tsunami – was eventually measured at between 9.2 and 9.3, but seismometers can initially only determine whether a quake is larger than about 7. “It normally takes a couple of hours to know whether it was over 8.5 or not,” says Stein. “Limits on how much energy can be stored in rock mean the first body waves of a quake don’t get bigger, but just ring for longer.” Taking GPS measurements of points on land around a quake can answer this crucial question within 15 minutes, according to a study by Stein and co-workers from the University of Nevada, US. To prove the technique can work, they used GPS data recorded during the first 15 minutes of the 2004 Sumatran quake, which made it clear the tremor would go on to cause a devastating tsunami in the Indian Ocean.

Millimetre accuracy
Software developed at NASA’s Jet Propulsion Laboratory was used to measure the position of 38 GPS stations between 300 and 7500 kilometres from the quake’s epicentre northwest of Sumatra. Knowing how these stations moved to within 7 millimetres makes it possible to measure long-period waves from the quake and estimate its size. “Using that 15 minutes of data it was possible to say the quake was a 9.” This was very close to the 9.2 or 9.3 figure eventually determined for the quake, says Stein. “We think this would be a useful third component to the existing tsunami warning system. By taking out the guesswork it could make it more accurate and avoid false alarms.”

Geophysicist Paul Burton at the University of East Anglia, UK, agrees, but cautions that the system’s effectiveness will depend on the location of the earthquake. “It’s feasible but might not help in all circumstances.” If the GPS stations available are not located in the right place relative to the epicentre, satellite measurements may not be that helpful, he says. “Another consideration is whether high-tech systems for tsunami warnings will be here in three to four hundred years’ time,” says Burton. “Educating people right now and making sure the knowledge about what to do is kept alive can make a huge difference. For example, a 10-year-old British girl saved many lives during the Indian Ocean tsunami when she remembered the early signs of a tsunami from her geography lessons.”

{Geophysical Research Letters (DOI: 10.1029/2006GL026145)}

AURORA BOREALIS
http://www.unis.no/20_RESEARCH/2060_Online_Env_Data/weatherstations.htm
http://www.unis.no/60_NEWS/6040_Archive_2008/n_20_02_08_cool_opening/a_cool_opening_news_20022008.htm
http://twistedphysics.typepad.com/cocktail_party_physics/2008/03/i-hear-the-cosm.html
http://climate.gi.alaska.edu/Curtis/curtis.html
http://sd-www.jhuapl.edu/Aurora/index.html
http://sprg.ssl.berkeley.edu/image/latest_wic.html

http://kho.unis.no/nordlysstasjon_data.htm
http://kho.unis.no/nordlysstasjon_doc.htm
http://kho.unis.no/nordlysstasjon_WebCameras.htm
http://kho.unis.no/
Kjell Henriksen Observatory  /  Svalbard 78°N
“KHO is now in full operation! Because of all the light sensitive instrumentation up there, we urge people not to visit us without an appointment. If you drive up by car – please use only your parking lights and shine as little light on the observatory as possible! Happy auroral season! :o)”

http://people.ece.cornell.edu/paul/
http://gps.ece.cornell.edu/
http://www.ion.org/meetings/gnss2006/abstracts.cfm?track=D&session=3#p5

SOLAR FLARES and YOU
http://www.news.cornell.edu/stories/Sept06/solar.flares.gps.TO.html
Solar flares cause GPS failures, possibly devastating for jets and distress calls
BY Thomas Oberst  /  Sept. 26, 2006

Strong solar flares cause Global Positioning System (GPS) receivers to fail, Cornell researchers have discovered. Because solar flares — larger-than-normal radiation “burps” by the sun — are generally unpredictable, such failures could be devastating for “safety-of-life” GPS operations — such as navigating passenger jets, stabilizing floating oil rigs and locating mobile phone distress calls. “If you’re driving to the beach using your car’s navigation system, you’ll be OK. If you’re on a commercial airplane in zero visibility weather, maybe not,” said Paul Kintner Jr., professor of electrical and computer engineering at Cornell and head of Cornell’s GPS Laboratory.

Alessandro Cerruti, a graduate student working for Kintner, accidentally discovered the effect on Sept. 7, 2005, while operating a GPS receiver at Arecibo Observatory in Puerto Rico, one of six Cornell Scintillation Monitor (SCINTMON) receivers. Cerruti was investigating irregularities in the plasma of the Earth’s ionosphere — a phenomenon unrelated to solar flares — when the flare occurred, causing the receiver’s signal to drop significantly. To be sure of the effect, Cerruti obtained data from other receivers operated by the Federal Aviation Administration (FAA) and the Brazilian Air Force. He found that all the receivers had suffered exactly the same degradation at the exact time of the flare regardless of the manufacturer. Furthermore, all receivers on the sunlit side of the Earth had been affected.

Cerruti will report on the findings Sept. 28 at the Institute of Navigation Meeting in Fort Worth, Texas, where he will receive the best student paper prize. The full results of the discovery will be published in a forthcoming issue of the journal Space Weather. The flare consisted of two events about 40 minutes apart: The first lasted 70 seconds and caused a 40 percent signal drop; the second lasted 15 minutes and caused a 50 percent drop. But this flare was moderate and short-lived; in 2011 and 2012, during the next solar maximum, flares are expected to be 10 times as intense and last much longer, causing signal drops of over 90 percent for several hours.
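For reference, the quoted drops translate into modest-sounding decibel losses, which is why receivers already near their tracking threshold fail first (assuming "signal drop" here means a fractional loss of received power):

import math

for drop in (0.40, 0.50, 0.90):    # the figures quoted above
    print(f"{drop:.0%} drop = {10 * math.log10(1 - drop):.1f} dB")

# 40% = -2.2 dB, 50% = -3.0 dB, 90% = -10.0 dB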

“Soon the FAA will require that every plane have a GPS receiver transmitting its position to air traffic controllers on the ground,” warned Cerruti. “But suppose one day you are on an aircraft and a solar radio burst occurs. There’s an outage, and the GPS receiver cannot produce a location. … It’s a nightmare situation. But now that we know the burst’s severity, we might be able to mitigate the problem.” The only solutions, suggested Kintner, are to equip receivers with weak signal-tracking algorithms or to increase the signal power from the satellites. Unfortunately, the former requires additional compromises to receiver design, and the latter requires a new satellite design that neither exists nor is planned. “I think the best remedy is to be aware of the problem and operate GPS systems with the knowledge that they may fail during a solar flare,” Kintner said.

The team was initially confused as to why the flare had caused the signal loss. Then Kintner recalled that solar flares are accompanied by solar radio bursts. Because the bursts occur over the same frequency bands at which GPS satellites transmit, receivers can become confused, leading to a loss of signal. Had the solar flare occurred during the night in Puerto Rico or had Cerruti been operating SCINTMON only at night, he would not have made the discovery. “We normally do observations only in the tropics and only at night because that’s where and when the most intense ionospheric irregularities occur,” said Kintner. However, since no one had done it before, Cerruti was looking at “mid-latitudes” (between the tropics and the poles), where weaker irregularities can occur both night and day. As a result, SCINTMON detected the solar flare.

Other authors of the forthcoming paper include D.E. Gary and L.J. Lanzerotti of the New Jersey Institute of Technology, E.R. de Paula of the Instituto Nacional de Pesquisas Espaciais and Cornell research associate Hien Vo.

THERMOSPHERIC CLIMATOLOGY
http://www.haystack.mit.edu/atm/open/index.html
http://www.haystack.mit.edu/atm/arrays/isis/index.html

http://www.haystack.mit.edu/atm/science/space/isis/index.html
http://www.haystack.mit.edu/atm/science/space/ic/index.html
http://www.haystack.mit.edu/atm/science/space/tc/index.html

“The neutral upper atmosphere, the Thermosphere, plays an important role in the characteristics of Earth’s ionosphere. Collisions between the high-density neutral atmosphere and ions at altitudes near 100 km above the Earth impede ion motion, creating electrical resistivity which impacts the overall electrical coupling between ionosphere and magnetosphere. This resistivity in the ionospheric E region is the electrical load for the disturbance currents generated in the interaction of the solar wind with our magnetosphere. Time histories of radar and optical observations are enabling a climatology of this region to be developed, and long-duration experiments are being mounted to address the wave modes which couple energy through the ionosphere-thermosphere (I-T) interaction region.”

GEOMAGNETIC STORMS
http://www.sciencedaily.com/videos/2006/0312-sun_darkens_electronics.htm
Space Physicists and Atmospheric Scientists Can Now Predict Disruptions Caused by the Sun’s Coronal Mass Ejections / March 1, 2006

Solar activity can wreak havoc in communications systems — particularly during coronal mass ejections, when plumes of electrically charged particles hit earth’s atmosphere. Scientists can now track the plumes down to individual affected cities, helping to predict disruptions. John Foster, a space physicist at the Massachusetts Institute of Technology’s Haystack Observatory in Westford, Mass., says, “This material flies through interplanetary space and impacts the Earth like a solar hammer hitting the Earth’s magnetic field.” This solar hammer can cause communication disruptions on the ground and a plume of electrically charged particles high in the earth’s atmosphere.

Now, atmospheric scientists at MIT may have discovered a way to predict space weather disruptions by identifying these plumes over the United States. “What we are seeing is a pattern in where these plumes are forming,” says Anthea Coster, an atmospheric research scientist at MIT Haystack Observatory. Scientists hope to detect these patterns with the ISIS instrument. ISIS picks up radio signals and measures plume movement. Then, a supercomputer processes this data, which will alert scientists where the plumes occur, pinpointing down to the state — even city — that will be affected. Foster says, “Predicting these would be a great benefit to any systems users, people who really rely on communications or navigation systems. Military operations, for one, would very much like to know what the space weather conditions would be like tomorrow.” Scientists say in the near future ISIS instruments will be distributed throughout the United States.

BACKGROUND: Bursts of matter from the sun, called coronal mass ejections (CMEs), have long been known to affect cell phone reception, TV and radio signals, and how much radiation exposure we receive while flying in the upper atmosphere. Now, researchers have detected plumes that tell them where the radiation from the ejection is concentrated and which places will be influenced the most by the CME.

CME or SOLAR FLARE?: People sometimes confuse CMEs with solar flares, but they are different phenomena. Solar flares are explosions on the sun that occur when energy builds up around sunspots, heating material to millions of degrees Fahrenheit and producing a burst of electromagnetic radiation across the entire electromagnetic spectrum, from radio waves to X-rays and gamma rays. CMEs were once thought to be the result of solar flares, but while they sometimes accompany solar flares, there is no direct relation between the two. They occur when a large bubble of plasma escapes through a star’s corona and travels through space to the earth at high speeds over the course of several hours. If a CME collides with the earth, it can produce a geomagnetic storm, which can cause electrical power outages and damage communications satellites and electronic equipment. Solar flares, on the other hand, affect radio communications.

PLASMAS: A plasma is essentially electrically charged (ionized) gas, consisting of free-moving electrons and ions (atoms that have lost electrons). Applying a surge of energy — with a laser, for example — knocks electrons off gas atoms, turning them into ions and creating a plasma. Unless this energy is sustained, however, plasmas will recombine back into a neutral gas. On earth, we are familiar with the ordinary states of matter: solids, liquids and gases. But in the universe at large, plasma is by far the most common form. Plasma in the stars and the space between them makes up 99 percent of the visible universe.
