Post by Paddy by Grace on Jun 24, 2009 15:55:10 GMT -7
Adios L.A. – "Ring of Fire" 9.0 Megaquakes Predicted
June 23, 2009
Daily Galaxy
On January 26, 1700, at around 9 p.m. local time, the Juan de Fuca plate beneath the Pacific Ocean off the Pacific Northwest moved. Suddenly. It slipped some 60 feet eastward beneath the North American plate, causing a monster quake of approximately magnitude 9 and setting in motion tsunamis that struck the coast of North America and traveled to the shores of Japan.
Map shows the location of earthquakes around the Pacific Ocean - the "Ring of Fire".
As the years go by, you can become more and more certain that no matter what natural phenomenon is thrown our way, someone, somewhere, has run a simulation on how to deal with it. A team of researchers from San Diego State University (SDSU) has done just that, looking at what a magnitude 9.0 earthquake would do to the Pacific Northwest.
Researchers believe that these megaquakes occur every 400 to 500 years or so.
Kim Olsen of SDSU and his team created a supercomputer-powered “virtual earthquake” program that allowed them to recreate such an earthquake. The effort encompassed the work of scientists from SDSU, the San Diego Supercomputer Center at UC San Diego, and the U.S. Geological Survey.
In addition, to ensure that the representation of what could happen is accurate, William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer from Ulm University in Germany to build a detailed model of the earth’s subsurface layering in that area. This “velocity model” – the first of its kind – describes how the subsurface structure bends, reflects, and changes the size and direction of seismic waves as they travel through it.
Naturally, what they learned didn’t necessarily send anyone home to bed with warm fuzzy feelings of safety (although Andreas is probably feeling pretty cozy over in Germany).
Their scenario depicted a rupture beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone (an area where two tectonic plates move toward one another, forcing one to slide beneath the other). In their scenario, the ground moved about 1.5 feet per second in Seattle, nearly 6 inches per second in Tacoma, Olympia and Vancouver, and 3 inches per second in Portland, Oregon.
“We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, that lasted as long as five minutes – and that’s a long time,” said Olsen.
“One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest,” Olsen added. “Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that can allow time for protective actions before the brunt of the shaking starts.”
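To get a feel for why that distance buys warning time, here is a rough, illustrative calculation (a minimal sketch in Python; the ~3.5 km/s shear-wave speed is an assumed typical crustal value, not a figure from the article): the strongest shaking travels at seismic-wave speed, so every hundred kilometers between the rupture and a city adds roughly half a minute before it arrives.

# Rough estimate of early-warning lead time for a distant megathrust rupture.
# Assumption (not from the article): damaging shear waves travel at about 3.5 km/s
# through the crust, so lead time is simply distance divided by wave speed.
s_wave_speed_km_per_s = 3.5
for distance_km in (100, 200, 300):
    travel_time_s = distance_km / s_wave_speed_km_per_s
    print(f"{distance_km} km from the rupture: ~{travel_time_s:.0f} seconds of potential warning")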
This is especially bad news for the Northwest for two reasons: one, the combined long-duration shaking and high ground velocities raise the possibility that such an earthquake could inflict major damage on downtown Seattle; and two, areas like Seattle, Tacoma and Olympia sit on top of sediment-filled geological basins, which amplify the waves generated by major earthquakes.
That is reason number one why scientists bother running these simulations. Reason number two: “The information from these simulations can also play a role in research into the hazards posed by large tsunamis, which can originate from megathrust earthquakes like the 2004 Sumatra-Andaman earthquake in Indonesia,” said Olsen.
Now, sometimes I don’t think I manage to capture the scope of just what a supercomputer is. It definitely sounds super, but how can you know how super Superman is until you’ve compared him to men who merely jump buildings and outrun bullets?
So to create the simulations for Olsen’s “Anelastic Wave Model,” a massive undertaking was needed. This didn’t just require one really fancy computer in someone’s office. The computations to prepare the initial conditions were done on the San Diego Supercomputer Center’s DataStar supercomputer, before being transferred for the main simulation to the center’s Blue Gene Data supercomputer.
Still, we don’t really have our comparison to Clark Kent dashing into the phone box and coming out “more powerful than a locomotive,” do we? So here it is: just one of the simulations – of which several were required – ran on 2,000 supercomputer processors and consumed some 80,000 processor-hours. That is equivalent to running one program on a single processor, continuously, for more than 9 years – roughly 3,333 days, or 80,000 hours.
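For the curious, the arithmetic behind that comparison is simple; here is a minimal sketch in Python, using only the 2,000-processor and 80,000 processor-hour figures quoted above.

# Back-of-the-envelope check of the supercomputer comparison, using the article's figures.
processor_hours = 80_000   # total compute consumed by one simulation
processors = 2_000         # processors used in parallel

wall_clock_hours = processor_hours / processors   # actual run time on the supercomputer
single_cpu_days = processor_hours / 24            # same work done serially on one processor
single_cpu_years = single_cpu_days / 365.25

print(f"Wall-clock time on 2,000 processors: {wall_clock_hours:.0f} hours")
print(f"Equivalent single-processor time: {single_cpu_days:,.0f} days (~{single_cpu_years:.1f} years)")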
Either way, for scientific advancement or for engineering and humanitarian preparation, or simply to astound the mind, these simulations were worth the time and effort spent.
www.dailygalaxy.com/my_weblog/2009/06/rim-of-fire-meg.html