Simulated Shake-Up Reveals Some Answers
The magnitude 8.0 simulation could guide emergency planning for Southern California.
Seismologists have long asked not if, but when 'The Big One' will strike Southern California. Just how big will it be, and how will the amount of shaking vary throughout the region?
Now researchers are closer to answering the second part of that question, and to helping the Golden State's emergency response teams better prepare for such a disaster.
Researchers at San Diego State University (SDSU) and the San Diego Supercomputer Center (SDSC) at the University of California San Diego have created the largest-ever simulation of a magnitude 8.0 (M8) earthquake. M8 earthquakes are capable of tremendous damage; the 1994 Northridge earthquake was only a magnitude 6.7 and still caused billions of dollars in property damage.
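For a sense of scale, here is a small back-of-the-envelope sketch, an illustration based on the standard moment-magnitude energy relation rather than a figure from the study: each full step on the magnitude scale corresponds to roughly 32 times more radiated energy, so a magnitude 8.0 releases on the order of 90 times the energy of the magnitude 6.7 Northridge quake.

```python
# Illustration only: the standard moment-magnitude energy relation,
# log10(E) = 1.5 * M + const, means the energy ratio between two quakes
# depends only on their magnitude difference.

def energy_ratio(m_large: float, m_small: float) -> float:
    """Approximate ratio of radiated seismic energy between two magnitudes."""
    return 10 ** (1.5 * (m_large - m_small))

if __name__ == "__main__":
    # Comparing the simulated M8.0 with the 1994 Northridge M6.7
    print(f"~{energy_ratio(8.0, 6.7):.0f}x the radiated energy")  # roughly 89x
```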
The simulated rupture runs primarily along the southern section of the San Andreas Fault. About 25 million people reside in the affected area, which extends as far south as Yuma, Ariz., and Ensenada, Mexico, and runs up through Southern California to as far north as Fresno.
New insight into San Andreas Fault
"The scientific results of this massive simulation are very interesting, and its level of detail has allowed us to observe things that we were not able to see in the past," said Kim Olsen, professor of geological sciences at SDSU and the study's lead seismologist.
"For example, the simulation has allowed us to gain more accurate insight into the nature of the shaking expected from a large earthquake on the San Andreas Fault."
SDSC provided the high-performance computing and scientific visualization expertise for the simulation, while the Southern California Earthquake Center at the University of Southern California was the lead coordinator of the project. The scientific details of the earthquake source were handled by researchers at SDSU, and the Ohio State University was also part of the collaborative effort.
Gordon Bell Prize finalist
The research was selected as a finalist for the Gordon Bell Prize, awarded for outstanding achievement in high-performance computing applications at the annual Supercomputing Conference. This year's conference, called SC10 for Supercomputing 2010, will be held Nov. 13-19 in New Orleans, La.
"This M8 simulation represents a milestone calculation, a breakthrough in seismology both in terms of computational size and scalability," said Yifeng Cui, a computational scientist at SDSC and lead author of the paper, "Scalable Earthquake Simulation on Petascale Supercomputers."
"It's also the largest and most detailed simulation of a major earthquake ever performed in terms of floating point operations, and opens up new territory for earthquake science and engineering with the goal of reducing the potential for loss of life and property."
The simulation, funded through a number of National Science Foundation grants, represents the state of the art in seismic science on several levels, as well as in petascale computing, which refers to supercomputers capable of more than one quadrillion floating-point operations, or calculations, per second.
Olsen, who cautioned that this massive simulation is just one of many possible scenarios that could actually occur, also noted that high-rise buildings are more susceptible to low-frequency, roller-coaster-like motion, while smaller structures usually suffer more damage from higher-frequency shaking, which feels more like a series of sudden jolts.
As a follow-up to the record-setting simulation, Olsen said the research team will analyze potential damage to buildings, including high rises, due to the simulated ground motions.
Record-setting on several fronts
"We have come a long way in just six years, doubling the maximum seismic frequencies for our simulations every two to three years, from 0.5 hertz (cycles per second) in the TeraShake simulations, to 1.0 hertz in the ShakeOut simulations, and now to 2.0 hertz in this latest project," said Phil Maechling, associate director for information technology at the earthquake center.
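To see why each doubling of frequency is such a leap, here is a rough cost-scaling sketch. It assumes a conventional finite-difference wave-propagation scheme, in which the grid spacing must shrink in proportion to the shortest resolved wavelength, and is an illustration rather than the team's own accounting.

```python
# Rough scaling sketch: raising the maximum resolved frequency by a factor k
# requires ~k times finer spacing in each of three spatial dimensions
# (~k**3 more grid points) and ~k times more time steps (stability limit),
# so total work grows roughly as k**4.

def relative_cost(f_old_hz: float, f_new_hz: float) -> float:
    """Approximate factor by which compute cost grows when the maximum
    simulated frequency rises from f_old_hz to f_new_hz over a fixed domain."""
    k = f_new_hz / f_old_hz
    return k ** 4

if __name__ == "__main__":
    print(relative_cost(0.5, 1.0))  # TeraShake -> ShakeOut: ~16x
    print(relative_cost(0.5, 2.0))  # TeraShake -> M8: ~256x, before the larger
                                    # domain and longer duration are counted
```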
Specifically, the latest simulation is the largest yet in terms of the duration of the temblor (six minutes) and the geographical area covered: a rectangular volume approximately 500 miles (810 km) long by 250 miles (405 km) wide by 50 miles (85 km) deep.
The team's latest research also set a new record in the number of computer processor cores used, with more than 223,000 cores running within a single 24-hour period on the Jaguar Cray XT5 supercomputer at Oak Ridge National Laboratory in Tennessee. By comparison, a previous TeraShake simulation in 2004 used only 240 cores over a four-day period.
Additionally, the new simulation used a record 436 billion mesh points, or grid points, to calculate the potential effects of such an earthquake, versus only 1.8 billion mesh points in the TeraShake simulations done in 2004.
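As a rough consistency check, and purely as an illustration (a uniform cubic mesh is an assumption here, and the resulting spacing is not a figure quoted by the researchers), the volume and grid-point count above imply a point spacing of roughly 40 meters:

```python
# Back-of-the-envelope estimate of the grid spacing implied by the numbers
# quoted in the article, assuming a uniform cubic mesh filling the domain.

volume_m3 = (810e3) * (405e3) * (85e3)   # ~810 km x 405 km x 85 km, in meters
points = 436e9                           # grid points quoted for the M8 run

spacing_m = (volume_m3 / points) ** (1.0 / 3.0)
print(f"implied grid spacing: ~{spacing_m:.0f} m")  # roughly 40 m
```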
Earthquake simulations can be used to evaluate earthquake early-warning planning systems, and help engineers, emergency response teams and geophysicists better understand seismic hazards not just in California, but around the world.
"Petascale simulations such as this one are needed to understand the rupture and wave dynamics of the largest earthquakes at shaking frequencies required to engineer safe structures," said Thomas Jordan, director of the earthquake center and principal investigator for the project. "Frankly, we were at the very limits of these new capabilities for research of this type."
In addition to Cui, Olsen, Jordan and Maechling, other researchers on the "Scalable Earthquake Simulation on Petascale Supercomputers" project include:
- Amit Chourasia, Kwangyoon Lee and Jun Zhou from SDSC
- Daniel Roten and Steven M. Day from SDSU
- Geoffrey Ely and Patrick Small from USC
- D.K. Panda from OSU
- John Levesque from Cray Inc.