Community Modeling Environment: SCEC To Develop Online Collaborative Laboratory for Studying Earthquakes
The five-year project will give scientists the ability to improve computer models of how the Earth is structured and how the ground moves during earthquakes. The project team includes collaborating researchers from SCEC, the Information Sciences Institute (ISI) at USC, the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the USGS. These Earth scientists and computer scientists will create an online collaborative laboratory, or "collaboratory," that will allow scientists from across the country to conduct science together far more effectively than is currently possible. This facility will be called the "SCEC Community Modeling Environment."
The collaboratory will provide the resources and tools needed to improve estimates of how the earth will shake during particular earthquakes, information that is essential for designing better buildings. Scientists can already predict quite well the slow, rolling motion that would be experienced during possible earthquakes, but predicting the fast, violent shaking that causes the most damage is thousands of times more complicated and time-consuming. "We have lots of different types of data that we attempt to synthesize into an understanding of earthquake processes," said Dr. Thomas Jordan, director of SCEC. "But each scientist typically works on one small aspect of the problem. In order to come up with a comprehensive and integrative understanding of earthquakes, we have to be able to put all of this together. The collaboratory will combine three primary information technologies to organize the science of studying earthquakes, and allow many scientists to be involved."
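The "thousands of times" comparison reflects a standard scaling argument for three-dimensional wave-propagation simulations that the article does not spell out: resolving higher-frequency shaking requires a finer grid in all three spatial dimensions plus a shorter time step, so computational cost grows roughly as the fourth power of the highest frequency simulated. A minimal sketch of that arithmetic, using illustrative frequencies rather than figures from the project:

```python
# Illustrative scaling only; these numbers are not from the article.
# Rule of thumb for 3-D wave-propagation codes: doubling the highest resolved
# frequency doubles the grid resolution in x, y, and z and halves the time
# step, so cost scales roughly as frequency to the fourth power.

def relative_cost(f_target_hz: float, f_reference_hz: float) -> float:
    """Cost of resolving f_target_hz relative to a run resolving f_reference_hz."""
    return (f_target_hz / f_reference_hz) ** 4

# Long-period "rolling" motion is commonly simulated to roughly 0.5 Hz, while
# the damaging high-frequency shaking extends to several Hz.
print(relative_cost(5.0, 0.5))  # 10000.0 -- i.e. "thousands of times" more work
```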
One component is "digital library" technology, which will allow scientists to organize and retrieve information stored throughout the country. This requires new tools to access existing data collections and simulation programs, as well as the ability to incorporate new collections of data generated by the simulations. Another component is "grid" technology, which allows calculations to take advantage of the processing power of a network of many computers. Many computer-based earthquake simulations take several hours even on a high-speed computer, and scientists often must wait days or weeks for an allotted time to run their programs. Using grid technology, scientists will design simulations that are automatically sent to the best available computer.
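As a rough illustration of the grid idea, the sketch below routes a simulation request to whichever registered machine currently has the most idle processors. The resource names, the bookkeeping, and the "most free processors" rule are all assumptions made for illustration; real grid middleware, including the project's actual tools, schedules work in far more sophisticated ways.

```python
from dataclasses import dataclass

@dataclass
class ComputeResource:
    """A machine registered with a hypothetical grid scheduler."""
    name: str
    total_cpus: int
    busy_cpus: int

    @property
    def free_cpus(self) -> int:
        return self.total_cpus - self.busy_cpus

def submit_simulation(job_name: str, pool: list) -> str:
    """Send the job to whichever resource has the most idle processors."""
    best = max(pool, key=lambda r: r.free_cpus)
    best.busy_cpus += 1  # stand-in for an actual remote job submission
    return f"{job_name} dispatched to {best.name}"

# Hypothetical resource pool; names and sizes are made up for illustration.
pool = [
    ComputeResource("campus-cluster", 256, 250),
    ComputeResource("supercomputer-center", 1024, 600),
]
print(submit_simulation("wave-propagation-run-01", pool))  # -> supercomputer-center
```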
The third component of the collaboratory is called "knowledge representation and reasoning," which will allow computers to automate the processing of earth science data and make it easier for scientists to work together on complex calculations. This requires developing a common way to describe data, computer programs, and results, and how these components relate to each other. "In many cases these are highly developed technologies, but we will push them into new arenas," added Jordan. "This project was funded not just because it is a good thing to do in terms of applying information technology to earthquakes, but because the problems in information technology that need to be solved to develop the collaboratory are themselves important."

"Forty years ago seismology was one of the primary drivers in the development of high-performance computing," explained Dr. Bernard Minster, science director of SCEC and director of the Institute of Geophysics and Planetary Physics of the University of California. Quantitative earthquake science, Minster added, again raises a major ITR challenge: from rock physics and fault rupture, to regional wave-propagation simulations, to regional crustal deformation over many earthquake cycles. All of these problems are coupled across many orders of magnitude in spatial and temporal scales. The San Diego Supercomputer Center at the University of California, San Diego will collaborate in developing the data and knowledge management tools that support this research, in an effort led by Reagan Moore of SDSC's Data-Intensive Computing Environments (DICE) group.
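One way to picture a "common way to describe data, computer programs, and results" is a small provenance record that links a simulation run to the datasets it consumed and produced, so that software can reason about those relationships automatically. The classes and field names below are hypothetical, chosen only to illustrate the idea, and are not the project's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Artifact:
    """A described piece of data, e.g. a velocity model or a synthetic seismogram."""
    name: str
    kind: str                          # e.g. "velocity_model", "seismogram"
    produced_by: Optional[str] = None  # program run that created it, if any

@dataclass
class ProgramRun:
    """A simulation or processing step, linked to what it consumed and produced."""
    program: str
    inputs: List[Artifact] = field(default_factory=list)
    outputs: List[Artifact] = field(default_factory=list)

# Hypothetical example: record that a seismogram depends on a particular
# velocity model, so other tools can discover and reuse that relationship.
velocity = Artifact("SoCal-velocity-v1", "velocity_model")
run = ProgramRun("wave_propagation_sim", inputs=[velocity])
run.outputs.append(Artifact("basin-seismogram-001", "seismogram", produced_by=run.program))
print(run.outputs[0].produced_by)  # wave_propagation_sim
```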
The SCEC project is one of 309 awards that will receive more than $156 million from NSF's Information Technology Research (ITR) priority area, which spurs fundamental research and innovative uses of information technology in science and engineering. The SCEC activity is one of eight large projects that will each total between $5.5 million and $13.75 million over five years. "Information technology has already transformed our daily lives, yet its most significant impact so far may be in science and engineering," said Gary Strong of the National Science Foundation. "These new NSF projects show that information technology is enabling new types of fundamental research not previously feasible, by helping to gather and make sense of a data avalanche that will solve countless mysteries about the world around us."