According to a recently published study by U.S. Geological Survey scientists, the USGS real-time earthquake web pages also quake with activity following a seismic event. Look at these numbers: 3.6 billion total data requests, including 29 million page views by 7.1 million users, 606 million automated data feed requests and 45 million catalog downloads. And that’s just in one month.
Because immediate information on earthquakes is prized by people who live in seismically active areas—and is needed by emergency responders and civic officials to make decisions on public safety—the USGS is committed to delivering that information and to continuously improving its systems to meet this rapidly growing demand. A small crew of USGS scientists and information technology specialists in Golden, Colorado, is responsible for design, testing, development, operations and monitoring of these mission-critical systems.
“The USGS Earthquake Hazards Program is at the forefront of delivering timely and accurate earthquake information from the regional seismic networks participating in the Advanced National Seismic System and the National Earthquake Information Center to critical users and the general public,” said Dr. Gavin Hayes, USGS Senior Science Advisor for Earthquakes and Geologic Hazards. “Over the past decade, web applications have become the primary mechanism to distribute this information. The unpredictable nature of events, immediate surge of significant internet traffic and an ever-changing technology landscape make this a challenging task.”

After the M 6.8 Nisqually, Washington, earthquake of 2001, which caused injuries and significant damage in the Seattle-Tacoma urban area, it became apparent that the demand for digital information had outgrown the service that a basic web server could provide. Since then, there have been numerous changes to USGS information technology architecture, including commercial caching services to offload peak traffic, redundant systems and more complex architectures to try to ensure continuous availability. Within these systems, many of the low-level configurations had to be fine-tuned for the specific pattern of traffic associated with seismic events. The increased demand, combined with continual software, security and hardware changes, makes this an ongoing challenge. The development team has increasingly been using the USGS Cloud Hosting Solutions (CHS) environment to meet the demand.
Not surprisingly, USGS IT systems experienced some technical bumps along the way, including during the 2019 Ridgecrest, California, earthquakes, when the sudden, sharp influx of external traffic slowed the delivery of earthquake products. Developing new delivery methods for earthquake information will further improve response to increasing demand. For example, a large portion of baseline traffic comes from users repeatedly requesting data feeds to check whether new earthquakes have occurred. Solutions that notify users of updates, as well as methods to push new events to users, should decrease overall system demand and lower system operating costs.
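The polling pattern described above can be sketched in a few lines. This is an illustrative toy model, not USGS code: the function names, the one-minute polling interval and the update times are assumptions. It shows why repeated feed requests dominate baseline traffic, and how HTTP conditional requests (an `If-None-Match` header answered with `304 Not Modified` when the feed's ETag is unchanged) can shrink the cost of each redundant poll, even before push-style delivery is available.

```python
# Toy model (not USGS code): one client watching an earthquake feed
# for a 60-minute window, comparing naive polling with conditional
# polling based on an ETag.

def naive_poll(minutes, interval):
    """Naive polling: every poll is a full feed download."""
    return minutes // interval  # number of full responses served

def conditional_poll(minutes, interval, update_minutes):
    """Conditional polling: the client sends If-None-Match with its
    cached ETag; the server returns 304 Not Modified (headers only)
    unless the catalog changed since the last poll."""
    full = not_modified = 0
    last_etag = "unset"  # sentinel: nothing cached yet
    for t in range(0, minutes, interval):
        # The ETag reflects the most recent catalog update at time t.
        etag = max((u for u in update_minutes if u <= t), default=None)
        if etag == last_etag:
            not_modified += 1   # 304 Not Modified: headers only
        else:
            full += 1           # 200 OK: full feed body
            last_etag = etag
    return full, not_modified
```

In this toy window, a client polling once a minute for an hour issues 60 requests; if the catalog was updated only twice, a conditional poll serves almost all of them as lightweight 304 responses, and a true push channel would eliminate the redundant requests entirely.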
Photo of the lean team of six, from left to right: Lynda Lastowka, USGS Supervisory Geophysicist; Eddie Hunter, USGS IT Specialist; Jeremy Fee, USGS Computer Scientist; Jonathan Brown, USGS IT Specialist; Eric Martinez, USGS Computer Scientist; Eric Jones, USGS Geographer. Photo by James Caple, October 2017.