A grid-based facility for large-scale cross-correlation of continuous seismic data

Authors: John Townend, Yannik Behr, Kevin Buckley, Martha Savage, John Hine, Victoria University of Wellington

Paper number: 3752 (EQC 08/556)

Technical Abstract

The vast volumes (more than 3.5 gigabytes per day) of seismic data being recorded by the GeoNet network of geophysical instruments provide exciting opportunities for studying the New Zealand plate boundary and better characterising earthquake and volcanic hazard sources. Taking full advantage of these data, and of recent developments in seismological data analysis, requires efficient computer algorithms capable of automatically extracting and processing long streams of data.

Geophysical research, as is the case in many other data-intensive fields of science, often involves a large number of incremental processing steps, during each of which various decisions must be made regarding the particular choice of parameters or even algorithms to use. One way of describing a sequence of processing steps is to use a “computational workflow”, a documented chain of modular components representing different stages in the overall analysis. Such a workflow enables researchers to examine the effects of different processing sequences by re-running an analysis using modified input parameters or by replacing portions of the workflow with alternative analytical components.
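
As a schematic illustration (not the project's actual implementation), the Python sketch below expresses a processing sequence as an ordered list of modular stages, each with its own parameters; swapping a stage, or editing its parameters, re-runs the analysis under a different configuration. The stage names and parameter values are placeholders.

```python
import numpy as np

# A minimal workflow sketch: each stage is a (function, parameters) pair.
# The stages and parameters are illustrative, not the project's own.

def detrend(trace):
    # Remove the mean, as a stand-in for more elaborate detrending.
    return trace - trace.mean()

def bandpass(trace, low=0.1, high=1.0, fs=100.0):
    # Crude frequency-domain band-pass filter, for illustration only.
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=trace.size)

def one_bit(trace):
    # One-bit normalisation, widely used before noise cross-correlation.
    return np.sign(trace)

workflow = [
    (detrend, {}),
    (bandpass, {"low": 0.1, "high": 1.0, "fs": 100.0}),
    (one_bit, {}),
]

def run(trace, stages):
    # Apply each stage in turn; replacing an entry in `stages`
    # substitutes an alternative analytical component.
    for func, params in stages:
        trace = func(trace, **params)
    return trace

hour_of_data = np.random.randn(360_000)   # one hour at 100 Hz, synthetic
processed = run(hour_of_data, workflow)
```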

In this project, we have developed a computational workflow describing the analysis of seismic records obtained from the GeoNet seismic data archive or other repositories. We focus on a process known as "cross-correlation", in which two signals, such as seismograms, are compared mathematically to determine their similarity and to make accurate timing measurements. Cross-correlation underpins much of modern seismology, including earthquake detection and location, and it is central to the analysis of the ambient seismic noise field that our workflow is designed to address.
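
To make the timing idea concrete, here is a self-contained sketch (synthetic signals, not GeoNet data; all values are arbitrary) that recovers a known delay between two traces from the peak of their cross-correlation:

```python
import numpy as np

fs = 100.0                                   # sampling rate in Hz (illustrative)
t = np.arange(0.0, 10.0, 1.0 / fs)
wavelet = np.exp(-((t - 3.0) ** 2) / 0.01)   # a narrow synthetic pulse

true_delay = 1.25                            # seconds; trace B lags trace A
trace_a = wavelet
trace_b = np.interp(t - true_delay, t, wavelet)  # delayed copy of the pulse

# The lag at which the cross-correlation peaks is the relative delay.
xc = np.correlate(trace_b, trace_a, mode="full")
lags = (np.arange(xc.size) - (trace_a.size - 1)) / fs
print(f"measured delay: {lags[np.argmax(xc)]:.2f} s")   # prints ~1.25
```
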
More than 95% of the time, seismometers designed to record earthquakes are actually recording continuous, low-pitched noise: the incoherent background hum of the Earth. By cross-correlating long records of seismic noise recorded at two different locations, we can detect a small amount of coherent seismic energy propagating directly between the two sites. This energy propagates as a seismic wave at a speed governed by the physical properties of the rocks it passes through. By measuring this speed, we can map the Earth's deep structure in much the same way that ultrasound is used to look inside the human body.
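
The following toy calculation (synthetic noise standing in for GeoNet recordings; the window length, travel time, and noise level are arbitrary) shows the principle: a weak signal shared by two "sites" is invisible in any single window, but stacking the cross-correlations of many windows makes it emerge at a lag equal to the inter-site travel time.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 100.0, 2000            # 20-second windows at 100 Hz (illustrative)
travel_time = 4.0              # seconds for the wave to travel from B to A
shift = int(travel_time * fs)

stack = np.zeros(2 * n - 1)
for _ in range(100):           # many independent noise windows
    source = rng.standard_normal(n + shift)          # coherent "hum"
    site_b = source[shift:]                          # B records the wave first
    site_a = source[:n]                              # A records it later
    site_a = site_a + 3.0 * rng.standard_normal(n)   # strong local noise at A
    site_b = site_b + 3.0 * rng.standard_normal(n)   # strong local noise at B
    stack += np.correlate(site_a, site_b, mode="full")

lags = (np.arange(stack.size) - (n - 1)) / fs
print(f"stacked peak at lag {lags[np.argmax(np.abs(stack))]:.2f} s")  # ~4.0
```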

The computational workflow we have developed has been configured to run on a grid-computing system consisting of 230 networked desktop computers that can be harnessed for computationally intensive tasks when otherwise idle. Distributing a computation among many separate processors within the grid reduces the time required to analyse large volumes of seismic data for noise-imaging purposes.
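
The grid middleware itself is beyond the scope of this summary, but the idea scales down to a single machine: because each station pair (and each day of data) can be correlated independently, the jobs can be farmed out to whichever processors are free. The sketch below uses Python's standard library, with placeholder station codes and random numbers standing in for archive waveforms.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

STATIONS = ["STA1", "STA2", "STA3", "STA4"]   # placeholder station codes

def correlate_pair(pair):
    # Cross-correlate one window of data for one station pair. Random
    # numbers stand in for waveforms retrieved from a data archive.
    rng = np.random.default_rng()
    xc = np.correlate(rng.standard_normal(2000),
                      rng.standard_normal(2000), mode="full")
    return pair, float(np.abs(xc).max())

if __name__ == "__main__":
    pairs = list(combinations(STATIONS, 2))
    # Every pair is independent, so the jobs parallelise trivially; the
    # grid version distributes the same kind of task across idle desktops.
    with ProcessPoolExecutor() as pool:
        for pair, peak in pool.map(correlate_pair, pairs):
            print(pair, f"peak |xcorr| = {peak:.1f}")
```
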
The key outcomes of this project are as follows:

  1. A two-stage computational workflow enabling continuous seismic waveform data extracted from the GeoNet archive, or elsewhere, to be systematically pre-processed and analysed;
  2. Deployment of this workflow on a grid computer in a way that makes maximal use of the computational power available at any one time;
  3. Development of a web interface that facilitates running the workflow;
  4. Successful application of the workflow to datasets of various scales.