Exploration Geophysics
Journal of the Australian Society of Exploration Geophysicists
RESEARCH ARTICLE

Supercomputers in seismic data processing

M. Stanley and R. Singh

Exploration Geophysics 22(2), 379–382
Published: 1991

Abstract

Today's computers span a wide range of computing capability, from desktop personal computers, through minicomputers and mainframes, to the fastest, most powerful machines designed at the limit of existing electronics engineering technology. It is the last category, machines that perform millions of floating point operations per second (megaflops) at the peak of current technology, that are termed supercomputers. In seismic data processing, extremely large volumes of numerical data are processed through many stages, most of which involve complex mathematical operations. A typical seismic survey today is acquired with 300 recording channels, each recorded to 6 s at a 2 ms sample rate. With this recording configuration, one kilometre of data produces 36 million samples (floating point numbers) for processing. Many surveys today, particularly 3D surveys, involve the recording of up to 10,000 kilometres of data. Current processing techniques have evolved over past decades in step with the electronic hardware available at the time. The latest supercomputing technology allows more complex processing algorithms to be tested on data in realistic time, and provides the increased throughput capacity needed to accommodate today's large multichannel 2D and 3D seismic surveys.
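The per-kilometre figure follows directly from the stated recording parameters. The sketch below is a minimal back-of-envelope check; the 25 m shot interval (40 shot records per kilometre) is an assumption, not stated in the abstract, but it is the spacing consistent with the 36 million figure. The 10,000 km survey total is included for scale.

```python
# Back-of-envelope check of the data volumes quoted in the abstract.
# Assumption: a 25 m shot interval, i.e. 40 shot records per kilometre;
# the abstract gives only channel count, record length and sample rate.

record_length_ms = 6000   # 6 s record length per trace
sample_rate_ms   = 2      # 2 ms sample interval
channels         = 300    # recording channels per shot
shots_per_km     = 40     # assumed 25 m shot spacing (hypothetical)

samples_per_trace  = record_length_ms // sample_rate_ms   # 3,000
samples_per_record = channels * samples_per_trace         # 900,000
samples_per_km     = shots_per_km * samples_per_record    # 36,000,000

survey_km     = 10_000
total_samples = survey_km * samples_per_km                # 3.6e11

print(f"{samples_per_km:,} samples per line-km")
print(f"{total_samples:,} samples for a {survey_km:,} km survey")
```

Assuming 32-bit (4-byte) floating point samples, the 10,000 km total corresponds to roughly 1.4 terabytes of raw trace samples, before the many intermediate processing stages multiply that volume.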

https://doi.org/10.1071/EG991379

© ASEG 1991
