Brief History of Seismology

Reproduced from “Introduction to Seismology” by Peter Shearer (Cambridge University Press) and from USGS materials.

Did you know that around the world…

  • Every day: There are about 50 earthquakes strong enough to be felt locally; several of these produce distant seismic waves that can be measured with sensitive instruments anywhere on the globe.
  • Every few days: There is an earthquake strong enough to damage structures.

Seismology is the scientific study of earthquakes and the seismic waves they generate.

  • Scientific & Practical Objectives of Seismology:
    • To learn about the structure of the earth (direct observation of the deep earth is impossible) and the physics of earthquakes
    • To make the engineered human environment safer
  • Seismology is a young science, only about 150 years old.
  • Before scientific studies began, ideas about earthquakes were largely based on myth and superstition.

1800s

  • The theory of elastic wave propagation in solid materials is developed by Cauchy, Poisson, Stokes, Rayleigh, and others. They describe primary and secondary body waves (P- and S-waves) and surface waves. (Theory is way ahead of observation.)
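
As a brief aside (these are standard textbook relations, not spelled out in the original), the elastic theory predicts the speeds of the two body waves in a homogeneous isotropic solid to be

    \[
      v_P \;=\; \sqrt{\frac{\kappa + \tfrac{4}{3}\mu}{\rho}},
      \qquad
      v_S \;=\; \sqrt{\frac{\mu}{\rho}},
    \]

where $\kappa$ is the bulk modulus, $\mu$ the shear modulus, and $\rho$ the density. Because both moduli are positive, $v_P > v_S$ always holds, which is why P-waves (“primary”) arrive before S-waves (“secondary”).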

1857

  • R. Mallet, an Irish engineer, travels to Italy to study damage caused by an earthquake near Naples. His work is generally considered to be the first serious attempt at observational seismology. His contributions:
    • earthquake waves radiate from a central focus
    • earthquakes can be located by projecting these waves backward toward the source
    • observatories should be established to monitor earthquakes

1875

  • F. Cecchi builds the first time-recording seismograph in Italy.
  • Higher quality instruments are then developed by British scientists in Japan.
  • These early instruments are undamped, and therefore inaccurate after the first few cycles of shaking.

Late 1800s

  • First seismograph in North America is installed at Lick Observatory near San Jose, California. This instrument will later record the 1906 San Francisco earthquake.
  • E. Wiechert develops the first seismometer with viscous damping, capable of producing a useful record for the entire duration of ground shaking.

Early 1900s

  • B. B. Galitzin develops the first electromagnetic seismograph, in which a moving pendulum generates electric current in a coil, and establishes a network of seismic stations across Russia.
  • The new design will prove to be much more accurate and reliable than previous mechanical instruments; all modern seismographs are electromagnetic.

1906

  • H. F. Reid, an American engineer, studies survey lines across the San Andreas fault measured before and after the 1906 San Francisco earthquake. He proposes an “elastic rebound” theory for the origin of earthquakes, where accumulated elastic energy is released suddenly by slip on the fault.

1900-1910

  • Seismograms from many earthquakes recorded at many distances become widely available.
  • R. Oldham identifies P-, S-, and surface waves in earthquake records, and detects the earth’s liquid core from the absence of direct body waves at certain distances.
  • A. Mohorovičić identifies the velocity boundary between the earth’s crust and mantle (the Moho).
  • The first widely used travel-time tables are published by Zöppritz.

1914

  • B. Gutenberg publishes travel-time tables that include core phases (seismic waves that penetrate or reflect from the core), and accurately estimates the depth to the top of the earth’s fluid core (2900 km).

1920s

  • Seismic surveying methods using explosions and other artificial sources are developed in the United States for exploring for oil and other resources in the shallow crust.
  • Noise-reducing trace-stacking methods and the Vibroseis technique are developed later, in the 1950s.

1935

  • C. Richter proposes a magnitude scale for specifying the sizes of earthquakes in southern California. The logarithmic Richter scale allows a huge range of earthquake sizes to be conveniently measured (a sketch of the relation follows this list).
  • Although the original scale is defined for a specific region, distance range, wave type and period, and instrument, the idea is quickly adapted to other cases.
  • The smallest felt earthquakes are about magnitude 3, while rare great earthquakes are magnitude 8-9+.
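
As a rough sketch (the formula below is the standard textbook definition, not quoted in the original), the local magnitude is obtained from the maximum trace amplitude $A$ recorded on a Wood-Anderson seismograph, corrected for epicentral distance $\Delta$:

    \[
      M_L \;=\; \log_{10} A \;-\; \log_{10} A_0(\Delta),
    \]

where $A_0(\Delta)$ is the amplitude of a reference (magnitude 0) event at the same distance. Each unit of magnitude thus corresponds to a tenfold increase in recorded amplitude and, since radiated energy scales roughly as $\log_{10} E \propto 1.5\,M$, to roughly a 32-fold increase in energy.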

1936

  • I. Lehmann discovers the earth’s solid inner core.

1940

  • H. Jeffreys and K. Bullen publish final versions of their travel-time tables for many seismic phases. They are accurate enough to still be in use today.

1950s & 1960s – The Cold War

  • Soviet nuclear tests in the early 1950s generate intense interest by the U.S. military in detection and measurement of nuclear explosions, and funding for government and academic seismology programs surges during the Cold War.
  • The Worldwide Standardized Seismograph Network (WWSSN), consisting of well-calibrated short- and long-period seismographs, is established in 1961. This high-quality dataset will contribute to many advances in seismology.

1966

  • The disadvantages of traditional magnitude measures are widely recognized: saturation, inconsistency between magnitude scales, etc. K. Aki introduces “seismic moment”, a more physics-based measure of earthquake size.
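
For reference (the formulas below are the standard definitions, not given in the original), the scalar seismic moment is

    \[
      M_0 \;=\; \mu \, A \, \bar{D},
    \]

where $\mu$ is the shear modulus of the faulted rock, $A$ the fault area that slipped, and $\bar{D}$ the average slip. Because $M_0$ measures the faulting itself rather than a particular recorded amplitude, it does not saturate for great earthquakes; the moment magnitude later built on it (Hanks and Kanamori, 1979) is $M_W = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right)$ with $M_0$ in N·m.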

1960s

  • The increased number of seismic stations established after ~1900 allowed large earthquakes to be routinely located, leading to the discovery that earthquakes are not randomly located, but rather are concentrated in narrow belts around the globe. The significance of this observation was not appreciated until the plate tectonics revolution of the 1960s. Earthquakes are generated where crustal plates spread apart (e.g., Mid-Atlantic Ridge), are consumed at subduction zones (e.g., Japan, Aleutians), or slide past each other at transform boundaries (e.g., San Andreas fault).
  • Seismologists show that “focal mechanisms” of large earthquakes inferred from spatial patterns of radiated energy are consistent with plate tectonic ideas, helping to validate the theory.
  • Evidence (first presented in 1928 by K. Wadati) of deep earthquakes located along dipping zones of seismicity where crustal plates subduct into the mantle also helps validate plate tectonic theory.
  • Seismologists use records from the great Chilean earthquake of 1960 to study earth’s free oscillations. Studies of normal modes excited by large earthquakes provide powerful new constraints on earth’s internal structure.
  • Application of computers to larger datasets and problems begins in the 1960s:
    • routine earthquake locations
    • inverse problems
    • theoretical seismograms
    • source spectra and scaling; slip distribution on fault
    • normal modes
    • crustal imaging using artificial sources

1970s

  • First digital global seismographs installed.
  • First digital portable seismographs used for special studies (source scaling, site response, etc.).
  • Centralized archives of digital seismic data established.

Earthquake Engineering & Seismology

  • Destructive earthquakes in southern California in 1933 and 1971 lead to the establishment and improvement of seismic provisions in building codes in the USA. Networks of “strong-motion” seismographs are established and expanded. Unlike conventional seismographs, which are designed for maximum sensitivity, strong-motion instruments can record strong shaking close to damaging earthquakes without saturating.
  • A new body of observation and theory addresses the need to estimate damaging (generally high-frequency) ground motions for engineering design.