March 11th, 2012 • 9:03 PM
The Lesson of the Great Japan Earthquake

by Oyang Teng

The death toll from last year's March 11 earthquake and tsunami off the coast of Japan was the highest for any natural disaster in the industrialized world in living memory, and would have been unimaginably worse had Japan not been the most disaster-prepared nation on the planet. A year later, the media continues to focus on the bogeyman of nuclear contamination, while the looming and very real threat of future such megaquakes points to the fundamental question: can earthquakes be predicted?

Despite denials on the part of mainstream seismology, the qualified answer is: yes.

To understand the scientific debate, it is necessary to consider how the pervasive reliance on statistical methods has largely supplanted rigorous physical hypothesizing in science (as in economics, with similarly destructive consequences). At issue is the fact that the process of earthquake generation is still poorly understood.

Earthquakes originate deep beneath the surface and are therefore outside the range of direct observation. The field of seismology has, therefore, come to depend almost exclusively on the study of how stress accumulates along faults in the ground, by measuring minute movements in the crust. Along with historical records, examination of sediments in trenches dug across faults provides a paleoseismic record of past earthquakes, from which expected average rates of motion along a fault are calculated. Seismic hazard assessment maps extrapolate such past trends forward to establish the probability that a given region will experience an earthquake of a certain magnitude within a 30- to 50-year time interval.
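As a rough illustration of the kind of extrapolation involved: time-independent hazard models commonly treat earthquakes on a fault as a Poisson process, converting an average recurrence interval into a probability over a fixed window. The sketch below is a simplified textbook version of that calculation, not the methodology of any particular agency's hazard map; the example numbers are hypothetical.

```python
import math

def poisson_quake_probability(mean_recurrence_years: float,
                              window_years: float) -> float:
    """Probability of at least one earthquake in the window,
    assuming events follow a Poisson process with the given
    mean recurrence interval (a common simplifying assumption
    in time-independent hazard models)."""
    rate = 1.0 / mean_recurrence_years          # events per year
    return 1.0 - math.exp(-rate * window_years)

# Hypothetical fault with an average recurrence of 150 years,
# evaluated over a 30-year hazard window.
p = poisson_quake_probability(150.0, 30.0)
print(f"{p:.1%}")  # roughly 18%
```

Note that such a model is "memoryless": the probability it assigns is the same regardless of how long ago the last earthquake occurred, which is one reason these maps say nothing about short-term prediction.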

Not only are such methods useless for short-term prediction, but they have failed even within the broad terms set out by the hazard maps: the Japan quake, for example, occurred in a region considered relatively low-hazard. This has led some, such as the University of Tokyo's Robert Geller, to declare that earthquakes are inherently unpredictable. As was done in the field of quantum physics by the pro-irrationalist Copenhagen school in the 1920s, the shortcomings of a particular method of scientific investigation are used to claim that the process under study is inherently random (and, therefore, unknowable), its behavior only susceptible to a broad statistical description.

This ignores the fact that there is strong evidence of cyclicity in the appearance of certain earthquakes, on timescales varying from as short as the 11-year solar cycle, to as long as the roughly 60-million-year cycles of volcanic and seismic activity evident in the geological record. More importantly, there is indisputable evidence (as presented in LPAC videos over the course of the year) that the complex process of earthquake preparation involves a host of measurable precursor phenomena.

To take the case of the Japan quake, a number of studies have shown that, in retrospect, clear precursor signals appeared in the atmosphere and ionosphere in the days and hours before the main shock struck in the subduction zone off the country's northeastern coast. These included: a sudden decrease in the height of the ionosphere over the future epicenter, some five days before the quake, measured by the transmission and reception of very low frequency radio signals through the ionosphere; satellite-detected anomalous infrared emissions in the atmosphere above the future epicenter beginning three days before; and a sudden increase in the total electron content of the ionosphere over the future epicenter beginning about one hour before, as measured by GPS satellites. (For a more detailed treatment, see the feature article on earthquake prediction in the upcoming issue of 21st Century Science & Technology magazine.)

The key to precursor studies has been a multi-parameter approach; that is, not only different measurements of the same parameter (such as ground- and satellite-based measurements of the electron density of the ionosphere), but simultaneous measurement of different signals from the ground, atmosphere, and ionosphere. Given our current lack of direct observation of deep-earth processes, these can serve as guideposts for understanding the underlying physical processes involved in earthquake formation and triggering.
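The logic of the multi-parameter approach can be sketched in a few lines: treat each measurement channel independently, flag a reading that deviates sharply from its own baseline, and raise a candidate alert only when several independent channels are anomalous at once. The data, thresholds, and channel names below are purely illustrative assumptions, not any published detection scheme.

```python
from statistics import mean, stdev

def is_anomalous(baseline, latest, z_threshold=3.0):
    """True if the latest reading deviates from the baseline
    series by more than z_threshold standard deviations."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False
    return abs(latest - mu) / sigma > z_threshold

def multi_parameter_alert(channels, min_agreeing=2):
    """Raise a candidate alert only when at least min_agreeing
    independent channels (e.g. ionospheric electron content,
    infrared emission, VLF propagation) are anomalous at once."""
    flags = sum(is_anomalous(baseline, latest)
                for baseline, latest in channels.values())
    return flags >= min_agreeing

# Hypothetical readings: (baseline history, latest value)
channels = {
    "ionosphere_TEC": ([10.1, 9.8, 10.3, 10.0, 9.9], 14.5),
    "infrared_flux":  ([1.02, 0.98, 1.01, 1.00, 0.99], 1.25),
    "vlf_amplitude":  ([5.0, 5.1, 4.9, 5.0, 5.1], 5.05),
}
print(multi_parameter_alert(channels))  # True: two channels agree
```

Requiring agreement across independent channels is what suppresses the false alarms that any single noisy measurement would generate on its own.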

In the meantime, hindcasts like those performed for the Japan quake have proven remarkably successful for a number of medium and large earthquakes studied with the multi-parameter approach. However, there has been very little funding for an expanded and integrated "sensor web" for precursor monitoring, or for scientists involved in such work to collaborate on real-time prediction. A notable exception is China, which has launched an ambitious ground- and satellite-based precursor monitoring program.

In the United States, the Obama Administration has led the charge in cutting funding for new earth-monitoring satellites, as well as for agencies tasked with disaster preparation. Meanwhile, scientists have been warning that the Pacific Northwest would suffer even greater damage than Japan did if a megaquake struck the Cascadia subduction zone.

From the standpoint of policy, the tragedy of March 11, 2011 has so far been a catalyst for such anti-scientific measures as the takedown of nuclear power. Instead, let it be the catalyst for a new science of earthquake prediction.



The Basement Project began in 2006 as a core team of individuals tasked with the study of Kepler's New Astronomy, laying the scientific foundations for an expanded study of the LaRouche-Riemann Science of Physical Economics. Now, that team has expanded both in number and in areas of research, probing various elements and aspects of the Science of Physical Economy, and delivering in-depth reports, videos, and writings for the shaping of economic policy.