The moment magnitude scale (MMS; denoted as M_{w} or M) is used by seismologists to measure the size of earthquakes.^{[1]}
The scale was developed in the 1970s to succeed the 1930s-era Richter magnitude scale (M_{L}). Even though the formulas are different, the new scale retains a continuum of magnitude values similar to that defined by the older one. Under suitable assumptions, as with the Richter magnitude scale, an increase of one step on this logarithmic scale corresponds to a 10^{1.5} (about 32) times increase in the amount of energy released, and an increase of two steps corresponds to a 10^{3} (1,000) times increase in energy. Thus, an earthquake of M_{w} of 7.0 releases about 32 times as much energy as one of 6.0 and nearly 1,000 times that of 5.0.
The moment magnitude is based on the seismic moment of the earthquake, which is equal to the shear modulus of the rock near the fault multiplied by the average amount of slip on the fault and the size of the area that slipped.^{[2]}
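This definition of seismic moment can be sketched in a few lines of code (the function name and the numerical values below are illustrative, not from any seismological library):

```python
def seismic_moment(shear_modulus_pa: float, area_m2: float, slip_m: float) -> float:
    """Seismic moment M0 = mu * A * D, in newton-metres (N.m):
    shear modulus of the rock near the fault, times the area that
    slipped, times the average slip on the fault."""
    return shear_modulus_pa * area_m2 * slip_m

# Illustrative values: rigidity 30 GPa, a 50 km x 20 km rupture, 2 m average slip.
m0 = seismic_moment(30e9, 50_000 * 20_000, 2.0)
print(f"{m0:.1e} N.m")  # 6.0e+19 N.m
```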
Since January 2002, the MMS has been the scale used by the United States Geological Survey to calculate and report magnitudes for all modern large earthquakes.^{[3]}
Popular press reports of earthquake magnitude usually fail to distinguish between magnitude scales, and are often reported as "Richter magnitudes" when the reported magnitude is a moment magnitude (or a surface-wave or body-wave magnitude). Because the scales are intended to report the same results within their applicable conditions, the confusion is minor.
In 1935, Charles Richter and Beno Gutenberg developed the local magnitude (M_{L}) scale (popularly known as the Richter scale) with the goal of quantifying medium-sized earthquakes (between magnitude 3.0 and 7.0) in Southern California. This scale was based on the ground motion measured by a particular type of seismometer (a Wood–Anderson seismograph) at a distance of 100 kilometres (62 mi) from the earthquake's epicenter.^{[3]} Because of this, there is an upper limit on the highest measurable magnitude, and all large earthquakes will tend to have a local magnitude of around 7.^{[4]} Further, the magnitude becomes unreliable for measurements taken at a distance of more than about 600 kilometres (370 mi) from the epicenter. Since this M_{L} scale was simple to use and corresponded well with the damage that was observed, it was extremely useful for engineering earthquake-resistant structures, and gained common acceptance.^{[5]}
The Richter scale was not effective for characterizing some classes of quakes. As a result, Beno Gutenberg expanded Richter's work to consider earthquakes detected at distant locations. At such large distances the higher-frequency vibrations are attenuated, and seismic surface waves (Rayleigh and Love waves) are dominated by waves with a period of 20 seconds (which corresponds to a wavelength of about 60 km). These waves were assigned a surface-wave magnitude scale (M_{s}). Gutenberg also combined compressional P-waves and the transverse S-waves (which he termed "body waves") to create a body-wave magnitude scale (m_{b}), measured for periods between 1 and 10 seconds. Ultimately Gutenberg and Richter collaborated to produce a combined scale which was able to estimate the energy released by an earthquake in terms of Gutenberg's surface-wave magnitude scale (M_{s}).^{[5]}
The Richter scale, as modified, was successfully applied to characterize localities. This enabled local building codes to establish standards for buildings which were earthquake resistant. However, a series of quakes were poorly handled by the modified Richter scale. This series of "great earthquakes" included ruptures along faults of up to 1,000 km; examples include the 1957 Andreanof Islands earthquake and the 1960 Chilean quake. The M_{s} scale was unable to characterize these "great earthquakes" accurately.^{[5]}
The difficulties with the use of M_{s} in characterizing such quakes resulted from their size. Great quakes produced 20 s waves such that M_{s} was comparable to that of normal quakes, but they also produced very long-period waves (periods longer than 200 s) which carried large amounts of energy. As a result, use of the modified Richter scale methodology to estimate earthquake energy was deficient at high energies.^{[5]}
The concept of seismic moment was introduced in 1966^{[6]} by Keiiti Aki, a professor of geophysics at the Massachusetts Institute of Technology. He employed elastic dislocation theory to improve understanding of the earthquake mechanism. This theory proposed that the seismologic readings of a quake from long-period seismographs are proportional to the fault area that slips, the average distance that the fault is displaced, and the rigidity of the material adjacent to the fault. However, it took 13 years before the M_{w} scale was designed. The reason for the delay was that the necessary spectra of seismic signals had to be derived by hand at first, which required personal attention to every event. Faster computers than those available in the 1960s were necessary, and seismologists had to develop methods to process earthquake signals automatically. In the mid-1970s Dziewonski^{[7]} started the Harvard Global Centroid Moment Tensor Catalog.^{[8]} After this advance, it was possible to introduce M_{w} and estimate it for large numbers of earthquakes. Hence the moment magnitude scale represented a major step forward in characterizing earthquakes.^{[9]}
Most earthquake magnitude scales suffered from the fact that they only provided a comparison of the amplitude of waves produced at a standard distance and frequency band; it was difficult to relate these magnitudes to a physical property of the earthquake. Gutenberg and Richter suggested that radiated energy E_{s} could be estimated as

log_{10} E_{s} ≈ 4.8 + 1.5 M_{s}
(in joules). Unfortunately, the duration of many very large earthquakes was longer than 20 seconds, the period of the surface waves used in the measurement of M_{s}. This meant that giant earthquakes such as the 1960 Chilean earthquake (M 9.5) were only assigned an M_{s} of 8.2. Caltech seismologist Hiroo Kanamori^{[10]} recognized this deficiency, and he took the simple but important step of defining a magnitude based on estimates of radiated energy, M_{w}, where the "w" stood for work (energy):

M_{w} = 2/3 log_{10} E_{s} − 3.2
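These two relations, the Gutenberg–Richter energy estimate and Kanamori's energy-based magnitude, can be sketched as plain arithmetic (the helper names below are ours, not standard):

```python
import math

def ms_to_energy_joules(ms: float) -> float:
    """Gutenberg-Richter estimate: log10 Es = 4.8 + 1.5 Ms (Es in joules)."""
    return 10 ** (4.8 + 1.5 * ms)

def mw_from_energy(es_joules: float) -> float:
    """Kanamori's energy-based magnitude: Mw = (2/3) log10 Es - 3.2."""
    return (2.0 / 3.0) * math.log10(es_joules) - 3.2

# The two relations are inverses: a magnitude fed through both comes back unchanged.
print(round(mw_from_energy(ms_to_energy_joules(8.2)), 1))  # 8.2
```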
Kanamori recognized that measurement of radiated energy is technically difficult since it involves integration of wave energy over the entire frequency band. To simplify this calculation, he noted that the lowest-frequency parts of the spectrum can often be used to estimate the rest of the spectrum. The lowest-frequency asymptote of a seismic spectrum is characterized by the seismic moment, M_{0}. Using an approximate relation between radiated energy and seismic moment (which assumes stress drop is complete and ignores fracture energy),

E_{s} ≈ M_{0} / (2 × 10^{4})
(where E_{s} is in joules and M_{0} is in N⋅m), Kanamori approximated M_{w} by

M_{w} = (log_{10} M_{0} − 9.1) / 1.5
The formula above made it much easier to estimate the energy-based magnitude M_{w}, but it changed the fundamental nature of the scale into a moment magnitude scale. Caltech seismologist Thomas C. Hanks noted that Kanamori's M_{w} scale was very similar to a relationship between M_{L} and M_{0} that was reported by Thatcher & Hanks (1973):

M_{L} ≈ (log_{10} M_{0}) / 1.5 − 10.5

(with M_{0} in dyne⋅cm).
Hanks & Kanamori (1979) combined their work to define a new magnitude scale based on estimates of seismic moment:

M = (log_{10} M_{0}) / 1.5 − 10.7

(with M_{0} in dyne⋅cm).
Although the formal definition of moment magnitude is given by this paper and is designated by M, it has been common for many authors to refer to M_{w} as moment magnitude. In most of these cases, they are actually referring to moment magnitude M as defined above.
Moment magnitude is now the most common measure of earthquake size for medium to large earthquake magnitudes,^{[11]} but in practice seismic moment, the seismological parameter it is based on, is not measured routinely for smaller quakes. For example, the United States Geological Survey does not use this scale for earthquakes with a magnitude of less than 3.5, which is the great majority of quakes.
Current practice in official earthquake reports is to adopt moment magnitude as the preferred magnitude, i.e. M_{w} is the official magnitude reported whenever it can be computed. Because seismic moment (M_{0}, the quantity needed to compute M_{w}) is not measured if the earthquake is too small, the reported magnitude for earthquakes smaller than M 4 is often Richter's M_{L}.
Popular press reports most often deal with significant earthquakes larger than M ~ 4. For these events, the official magnitude is the moment magnitude M_{w}, not Richter's local magnitude M_{L}.
The symbol for the moment magnitude scale is M_{w}, with the subscript "w" meaning mechanical work accomplished. The moment magnitude M_{w} is a dimensionless value defined by Hiroo Kanamori^{[12]} as

M_{w} = 2/3 log_{10} M_{0} − 10.7
where M_{0} is the seismic moment in dyne⋅cm (10^{−7} N⋅m).^{[1]} The constant values in the equation are chosen to achieve consistency with the magnitude values produced by earlier scales, such as the Local Magnitude and the Surface Wave magnitude.
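A short sketch of this definition (the function name is ours; the Chilean-quake moment used below is an illustrative value near published estimates):

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    """Kanamori's definition: Mw = (2/3) * log10(M0) - 10.7, with M0 in dyne-cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# 1 N.m = 1e7 dyne-cm; a seismic moment of about 2e23 N.m (illustrative,
# near estimates for the 1960 Chilean earthquake) yields Mw ~ 9.5.
print(round(moment_magnitude(2.0e23 * 1e7), 1))  # 9.5
```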
Seismic moment is not a direct measure of energy changes during an earthquake. The relations between seismic moment and the energies involved in an earthquake depend on parameters that have large uncertainties and that may vary between earthquakes. Potential energy is stored in the crust in the form of elastic energy due to built-up stress and gravitational energy.^{[13]} During an earthquake, a portion ΔW of this stored energy is transformed into energy dissipated by frictional weakening and inelastic deformation of rocks, heat, and radiated seismic energy E_{s}.
The potential energy drop caused by an earthquake is approximately related to its seismic moment by

ΔW ≈ (σ̄ / μ) M_{0}

where σ̄ is the average of the absolute shear stresses on the fault before and after the earthquake (e.g. equation 3 of Venkataraman & Kanamori 2004) and μ is the shear modulus of the rocks at the fault. Currently, there is no technology to measure absolute stresses at all depths of interest, nor a method to estimate them accurately, so σ̄ is poorly known. It could vary highly from one earthquake to another. Two earthquakes with identical M_{0} but different σ̄ would have released different ΔW.
The radiated energy caused by an earthquake is approximately related to seismic moment by

E_{s} ≈ η_{R} (Δσ_{s} / 2μ) M_{0}

where η_{R} is the radiated efficiency and Δσ_{s} is the static stress drop, i.e. the difference between shear stresses on the fault before and after the earthquake (e.g. from equation 1 of Venkataraman & Kanamori 2004). These two quantities are far from being constants. For instance, η_{R} depends on rupture speed; it is close to 1 for regular earthquakes but much smaller for slower earthquakes such as tsunami earthquakes and slow earthquakes. Two earthquakes with identical M_{0} but different η_{R} or Δσ_{s} would have radiated different E_{s}.
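This dependence on radiated efficiency can be sketched with illustrative parameter values (all numbers below are hypothetical, chosen only to show the scaling):

```python
def radiated_energy(m0_nm: float, efficiency: float, stress_drop_pa: float,
                    shear_modulus_pa: float = 30e9) -> float:
    """Approximate radiated energy Es ~ eta_R * (delta_sigma / (2*mu)) * M0,
    in joules, with M0 in N.m."""
    return efficiency * (stress_drop_pa / (2.0 * shear_modulus_pa)) * m0_nm

m0 = 1.0e20  # identical seismic moment for both hypothetical events
print(radiated_energy(m0, 1.0, 3e6))  # regular quake, eta_R near 1: ~5e15 J
print(radiated_energy(m0, 0.1, 3e6))  # slow/tsunami quake: ten times less energy
```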
Because E_{s} and M_{0} are fundamentally independent properties of an earthquake source, and since E_{s} can now be computed more directly and robustly than in the 1970s, introducing a separate magnitude associated with radiated energy was warranted. Choy and Boatwright defined in 1995 the energy magnitude^{[14]}

M_{E} = 2/3 log_{10} E_{s} − 3.2

where E_{s} is in J (N⋅m).
Assuming the values of σ̄/μ are the same for all earthquakes, one can consider M_{w} as a measure of the potential energy change ΔW caused by earthquakes. Similarly, if one assumes η_{R} Δσ_{s} / 2μ is the same for all earthquakes, one can consider M_{w} as a measure of the energy E_{s} radiated by earthquakes.
Under these assumptions, the following formula, obtained by solving for M_{0} the equation defining M_{w}, allows one to assess the ratio of energy release (potential or radiated) between two earthquakes of different moment magnitudes, m_{1} and m_{2}:

E_{1} / E_{2} ≈ 10^{1.5 (m_{1} − m_{2})}
As with the Richter scale, an increase of one step on the logarithmic scale of moment magnitude corresponds to a 10^{1.5} ≈ 32 times increase in the amount of energy released, and an increase of two steps corresponds to a 10^{3} = 1000 times increase in energy. Thus, an earthquake of M_{w} of 7.0 contains 1000 times as much energy as one of 5.0 and about 32 times that of 6.0.
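This energy-ratio rule reduces to a one-line calculation (the helper name is ours):

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Energy ratio between quakes of moment magnitudes m1 and m2:
    E1/E2 = 10 ** (1.5 * (m1 - m2))."""
    return 10 ** (1.5 * (m1 - m2))

print(round(energy_ratio(7.0, 6.0)))  # 32   (one magnitude step)
print(round(energy_ratio(7.0, 5.0)))  # 1000 (two magnitude steps)
```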
The energy released by nuclear weapons is traditionally expressed in terms of the energy stored in a kiloton or megaton of the conventional explosive trinitrotoluene (TNT).
A rule-of-thumb equivalence from seismology used in the study of nuclear proliferation asserts that a one-kiloton nuclear explosion creates a seismic signal with a magnitude of approximately 4.0.^{[15]} This in turn leads to the equation^{[16]}

M_{n} = 2/3 log_{10} (m_{TNT} / Mt) + 6

where m_{TNT} is the mass of the explosive TNT that is quoted for comparison (relative to megatons Mt).
Such comparison figures are not very meaningful. As with earthquakes, during an underground explosion of a nuclear weapon, only a small fraction of the total amount of energy released ends up being radiated as seismic waves. Therefore, a seismic efficiency needs to be chosen for the bomb that is being quoted in this comparison. Using the conventional specific energy of TNT (4.184 MJ/kg), the above formula implies that about 0.5% of the bomb's energy is converted into radiated seismic energy E_{s}.^{[17]} For real underground nuclear tests, the actual seismic efficiency achieved varies significantly and depends on the site and design parameters of the test.
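The rule-of-thumb equation can be sketched as follows (function name ours; Mt denotes megatons of TNT):

```python
import math

def magnitude_from_tnt_megatons(m_tnt_mt: float) -> float:
    """Rule of thumb: Mn = (2/3) * log10(m_TNT / 1 Mt) + 6."""
    return (2.0 / 3.0) * math.log10(m_tnt_mt) + 6.0

# A one-kiloton (0.001 Mt) explosion gives a magnitude of approximately 4.0,
# consistent with the equivalence quoted above.
print(round(magnitude_from_tnt_megatons(0.001), 1))  # 4.0
```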
The moment magnitude (M_{w}) scale was introduced to address the shortcomings of the Richter scale (detailed above) while maintaining consistency. Thus, for medium-sized earthquakes, the moment magnitude values should be similar to Richter values; that is, a magnitude 5.0 earthquake will be about a 5.0 on both scales. Unlike other scales, the moment magnitude scale does not saturate at the upper end; there is no upper limit to the possible measurable magnitudes. However, this has the side effect that the scales diverge for smaller earthquakes.^{[1]}
Various ways of determining moment magnitude have been developed, and several subtypes of the M_{w} scale can be used to indicate the basis used.^{[18]}
That original scale has been tweaked through the decades, and nowadays calling it the "Richter scale" is an anachronism. The most common measure is known simply as the moment magnitude scale.