Byerlee's law is an experimentally derived law in physics that gives the stress conditions in the Earth's crust at which fracturing along a geological fault takes place. The relation was determined by the American geophysicist James Byerlee by fitting experimental data to the Mohr–Coulomb failure criterion.[1]

The Mohr–Coulomb criterion expresses the shear stress at which brittle failure occurs inside a material as a linear function of the normal stress:

\[\tau = S_0 + \mu (\sigma_n - P_f)\]

Here \(\tau\) is the shear stress, \(\sigma_n\) the normal stress, and \(\mu\) the coefficient of internal friction. \(S_0\) is the cohesion, or internal strength, of the material. \(P_f\) is the pore fluid pressure inside the rock, which is roughly constant on a small scale and weakens the rock. Byerlee found that in the upper crust the criterion can be simplified to

\[\tau = 0.85\,\sigma_n\]

for normal stresses up to 200 MPa; and

\[\tau = 50 + 0.6\,\sigma_n\]

for normal stresses higher than 200 MPa. In both formulas the shear and normal stresses are given in MPa.
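
As an illustrative check, using only the numbers above, the two expressions agree at the 200 MPa crossover:

\[\tau = 0.85 \times 200 = 170\ \mathrm{MPa} \qquad \text{and} \qquad \tau = 50 + 0.6 \times 200 = 170\ \mathrm{MPa},\]

so the simplified law is continuous where the slope changes.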

However, the crust is far from a homogeneous material and consists of many rock types, so material constants can vary locally. Even though Byerlee's law is a simplification, it is a good approximation for most situations. It becomes less accurate when pressures and temperatures exceed those normal in the upper crust (e.g. temperatures over 400 °C).

References

  1. Byerlee, James (1978). "Friction of Rocks". Pure and Applied Geophysics 116 (4–5): 615–626.