Buildings hit by an earthquake

Perhaps the most challenging aspect of earthquakes is the great uncertainty about where and when they will strike, which makes it difficult to take actions that could save lives.

Earthquakes redistribute mass within the Earth, and this generates observable changes in the Earth’s gravitational field that can be measured using specialised instruments. High-accuracy gravity measurements may therefore provide a useful tool for managing the risk by identifying which faults are under stress and most likely to be active. By monitoring the progression of fault movements, it is possible to get a medium-term outlook on which areas are most likely to be affected when the next earthquake happens.
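
To give a sense of scale, here is a minimal back-of-the-envelope sketch (in Python) of the kind of estimate involved: a redistributed mass is treated as a single point mass at depth, and Newton’s law gives the resulting gravity change at the surface. The mass and depth used here are purely illustrative, not values taken from any real fault.

```python
# Back-of-the-envelope sketch (not the Hub's method): the gravity change at the
# surface from a redistributed mass, treated as a single point mass at depth r,
# follows Newton's law: delta_g = G * delta_M / r**2.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_change_microgal(delta_mass_kg: float, depth_m: float) -> float:
    """Vertical gravity change (microGal) directly above a point-mass change."""
    delta_g_ms2 = G * delta_mass_kg / depth_m ** 2
    return delta_g_ms2 * 1e8  # 1 m/s^2 = 1e8 microGal

# Purely illustrative numbers: ~1e9 kg of mass shifted at 1 km depth
print(f"{gravity_change_microgal(1e9, 1000.0):.1f} microGal")  # ~6.7 microGal
```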

Perhaps even more impressively, it is possible to detect an instantaneous gravity signal, caused by the shifting of mass in the ground, which arrives before the first seismic waves. Such an approach has the potential to give earlier warnings by overcoming the limitation imposed by the propagation speed of seismic waves. Current earthquake warning systems are based on networks of accelerometers and seismometers which detect the early-arriving “P” seismic waves ahead of the more destructive shear waves, but they cannot respond before the ground movements have already started, which greatly limits how much advance warning they can give.
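
As a rough illustration of why this matters, the sketch below compares the two propagation speeds: the gravity perturbation travels at effectively the speed of light, while P-waves cross the crust at a few kilometres per second (the 7 km/s figure is an assumed typical value, and detection and processing delays are ignored), so the maximum extra lead time at a given distance is roughly the P-wave travel time to that point.

```python
# Illustration only: the prompt gravity signal propagates at effectively the
# speed of light, while P-waves cross the crust at a few km/s (7 km/s is an
# assumed typical value). The maximum extra lead time at a given distance is
# therefore roughly the P-wave travel time; detection delays are ignored here.
P_WAVE_SPEED_KM_S = 7.0  # assumed crustal P-wave speed

def max_extra_lead_time_s(distance_km: float) -> float:
    """Upper bound on the extra warning time from a prompt gravity signal."""
    return distance_km / P_WAVE_SPEED_KM_S

for d_km in (50, 100, 200):
    print(f"{d_km:>4} km from the source: up to {max_extra_lead_time_s(d_km):.0f} s of extra warning")
```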

While an alert triggered by a gravity signal might only give a few additional seconds, such a warning can provide extra time for the public to take preventative action, such as ensuring fire station doors are opened before they can be disabled. It could also allow the shutdown of public transit to avoid derailment, as well as power, gas and other networks which may be damaged or pose secondary risks such as fire, saving lives.

An alert triggered by a gravity signal might only give a few additional seconds, but such a warning can provide extra time to allow the public to take preventative action.

Dr Daniel Boddice, Department of Civil Engineering

The signals relevant to these applications may cause only tiny changes in gravitational acceleration, ranging from a few microGal down to a fraction of a microGal (1 Gal = 1 centimetre per second squared; Earth’s average surface gravity is around 981,000,000 microGal). Making measurements of gravity with sufficient accuracy, long-term stability and data sampling rate to detect these signals is of course challenging. Detection of small signals, such as the transient gravity signals needed for early warning systems, is greatly limited by the background seismic noise which affects the sensor readings. Obtaining measurements which are stable over long periods of time and consistent between different sensors is difficult too, due to mechanical wear and drift as well as differing manufacturing tolerances, for example variations in the proof masses or differences in the stiffness and frequency response of the springs in spring-based gravimeters.
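
A quick bit of arithmetic, using only the units quoted above, shows how demanding this is: a one-microGal signal is roughly a part-per-billion change in Earth’s surface gravity.

```python
# Restating the units quoted above: 1 Gal = 1 cm/s^2, so 1 microGal = 1e-8 m/s^2,
# against a total surface gravity of roughly 9.81 m/s^2.
EARTH_G_MS2 = 9.81        # nominal surface gravity, m/s^2
MICROGAL_IN_MS2 = 1e-8    # 1 microGal expressed in m/s^2

signal_microgal = 1.0     # a 1 microGal target signal
fraction = signal_microgal * MICROGAL_IN_MS2 / EARTH_G_MS2
print(f"1 microGal is about {fraction:.1e} of Earth's gravity")  # ~1e-9, a part per billion
```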

The Quantum Technology Hub for Sensors and Timing, led by the University of Birmingham, is developing sensors which meet these challenges by exploiting the quantum behaviour of cold atoms to measure gravity accurately. As well as being extremely sensitive to small changes in the gravitational field, cold atom sensors use atoms whose identical nature makes them ideal test masses, giving consistent results when dropped. This makes measurements repeatable, allowing time-lapse surveys that track changes in the environment, such as for fault monitoring, and it makes measurements from different instruments comparable, allowing a monitoring network of sensors to be deployed.
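
For readers curious about how a cold atom sensor turns atomic behaviour into a gravity reading, the sketch below uses the textbook three-pulse atom interferometer relation, in which the measured phase scales as k_eff * g * T^2; the laser wavelength and pulse separation are illustrative assumptions, not the Hub’s actual instrument parameters.

```python
# Textbook three-pulse (Mach-Zehnder) atom interferometer relation, not
# necessarily the Hub's exact configuration: the interferometer phase scales as
#   delta_phi = k_eff * g * T**2
# so g is read out from the measured phase. Wavelength and pulse separation
# below are illustrative assumptions (rubidium-87 Raman beams, T = 100 ms).
import math

WAVELENGTH_M = 780e-9                     # assumed Raman laser wavelength
K_EFF = 2 * (2 * math.pi / WAVELENGTH_M)  # effective wavevector for counter-propagating beams, rad/m
T_S = 0.1                                 # assumed pulse separation time, s

def phase_from_g(g_ms2: float) -> float:
    """Interferometer phase (rad) produced by a gravitational acceleration g."""
    return K_EFF * g_ms2 * T_S ** 2

def g_from_phase(phi_rad: float) -> float:
    """Invert the relation to recover g (m/s^2) from a measured phase."""
    return phi_rad / (K_EFF * T_S ** 2)

phi = phase_from_g(9.81)
print(f"total phase: {phi:.2e} rad")
print(f"1 mrad of phase resolution ~ {g_from_phase(1e-3) * 1e8:.2f} microGal")
```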

Furthermore, by using a common laser to measure two vertically separated atom clouds simultaneously, giving a gravity gradient, common seismic noise can be suppressed. This helps to overcome the seismic-noise limitation that affects traditional gravimeters and allows fast, accurate measurements. It has tremendous potential to improve both the temporal sampling rate for monitoring purposes and the number of points that can be collected during a survey, improving resolution and spatial coverage.
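
The sketch below illustrates this common-mode rejection idea with synthetic numbers (not real instrument data): vibration of the shared reference adds the same spurious acceleration to both atom clouds, so it cancels when the two readings are differenced, leaving the vertical gravity gradient.

```python
# Toy illustration (synthetic numbers, not real instrument data) of why a
# gradiometer suppresses seismic noise: vibration of the shared reference adds
# the SAME spurious acceleration to both atom clouds, so it cancels when the
# two readings are differenced, leaving the vertical gravity gradient.
import random
import statistics

SEPARATION_M = 1.0            # assumed vertical separation of the two atom clouds
TRUE_GRADIENT_S2 = 3.086e-6   # nominal free-air gradient (~3086 Eotvos)
G_LOWER_MS2 = 9.81            # nominal gravity at the lower cloud

readings_lower, readings_upper = [], []
for _ in range(1000):
    vibration = random.gauss(0.0, 1e-6)  # common seismic noise, m/s^2 (~100 microGal RMS)
    readings_lower.append(G_LOWER_MS2 + vibration)
    readings_upper.append(G_LOWER_MS2 - TRUE_GRADIENT_S2 * SEPARATION_M + vibration)

# A single sensor scatters at the level of the injected noise...
print(f"single-sensor scatter: ~{statistics.pstdev(readings_lower) * 1e8:.0f} microGal RMS")

# ...but the differenced (gradient) channel recovers the true gradient, because
# the common vibration term cancels in the subtraction.
gradients = [(lo - up) / SEPARATION_M for lo, up in zip(readings_lower, readings_upper)]
print(f"recovered gradient: {statistics.mean(gradients):.3e} s^-2 (true value {TRUE_GRADIENT_S2:.3e})")
```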