A group of Dutch researchers has demonstrated a technique for measuring rainfall by examining disruption to mobile phone signals. As well as offering a potential way to gather more rainfall measurements in the future, the study also reminds us of one of the downsides of mobile networks: they can be a bit rubbish during a downpour.
The study, published in PNAS, measured available mobile signal power every 15 minutes over 12 days across 2400 links on a Dutch mobile network. That data was used to estimate rainfall figures, which were then compared to the data available from rain gauges and radar detection systems. The result? A close correspondence. While information from gauges was still more accurate at a highly regional level, it’s a useful potential addition in areas where rain gauges aren’t common.
Whatever future possibilities that technique holds, the study also reminds us of one of the inherent limitations of mobile networks, namely that rainfall affects the available signal:
Power losses along links are measured and stored by cellular communication companies to monitor the stability of their link networks. At the used radio frequencies, these losses are, apart from free space losses, mainly the result of attenuation by rainfall. Raindrops absorb part of the incident wave and, in addition, scatter some of the energy out of the beam.
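The idea described in that passage can be sketched in a few lines of code: rain-induced attenuation along a microwave link approximately follows a power law in rain rate (the relationship standardised in ITU-R P.838), so a measured power loss can be inverted to a rain-rate estimate. This is a minimal illustrative sketch only; the function name, the example link figures and the coefficients `a` and `b` are assumptions for illustration, not values taken from the PNAS study.

```python
# Illustrative sketch: specific attenuation in rain roughly follows
# k = a * R**b (dB/km, with R in mm/h), so a link's excess path loss
# during rain can be inverted to estimate the rain rate.
# The coefficients and link figures below are assumptions for
# illustration, not values from the study.

def rain_rate_from_attenuation(path_loss_db, baseline_loss_db,
                               link_length_km, a=0.32, b=1.0):
    """Estimate rain rate (mm/h) from excess loss on a microwave link.

    path_loss_db:     total measured loss during rain (dB)
    baseline_loss_db: dry-weather loss for the same link (dB)
    link_length_km:   path length of the link (km)
    a, b:             power-law coefficients (depend on frequency
                      and polarisation; these defaults are illustrative)
    """
    # Excess loss attributable to rain; clamp at zero for dry readings
    rain_loss_db = max(path_loss_db - baseline_loss_db, 0.0)
    # Convert to specific attenuation (dB per km), then invert k = a * R**b
    specific_attenuation = rain_loss_db / link_length_km
    return (specific_attenuation / a) ** (1.0 / b)

# Example: a hypothetical 3 km link losing 4.8 dB more than its dry baseline
print(round(rain_rate_from_attenuation(104.8, 100.0, 3.0), 1))
```

With these made-up numbers the link's 1.6 dB/km of excess attenuation maps to roughly 5 mm/h of rain. The real study has to do considerably more work than this, of course: separating wet-antenna effects from rain along the path, and interpolating thousands of such point estimates into a rainfall map.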
Bear that in mind the next time someone echoes Alan Jones and suggests that mobile networks are the broadband of the future (though the need for backhaul and the inherent speed declines as more users jump on should be argument enough in that context). Mobile networks are very handy and massively popular, but they won’t solve everything.