If you’ve ever been to an intensive care unit, you may have noticed that it is full of monitors. Each one is critical in assessing a patient’s health, from the electrical conduction of their heart to the oxygen concentration of their blood. What few people ask, though they should, is: what happens to all the data these monitors produce?
In the past, there was nothing to do with it. Massive hard drives didn’t exist; the cloud didn’t exist. But now that technology allows us to store huge amounts of data, this information should no longer be thoughtlessly discarded. Yet it is still being wasted, and wasted data is a wasted opportunity. Proper use of this information can teach monitoring systems the dynamics associated with patient deterioration, alerting the attending physician when it is detected. That represents a considerable coup in the fight to free up the time of hospital staff while maintaining services for patients.
More data means better care
As far back as 1981, physicians and data scientists at George Washington University published the Acute Physiology Score. This was a severity of illness score aimed at the general intensive care unit population. The hope was that this score could act as a triage metric of sorts. Upon arriving in an intensive care unit, clinicians could be directed to those with a higher score first, ensuring care is delivered more quickly to those who need it the most. However, data was sparse and difficult to store, and so a compromise between model complexity and efficacy had to be made. What resulted was a system that was not sensitive enough to be used on an individual patient.
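The mechanics of such a score can be illustrated with a toy sketch. The thresholds, weights, and variables below are invented for illustration and are not the actual 1981 Acute Physiology Score; the idea is simply that each physiologic measurement earns points the further it strays from a normal range, and the sum ranks patients for triage.

```python
# Illustrative only: a toy severity-of-illness score with made-up
# thresholds and weights -- NOT the real Acute Physiology Score.

def toy_severity_score(heart_rate, systolic_bp, resp_rate):
    """Return a crude severity score (higher = sicker)."""
    score = 0
    # Points for derangement from hypothetical normal ranges.
    if heart_rate < 50 or heart_rate > 120:
        score += 2
    elif heart_rate > 100:
        score += 1
    if systolic_bp < 90:
        score += 3
    elif systolic_bp < 100:
        score += 1
    if resp_rate < 8 or resp_rate > 30:
        score += 2
    return score

# Triage: direct clinicians to the highest-scoring patients first.
patients = {
    "A": toy_severity_score(heart_rate=85, systolic_bp=120, resp_rate=16),
    "B": toy_severity_score(heart_rate=130, systolic_bp=85, resp_rate=32),
    "C": toy_severity_score(heart_rate=105, systolic_bp=95, resp_rate=18),
}
order = sorted(patients, key=patients.get, reverse=True)
print(order)  # sickest first: ['B', 'C', 'A']
```

The weakness the article describes follows directly from this structure: with coarse thresholds fitted to a whole population, two quite different patients can land on the same score, which is why the original system was not sensitive enough for individual use.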
Since then, the idea that algorithms can never be used to assess the health of an individual patient has stuck. Technology has evolved, as have methods of assessing physiologic measurements from an individual patient, yet the methods used in practice have scarcely changed since 1981.
Modelling the severity of illness using algorithms can help doctors and families make decisions about end-of-life care. If a clinician is able to tell a patient’s family with confidence that, unfortunately, it is extremely unlikely that a patient will survive even with radical treatment, it eases the discussion about palliative care. This side of healthcare is an emotive subject and there is, understandably, a sense that human doctors are best placed to deal with distressed families. But doctors regularly base their assessments on past experiences and the best data available to them on a given condition. Providing them with detailed information based on algorithms therefore doesn’t have to replace the human touch, it can simply make for more informed decision making.
What really makes a good hospital?
This type of monitoring can also make assessing hospital performance much more realistic, which was historically the primary use of such scores. Intuitively, finding the hospital which provides the “best” care was thought to be as simple as finding the one with the lowest patient mortality. However, this neglects the demographics of the patients admitted to the hospital, often referred to as the case-mix.
While one hospital may cater mainly for the terminally ill, another could be a hotspot for trauma cases from a nearby highway. The mortality rates of these hospitals will differ wildly, but largely independently of the level of care each institution provides. Risk adjustment is the practice of estimating, given the patients admitted, how well a hospital should perform on average, with performance often measured as the mortality rate or length of stay of admitted patients. If a hospital performs better than expected, you can then look for beneficial policies at that hospital and disseminate them to other care providers.
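The comparison described above is often summarised as a standardised mortality ratio. The sketch below uses invented numbers: a severity model assigns each patient a predicted probability of death, the sum of those probabilities is the hospital's expected death count for its case-mix, and observed deaths divided by expected deaths gives the ratio (below 1.0 suggests better-than-expected performance).

```python
# Minimal risk-adjustment sketch with invented numbers.

def standardised_mortality_ratio(predicted_risks, died):
    """predicted_risks: per-patient death probabilities from a severity
    model; died: 0/1 outcomes. Returns observed/expected deaths."""
    expected = sum(predicted_risks)  # expected deaths given the case-mix
    observed = sum(died)
    return observed / expected

# Hospital near a highway: many high-risk trauma admissions.
risks = [0.6, 0.5, 0.4, 0.3, 0.2]   # expected deaths = 2.0
outcomes = [1, 0, 1, 0, 0]          # observed deaths = 2
print(standardised_mortality_ratio(risks, outcomes))  # 1.0: as expected
```

A raw mortality rate of 40% looks alarming, but against a case-mix predicting 40% mortality the hospital is performing exactly as expected, which is the article's point about naive league tables.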
Embracing the black box
For all these reasons, care providers need to think more about the opportunities presented by the automated monitoring of patient health. Though doctors have traditionally not been comfortable using “black box” methods in care, more and more studies are being undertaken to evaluate the improvement of care possible when leveraging the volumes of data recorded in the hospital. These studies range from early alerting of physicians to patients requiring rapid response therapy to prediction of mechanical ventilation weaning time. Ensuring that these studies transition from the exceptions that they currently are to common practice requires health services and their employees to embrace the fact that optimal clinical care for the complex patient requires complex algorithms. Only then will we realise the true potential of data-driven health care.
Alistair Johnson is a Doctoral Student at the University of Oxford. He acknowledges the support of the RCUK Digital Economy Programme grant number EP/G036861/1 (Oxford Centre for Doctoral Training in Healthcare Innovation).
This article was originally published at The Conversation. Read the original article.
Comments
One response to “How Big Data Could Improve Intensive Care”
Erm … quite a lot of this is already being done. Most setups as described above have storage capabilities. True, not all data is stored long term, but that’s mostly because it’s not useful later on except for research purposes, in which case flexing up isn’t that big a problem since HDD space is so cheap now.
Fully integrated electronic patient records and ordering systems are also nothing new. The VA network in the States implemented this years ago. It’s just struggling to get traction in government funded health systems due to the massive set-up costs and difficulties in integration with existing piecemeal electronic set-ups.
The NHS IT project fell on its face in the UK a few years ago, probably due to cost blow-outs, shoddy management and taking on too big a challenge at once. Keep an eye on South Australia over the next few years … much of the above article is already in the pipeline.
I get what you are saying about IEMR. I’m in a health service in QLD that’s developing this, and I work in IT provisioning for this health service. But the point this article is trying to make, the thing that isn’t being utilised, is pouring the raw biometric data that’s recorded in these monitors into a supercomputer farm and analysing tens of thousands of patients’ biometric data to recognise trends in heart rates in intubated patients, or various other scenarios, to give us ways to detect patient crashes before their condition deteriorates, rather than relying on the traditional blind studies used in medicine, which take decades to produce results. Big data will bring new discoveries that were never before possible.