6 ways big data can lower costs
July 10, 2014 in Medical Technology
Brigham and Women’s Hospital has put forth a new report showcasing a half-dozen ways to lower healthcare costs through the use of big data.
[See also: Big data doesn't have to be 'Star Wars']
With electronic health records now common across the U.S., the amount of clinical data ripe for research and analytics is on the rise. This is opening big opportunities to arrive at insights that could improve the value of patient care, say BW officials.
A new study published in the July issue of Health Affairs shows how big data analytics is helping pave the way toward reduced costs.
[See also: Geisinger CEO gives tips for smarter BI]
“The examples we present in this study provide key insights to the ‘low hanging fruit’ in healthcare big data and have implications for regulatory oversight, offer suggestions for addressing privacy concerns and underscore the need for support of research on analytics,” said David Bates, MD, chief quality officer at Brigham and Women’s Hospital and lead author on the study, in a press statement.
In the study, supported in part by a grant from the Gordon and Betty Moore Foundation, BW researchers are applying algorithms toward cost reduction in six categories:
Given that just 5 percent of patients account for about half of all U.S. healthcare spending, Bates and his team give special attention to these high-cost patients. Their strategy includes formalizing an approach to predict which patients are likely to be high cost, identifying measurements that can improve this prediction, particularly those focused on mental health, and making these predictions actionable. But making the most of this data depends on delivering predictions to clinicians in a manner that doesn't disrupt current workflow, according to the BW researchers, who suggest that building predictive models requires analytic systems that draw on data from high-risk patient groups.
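In spirit, the prediction step described above might look like the following sketch. The features, weights and threshold here are entirely hypothetical, chosen only to illustrate flagging likely high-cost patients from prior spending, chronic conditions and a mental-health indicator; they are not drawn from the study.

```python
import math

# Hypothetical weights for a logistic-style risk model; illustrative only,
# not coefficients from the Health Affairs study.
WEIGHTS = {
    "prior_year_cost_10k": 0.8,   # prior-year spending, in $10k units
    "chronic_conditions": 0.5,    # count of chronic conditions
    "mental_health_dx": 0.9,      # 1 if a mental-health diagnosis is present
}
BIAS = -3.0

def high_cost_risk(patient):
    """Return a predicted probability that a patient will be high cost."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def flag_for_outreach(patients, threshold=0.5):
    """Surface high-risk patients so clinicians see them in their workflow."""
    return [p["id"] for p in patients if high_cost_risk(p) >= threshold]

patients = [
    {"id": "A", "prior_year_cost_10k": 6, "chronic_conditions": 3, "mental_health_dx": 1},
    {"id": "B", "prior_year_cost_10k": 0.2, "chronic_conditions": 0, "mental_health_dx": 0},
]
print(flag_for_outreach(patients))  # → ['A']
```

A real system would learn such weights from claims and EHR data rather than hand-coding them, and would present the flag inside the clinician's existing tools, per the workflow point above.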
As many as one-third of readmissions may be preventable, offering big opportunities for improvement in care and reduction in cost, according to the report. Bates and his coauthors suggest that all healthcare organizations use algorithms to predict who is likely to be readmitted, but highlight the challenges of implementing them: tailoring the intervention to the individual patient, ensuring that patients receive the interventions intended for them, monitoring specific patients after discharge so that emerging issues do not cause their condition to deteriorate, and keeping the false-positive rate low, that is, keeping the ratio of patients flagged for an intervention to patients who actually experience a readmission from growing too large.
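That last trade-off can be made concrete with a small evaluation helper. The cohort data below is invented for illustration; the two metrics simply quantify how many flagged patients correspond to actual readmissions.

```python
def flag_to_readmit_ratio(flags, readmitted):
    """Ratio of patients flagged for intervention to patients actually
    readmitted; values far above 1.0 mean many false positives."""
    n_readmitted = sum(readmitted)
    if n_readmitted == 0:
        return float("inf")
    return sum(flags) / n_readmitted

def precision(flags, readmitted):
    """Share of flagged patients who were actually readmitted."""
    n_flagged = sum(flags)
    if n_flagged == 0:
        return 0.0
    hits = sum(1 for f, r in zip(flags, readmitted) if f and r)
    return hits / n_flagged

# Hypothetical cohort: 1 = flagged / readmitted, 0 = not.
flags      = [1, 1, 1, 0, 1, 0, 0, 1]
readmitted = [1, 0, 1, 0, 1, 1, 0, 0]
print(flag_to_readmit_ratio(flags, readmitted))   # → 1.25 (5 flags, 4 readmissions)
print(round(precision(flags, readmitted), 2))     # → 0.6
```

Raising the model's flagging threshold lowers the ratio and raises precision, at the cost of missing some patients who will be readmitted; where to set it is exactly the implementation challenge the authors describe.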
Effective triage is key to foreseeing complications when a patient first receives care in the hospital setting, the BW study argues. It is important for managing staff and bed resources, ensuring the patient is sent to the correct unit for care, and informing the overall management of the patient's care. Researchers suggest integrating a triage algorithm into clinical workflow, and underscore the importance of a detailed guideline clarifying exactly how the algorithm will inform care. They examine two pilot studies that offer lessons learned in establishing effective triage algorithms.
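As a toy illustration of wiring such an algorithm into workflow with an explicit guideline, the sketch below maps a computed acuity score to a destination unit while leaving room for clinician override. The cutoffs and unit names are invented, not taken from the pilot studies.

```python
def triage_unit(acuity_score):
    """Map a computed acuity score to a destination unit.
    Cutoffs are hypothetical; in practice they come from a clinical guideline."""
    if acuity_score >= 8:
        return "ICU"
    if acuity_score >= 4:
        return "step-down"
    return "general ward"

def route_patient(acuity_score, clinician_override=None):
    """The guideline layer: the algorithm suggests, the clinician can override."""
    return clinician_override or triage_unit(acuity_score)

print(route_patient(9))                              # → ICU
print(route_patient(2))                              # → general ward
print(route_patient(5, clinician_override="ICU"))    # → ICU
```

Keeping the override path explicit is one way to integrate an algorithm without disrupting clinical judgment, in line with the guideline point above.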
As patients’ conditions worsen, certain organs may fail to adequately compensate for the systemic stress of a disease. But there is often a window in which physiological data can be used to determine whether the patient is at risk of decompensating. The study notes that the initial rationale for intensive care units was to allow critically ill patients to be closely monitored for just this purpose. Researchers emphasize that such systems can now be used throughout the hospital, and that effective analytic systems in this area must combine multiple data streams to detect decompensation, as many new technologies are becoming available that can be used to better monitor patients.
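Combining multiple physiological streams into one alert is the idea behind early-warning scores such as MEWS. The simplified scoring bands below are illustrative only and not clinically validated; they show the shape of the approach, not the study's method.

```python
def vital_points(value, bands):
    """Score one vital sign. `bands` is a list of (lower_bound, points),
    sorted ascending; the last band whose bound the value meets wins."""
    pts = 0
    for bound, p in bands:
        if value >= bound:
            pts = p
    return pts

# Hypothetical scoring bands (not a validated clinical tool).
HR_BANDS  = [(0, 2), (50, 0), (100, 1), (130, 2)]   # heart rate, beats/min
RR_BANDS  = [(0, 2), (10, 0), (20, 1), (30, 2)]     # respiratory rate, breaths/min
SBP_BANDS = [(0, 2), (90, 1), (100, 0)]             # systolic BP, mmHg (low is bad)

def warning_score(heart_rate, resp_rate, systolic_bp):
    """Combine several physiological streams into one early-warning score."""
    return (vital_points(heart_rate, HR_BANDS)
            + vital_points(resp_rate, RR_BANDS)
            + vital_points(systolic_bp, SBP_BANDS))

def at_risk(heart_rate, resp_rate, systolic_bp, threshold=3):
    """Flag a patient whose combined score crosses the alert threshold."""
    return warning_score(heart_rate, resp_rate, systolic_bp) >= threshold

print(at_risk(135, 28, 85))   # → True  (tachycardic, tachypneic, hypotensive)
print(at_risk(72, 14, 120))   # → False (normal vitals)
```

The point of combining streams is that no single abnormal vital need cross a dangerous level for the aggregate score to flag a deteriorating patient.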
Adverse events are expensive and can result in high rates of morbidity and mortality. But a large share of them are preventable, according to BW researchers, who spotlight three areas – renal failure, infection and adverse drug events – as specific opportunities where analytics can realize cost savings.
When it comes to chronic diseases affecting multiple organ systems, correctly managing these systemic problems is essential to keeping costs down. For instance, the study shows how care for autoimmune disorders such as rheumatoid arthritis and lupus could benefit from big data, enabling caregivers to deliver expensive therapies in a more targeted way – helping predict the trajectory of a patient’s disease and tailoring treatments and therapies along the way. Lack of access to health records with pertinent data has traditionally been a big limiter in using big data to treat chronic diseases. But with EHRs now widespread, this is one area ripe for improving patient care and lowering costs, according to the BW study.
“Support for research that evaluates the use of analytics and big data to address these six use cases, as well as thoughtful consideration of regulation and payment is warranted,” said Bates.
He did strike one note of caution, however, noting that, “as multiple streams of data become available for analytic purposes, consideration of patients’ privacy and their desire to link disparate sources of data will be of the utmost importance.”