How To Deliver Healthcare Insight Faster

When it comes to healthcare, data analytics isn't just a "nice to have": it can dramatically improve patient outcomes. Back in 2014, a data infrastructure solution called a health information exchange (HIE) was accessed only 2.4% of the time, but patients whose providers examined their previous health data were 30% less likely to end up in the hospital. What's more, the future value of the savings realized by that healthcare network from 2014 to 2020 is a remarkable $2.8 million.

But even with that financial ROI, fast-forward six years to February 2020: it still took the Covid-19 pandemic to triple the use of HIE services, as providers finally understood the importance of sharing patient data.


So sharing data has health and financial benefits. You are committed to data sharing, analytics and deriving benefits from your health data gold mine. But why has it been so difficult to do? What challenges arise, and how do you avoid them?

Many large businesses run into big data headaches that prevent them from achieving their vision. One client's vision is to reimagine health insurance as a digital business. Other healthcare companies likely recognize one or more of the following headaches in reaching their goals:


• Fail to Bootstrap: Failed attempts at building a data platform. This can look like a number of things, including a design that centralizes ingesting data, processing it and then serving it through a monolithic solution like a data lake.

• Fail to Scale Sources and Consumers: Sitting on old data. Our client had a decades-old data warehouse they initially just wanted to put on the cloud.

• Fail to Materialize Value: This often happens with a "boil the ocean" expectation of a solution. Our customer was exhausted by their sweeping goal of promoting the health of every healthcare customer.

• Fail to Materialize Value: Another way this manifests is getting stuck on particular details — such as the how of architecture rather than what the architecture should be, why it should be and who it should be for. Our client's IT analytics unit, like many others pursuing big data projects, spent an enormous amount of time on the how of their architecture.

We have seen how typical data solutions have failed publicly in the press and silently ended inside our own companies.

So is it possible to connect two groups — users and IT — and build an appropriate system in a timely manner? Is it possible to build a system serving multiple, parallel uses and users without "cut and paste" reuse? I believe the answer is yes and the Data Mesh provides the platform for success.

A Mesh Solution

To understand a Data Mesh, change your current mental model from a single, centralized data lake to an ecosystem of data products that play nicely together. Instead of handcrafting a data lake for everyone, you gather domain-specific users together and ascertain their smaller group needs. Instead of handcrafting bulk data loads, you create an automation pipeline that reads from related data sources to produce a data product for that group. Instead of handcrafting data export code, you use a language to autogenerate the data pipeline and produce the data product as quickly as possible. These three elements — a domain-specific user group, an autogenerated pipeline and a data product for that user group — create a node on your Data Mesh.

Data Mesh-oriented cloud development uses well-known data infrastructure tooling (e.g. Kubernetes and Terraform) and is much faster than data lake and warehouse solution development. In my experience, it's possible to reduce the time it takes to get valuable insights from quarters and months to weeks.
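To make the node concept above concrete, here is a minimal sketch of what "autogenerating the pipeline" from a small declarative spec can look like. All names and fields (the spec class, the domain, the source names) are illustrative assumptions, not a specific product's API:

```python
from dataclasses import dataclass, field

# Hypothetical spec for one Data Mesh node: a domain-specific user group,
# the sources its pipeline reads and the data product it serves.
@dataclass
class DataProductSpec:
    domain: str                                    # e.g., "claims-analytics"
    sources: list                                  # upstream systems to ingest from
    product_name: str                              # the data product being served
    consumers: list = field(default_factory=list)  # e.g., BI or ML applications

def generate_pipeline(spec: DataProductSpec) -> list:
    """Expand a spec into ordered pipeline steps: ingest -> transform -> serve."""
    steps = [f"ingest:{src}" for src in spec.sources]
    steps.append(f"transform:{spec.domain}")
    steps.append(f"serve:{spec.product_name}")
    return steps

spec = DataProductSpec(
    domain="claims-analytics",
    sources=["claims_db", "provider_registry"],
    product_name="readmission_risk",
    consumers=["Power BI"],
)
print(generate_pipeline(spec))
# ['ingest:claims_db', 'ingest:provider_registry',
#  'transform:claims-analytics', 'serve:readmission_risk']
```

In a real deployment, each generated step would map onto the infrastructure tooling mentioned above (e.g., a Kubernetes job or a Terraform-managed resource) rather than a string.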

For those in the healthcare industry, this type of approach can support results like process and operational improvements that can manifest in better prediction of patient outcomes.

The ROI Of Continuous Data Analytics And Its Benefits

A vital output of a Data Mesh approach is called a "data product." It serves a business community for one or more business use cases, and it is built very quickly, in just days to weeks. The data product is then typically accessed by business intelligence and machine learning applications, and those connections can be automatically generated by the Data Mesh tooling. An important metric for Data Mesh time-to-market improvement is:

The number of data products that produce analytical ROI or business impact, deployed live to customer group(s) per time period; in other words, the time to market for creating a business-relevant Data Mesh data set.

One of our customer's data product teams built 50 data products that went live within two months. These data products were accessed by 4,800 business users running Microsoft Power BI on top of them to improve healthcare.
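The metric itself is simple arithmetic. Using the figures from the customer example above (50 live data products in two months), a minimal sketch:

```python
# Data Mesh time-to-market metric: live, business-impacting
# data products deployed per time period.
def products_per_month(products_live: int, months: float) -> float:
    return products_live / months

# Figures from the customer example above: 50 data products in two months.
rate = products_per_month(50, 2)
print(rate)  # 25.0 data products per month
```

Tracking this rate per quarter makes it easy to see whether the mesh is actually accelerating delivery or stalling.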

Data Assets And Not Lakes Of Data

Escaping from years of legacy data warehousing architecture is not easy. After you have "changed your current mental model from a single, centralized data lake," how do you proceed? A few Data Mesh best practices can help guide your journey to a better data solution. They are made up of three phases:

1. Build a platform strategy: Primarily to identify end-to-end use cases.

2. Perform a data discovery: Primarily to determine and validate the data sources to be read.

3. Iteratively construct the pipeline, operate it and share data products for your users to consume.

While iteratively constructing and delivering, keep these principles of Data Mesh architecture in mind to help ensure your data products are complete:

1. Distributed Domain Driven Architecture: Ensure the data products are bounded by a domain.

2. Self-Serve Platform Design: The software to create a data product is public — within your security perimeter. Ensure that others can "run and extend" the pipeline to create new data products.

3. Product Thinking With Data: Establish ownership for each data product.
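The three principles above can double as a completeness checklist before a data product goes live. The sketch below is a hypothetical illustration; the field names are not a standard schema:

```python
# Checklist for the three Data Mesh principles above.
def is_complete(product: dict) -> bool:
    checks = [
        bool(product.get("domain")),         # 1. bounded by a domain
        bool(product.get("pipeline_repo")),  # 2. self-serve: others can run/extend it
        bool(product.get("owner")),          # 3. ownership established
    ]
    return all(checks)

# Hypothetical data product record (all values illustrative).
product = {
    "name": "readmission_risk",
    "domain": "claims-analytics",
    "pipeline_repo": "git.internal/mesh/readmission-risk",  # inside the security perimeter
    "owner": "claims-analytics-team",
}
print(is_complete(product))  # True
```

A product missing any of the three fields fails the check, which is a useful gate to automate in the deployment pipeline.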

The prevailing wisdom that more data and more centralization are better, a.k.a. "a single version of the truth," will continue to be a challenge to building scalable and relevant analytical systems. But there is folly in this approach. Healthcare data can lead to high ROI and critical care analytics, but the assets should be relevant to current use cases and the data customers and, importantly, timely enough to capture market opportunities.
