Patients sending their physician a photo of a mole or other physical presentation of disease are often disappointed to learn they still need to be seen in person for a final diagnosis and treatment plan. While phone cameras have improved greatly in the decade since smartphones were introduced, for many diagnoses there's no substitute for an in-person exam. A face-to-face visit allows physicians not only to visually inspect the patient in different lighting and from different angles, but also to palpate, listen to, and talk with them in real time about other potentially related symptoms or concerns.

The same is true of data. To continue our analogy, to really build data trust — and ensure the right data is being processed correctly — we need to be sure the client is seeing the data “live in person” and not just photographically. When we first light up one of Hospital IQ’s analytics dashboards, clients almost universally declare the data is correct and immediately want to move on to results. Client champions are eager to demonstrate value from their investment in our software products, so the sooner they can get to results, the better.

While this enthusiasm is great, we often need to pump the brakes and create an environment where the client can examine the data from different angles and across different time frames. This step is often skipped simply because it typically hasn't been possible: most analytics packages spit out the final number or trend without the ability to validate the result by drilling down.

With the right tools, this in-depth data validation actually saves tremendous time on the back-end by building trust and anticipating objections that could otherwise erode confidence and derail a project.

What does it mean to have an interactive “in-person” visit with data?

    1. Local curiosity: We need someone from the hospital who is curious about the data. They don't need to be deeply engaged with our product; they just need to have some hunches about hospital operations we can use as a vector to dig deeper.

      For example, we can get started by asking very simple questions like "Who do you think is the most productive surgeon?", "Which inpatient units have the worst inbound or outbound boarding?", or "When is the ED most backed up?" These questions give us leverage into the data, either confirming or challenging those notions. When a hunch aligns with the data, we get confirmation the data pipeline is correct. When they disagree, it's an opportunity to drill deeper and discover why. In many ways, we learn much more from the latter scenario than the former. If we're mishandling the data, it's an opportunity to show how our transparency works and how these issues get resolved. That, too, builds trust.

    2. Real-time tools: The data analysis tool must be able to answer data questions in real-time, not as a homework project for the data analyst. Projects lose tremendous momentum when clinical curiosity needs a week of research, followed by a new meeting to discuss the results. The overhead of meetings is demoralizing, and the first half is usually spent coming back up to speed, especially if the attendees have changed. In this environment, clinicians are reluctant to ask new or follow-up questions for fear of more meetings and slowdowns. This additional friction effectively kills their curiosity and blocks change.

    3. Corrective actions: Errors in data handling or misunderstandings in policy can be fixed expeditiously. It's great to be able to quickly identify issues, but it's not actually helpful if they can't be rectified. Clients don't want to hear "that's a known limitation". They want issues fixed.
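The hunch-vs-data check in step 1 can be sketched in a few lines. Everything below is a toy illustration: the case log, the surgeon names, and the two productivity measures are hypothetical, not Hospital IQ's actual schema or metrics.

```python
# Toy sketch of validating a client's hunch against the data.
# All names and numbers are invented for illustration.
from collections import Counter

# Hypothetical surgical-case log: (surgeon, case duration in minutes)
cases = [
    ("Dr. Adams", 90), ("Dr. Adams", 120), ("Dr. Adams", 60),
    ("Dr. Baker", 240), ("Dr. Baker", 200),
    ("Dr. Chen", 45), ("Dr. Chen", 50), ("Dr. Chen", 55), ("Dr. Chen", 40),
]

def most_productive(cases, by="volume"):
    """Rank surgeons by case volume or by total operating minutes."""
    if by == "volume":
        counts = Counter(surgeon for surgeon, _ in cases)
    else:  # rank by total minutes in the OR
        counts = Counter()
        for surgeon, minutes in cases:
            counts[surgeon] += minutes
    return counts.most_common()

hunch = "Dr. Baker"  # the client's stated hunch

top_by_volume, _ = most_productive(cases, by="volume")[0]
top_by_minutes, _ = most_productive(cases, by="minutes")[0]

if hunch in (top_by_volume, top_by_minutes):
    print(f"Hunch confirmed: {hunch} leads by at least one measure.")
else:
    print(f"Hunch and data disagree: {hunch} vs "
          f"{top_by_volume} (volume) / {top_by_minutes} (minutes). Drill down.")
```

Even this toy example shows why the drill-down matters: "most productive" flips depending on whether you count cases or operating minutes, which is exactly the kind of disagreement worth exploring with the client in real time.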

It’s our job as the analytics provider to curate a journey through data that is relevant to the client’s best practices and business needs. We don’t just send a URL and login credentials with the vague question “does the data look correct?” Part of our onboarding protocol is to walk the client through the data by engaging their local knowledge of hospital operations. Our interactive online tools allow this to happen quickly.

Data validation is often an exercise in CYA (Cover Your A**) so the vendor can circle back months later and insist they acted in good faith. We couldn’t disagree more. Good faith data validation requires pulling the client into the boat and leveraging their knowledge and instincts to create a data warehouse that produces actionable insights to improve hospital efficiency. A competent physician won’t make a final diagnosis of skin cancer without seeing a patient in person. Similarly, a competent data analytics company can’t declare the data is accurate and complete without taking the client on a deep dive into their data.