How AI Can Interpret Your Blood Test


Pricey AI services claim they can interpret your blood test results, but the truth is more complicated.

The wait between having your blood drawn and getting the results can be stressful. Sometimes, the technical report arrives before a doctor can explain it.

AI-powered chatbots, like ChatGPT, Claude, and Gemini, have become a tool for patients seeking to understand lab results. Companies like Whoop and Levels see an opportunity here, offering AI-powered insights into lab work, usually requiring a subscription costing several hundred dollars annually.

These services provide detailed reports in simple language, offering personalized tips like dietary changes. However, Dr. John Whyte warns there’s no solid evidence that AI can reliably interpret blood results.

“Physicians aren’t always the best communicators,” Whyte says, which helps explain why patients turn to AI for answers. Still, he is skeptical of companies selling AI interpretations, noting the lack of rigorous research showing they outperform free chatbots, let alone doctors.

AI models such as Gemini and ChatGPT aren’t specifically validated for interpreting blood tests. Google and Quest Diagnostics’ AI tool helps patients navigate their data but doesn’t offer medical advice.

BloodGPT’s Jonathan Kron acknowledges there is no accepted standard for comprehensive blood test interpretation, and errors are common when general-purpose chatbots attempt the task. BloodGPT has built its own accuracy checks, consulting specialists on certain hormonal biomarkers. Its subscription runs from $9.99 to $17.99 per month, but the service has no peer-reviewed research backing it.

Mount Sinai’s Dr. Girish Nadkarni argues that companies need to demonstrate success through retrospective data comparisons and prospective trials measured against expert clinicians. He stresses model accuracy: an AI product may perform acceptably for typical results yet fail at extreme values, which is precisely where interpretation matters most.

Levels markets subscription plans that bundle lab work, clinician-reviewed reports, and AI insights. Levels executive Josh Clemente recommends frequent blood testing but warns against relying entirely on AI, favoring human oversight. He notes that AI chatbots can inadvertently create an approval bias, telling users what they want to hear.

Whoop’s Alexi Coffey acknowledges AI’s potential in integrating physiological data but stresses cautious interpretation.

Whyte advises skepticism about AI’s personalization claims, recommending that patients use AI to clarify terminology in their results rather than to generate personalized health insights.

Blood tests are vital yet easy to misread. Until research proves AI can interpret them accurately, it’s best used for straightforward explanations, not medical guidance.

Note: The content above is for educational purposes only and not intended as medical advice. Always consult a qualified health provider. Ziff Davis, Mashable’s parent company, filed a lawsuit against OpenAI in April 2025.
