
Understanding the Factors Influencing the Use of AI in Molecular Diagnostics in the EU

  • davidereesephd
  • Apr 13
  • 3 min read

We frequently get asked about regulations for diagnostics in the EU vs the US. There's a lot going on here: the EU situation is genuinely complex and still evolving. So here's a breakdown of the main concerns:


The dual-compliance problem

This is the central issue. Under the AI Act's current structure, AI used as part of medical devices and diagnostics is generally treated as high-risk by default, meaning many AI medical devices are expected to meet both MDR/IVDR requirements and the AI Act's high-risk AI requirements — an overlap that industry has increasingly criticized as duplicative regulatory burden.

The EU Commission has proposed a "Digital Omnibus" legislative package to streamline this, but legal observers anticipate the proposal could be finalized by summer 2026, or 2027 at the latest, and it remains uncertain how, when, or even whether all AI Act requirements will be incorporated, meaning a "disorderly transition" period is likely ahead.


Timeline pressure

The AI Act becomes fully applicable in August 2026, with full compliance obligations for high-risk systems, including medical devices and IVDs, taking effect by August 2027. That's a tight window for an industry still working out what dual compliance even means in practice.


The explainability and transparency gap

The EU now expects manufacturers to prove not only that their products are safe and effective under MDR/IVDR, but also that their AI/ML algorithms are fair, transparent, and explainable under the AI Act. For complex diagnostic models — particularly deep learning systems analyzing imaging — explainability remains genuinely hard technically, not just administratively.
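To make the explainability challenge concrete, the sketch below shows permutation importance, one widely used model-agnostic technique for probing what a model relies on. Everything here is a toy stand-in: the two-feature "classifier" and data are hypothetical, and real diagnostic models, especially deep imaging networks, are far harder to probe this way.

```python
# A minimal sketch of permutation importance (illustrative only; the model
# and data are toy stand-ins, not any regulator-mandated method).
import random

def model_predict(x):
    # Toy classifier: positive if the weighted sum of inputs crosses 0.
    # Feature 0 dominates; feature 1 is nearly irrelevant.
    return 1 if 2.0 * x[0] + 0.01 * x[1] > 0 else 0

def accuracy(X, y):
    return sum(model_predict(x) == yi for x, yi in zip(X, y)) / len(y)

def permutation_importance(X, y, feature, seed=0):
    """Accuracy drop when one feature's column is shuffled across samples."""
    rng = random.Random(seed)
    col = [x[feature] for x in X]
    rng.shuffle(col)
    X_perm = [list(x) for x in X]
    for row, v in zip(X_perm, col):
        row[feature] = v
    return accuracy(X, y) - accuracy(X_perm, y)

rng = random.Random(42)
X = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(200)]
y = [model_predict(x) for x in X]  # labels taken from the model itself

imp0 = permutation_importance(X, y, feature=0)  # dominant feature
imp1 = permutation_importance(X, y, feature=1)  # near-irrelevant feature
# Shuffling the dominant feature hurts accuracy far more than the weak one.
```

Techniques like this explain *which inputs matter*, not *why the model reasons as it does*, which is part of why the transparency gap for complex diagnostic models remains hard to close.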


Human oversight requirements

Under the AI Act, providers of high-risk AI must ensure appropriate user information and promote AI literacy, helping clinicians understand system limitations, confidence levels, and appropriate oversight. The concern is that if these obligations are weakened or removed via the Digital Omnibus simplification, companies that have invested in robust governance and explainability could end up competing against products optimized for minimal compliance.


GPAI models in clinical settings

GPAI models like LLMs used for clinical reasoning and decision-making must still fulfill MDR/IVDR and high-risk requirements — but they are regulated at the EU level by the AI Office, not by national authorities, creating a complicated jurisdictional split. This is an emerging gap as LLM-powered clinical tools become more common.


Algorithmic bias and data equity

This sits underneath all of the above. The concern is that AI diagnostic tools trained predominantly on data from certain populations will underperform on others. The AI Act demands data governance and bias documentation, but enforcement mechanisms and harmonized standards for measuring bias in medical AI are still being developed.
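To make the measurement problem concrete, here is a minimal sketch of one common approach: stratifying a diagnostic model's validation results by population subgroup and comparing sensitivity and specificity. The subgroup names and counts below are entirely hypothetical, and this is not an official AI Act metric, just an illustration of the kind of documentation the data-governance requirements point toward.

```python
# Illustrative subgroup performance audit for a binary diagnostic classifier.
# All subgroups and counts below are hypothetical.
from collections import defaultdict

def subgroup_metrics(records):
    """records: iterable of (subgroup, y_true, y_pred), 1 = disease present."""
    counts = defaultdict(lambda: {"tp": 0, "fn": 0, "tn": 0, "fp": 0})
    for group, y_true, y_pred in records:
        c = counts[group]
        if y_true == 1:
            c["tp" if y_pred == 1 else "fn"] += 1
        else:
            c["tn" if y_pred == 0 else "fp"] += 1
    report = {}
    for group, c in counts.items():
        sens = c["tp"] / (c["tp"] + c["fn"]) if c["tp"] + c["fn"] else None
        spec = c["tn"] / (c["tn"] + c["fp"]) if c["tn"] + c["fp"] else None
        report[group] = {"sensitivity": sens, "specificity": spec}
    return report

# Hypothetical validation results: the model misses far more true cases
# in group_b than in group_a — exactly the underperformance at issue.
records = (
    [("group_a", 1, 1)] * 90 + [("group_a", 1, 0)] * 10 +
    [("group_a", 0, 0)] * 95 + [("group_a", 0, 1)] * 5 +
    [("group_b", 1, 1)] * 70 + [("group_b", 1, 0)] * 30 +
    [("group_b", 0, 0)] * 92 + [("group_b", 0, 1)] * 8
)
report = subgroup_metrics(records)
```

The open question is not computing numbers like these, but which subgroups, thresholds, and disparity measures a harmonized standard will eventually require.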


The competitive disadvantage risk

A key 2026 focus in the EU is compliance with the AI Act, which prioritizes ethical principles, risk assessment, and transparency, and this may stretch deployment timelines relative to the US, where the regulatory environment is more permissive. The worry among EU innovators is that a heavier compliance burden slows clinical adoption and pushes development activity elsewhere, while US and Chinese competitors move faster.


The bottom line is that the regulatory intent is sound — transparency, human oversight, and bias mitigation are legitimate requirements for diagnostic AI — but the execution is creating significant uncertainty for companies trying to bring products to market in 2026 and 2027.


For reference, you might want to look at a recent report by DLA Piper for additional depth, or ask the Alexea Group. Good luck!

 
 
 


© 2025 by Alexea Group
