If you work in healthcare, invest in it, or are a patient: 1,250+ AI-enabled medical devices are now FDA-authorised in the US. AI already passes the bar exam, medical licensing exam, and CPA exam. The liability question — when AI misdiagnoses, who is responsible? — remains legally unresolved.
01 — Diagnostics: AI is reading scans better than doctors
94%
AI accuracy detecting lung nodules — vs. 65% for human radiologists in the same task (MIT/MGH)
SCI
54%
of US hospitals with 100+ beds now use AI in radiology — primarily for image interpretation and worklist prioritisation
RAM
53%
reduction in radiologist workload possible with AI triage — without compromising diagnostic oversight
RAM
$255B
AI-enabled medical device market projected by 2033, up from $13.7B in 2024 — 18× growth
IL
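The projection's "18×" multiple checks out; a few lines of Python make the implied annual growth explicit. The figures are taken from the market projection above, and the 9-year horizon assumes 2024→2033:

```python
# Sanity-check the market projection: $13.7B (2024) -> $255B (2033).
# Computes the growth multiple and the implied compound annual growth rate (CAGR).
start, end, years = 13.7, 255.0, 2033 - 2024  # figures from the IL source

multiple = end / start                   # ~18.6x, matching the "18x" claim
cagr = (end / start) ** (1 / years) - 1  # implied compound annual growth rate

print(f"{multiple:.1f}x growth, {cagr:.0%} CAGR over {years} years")
```

An 18× multiple over nine years works out to roughly 38% compound annual growth — aggressive, but in line with other AI-market forecasts cited here.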
AI vs. human accuracy by condition
[Chart omitted — condition-by-condition comparison; categories include early breast cancer in dense tissue]
SCI OD
The nuanced read
"AI models perform comparably to non-expert physicians overall — but still perform significantly worse than expert physicians."
— npj Digital Medicine meta-analysis, 83 studies, 2025 NPJ
The consensus: AI dramatically outperforms a solo radiologist at 2am or an overburdened clinician on a 12-hour shift. AI + experienced physician = significantly better than either alone.
The real gains come from combination, not replacement. AI flagged 49.8% of interval cancers that human readers missed outright.
RAM
So what does this mean?
If you or someone you care about is due for a cancer screening, a hospital using AI-assisted radiology is likelier to catch problems earlier. That's not theoretical — it's a 29-percentage-point accuracy gap for lung nodules, right now.
AI isn't replacing your doctor. It's catching the things your doctor misses at 2am after a 12-hour shift. The combination of AI + human is better than either alone. The question you should be asking your hospital: are you using these tools yet?
02 — Drug discovery: from a decade to 18 months
Drug development timeline: traditional vs. AI-accelerated
4–6yrs
Traditional: target identification to preclinical candidate
Typical pharma pipeline
→
18mos
AI-accelerated: same pipeline stage
Insilico Medicine, 2021 — verified PMC
Isomorphic Labs — Google DeepMind's drug-discovery spinout
$600M
Raised April 2025. Preparing first human trials for AI-designed oncology drugs. Nobel Prize 2024 for AlphaFold protein-folding breakthrough. Eli Lilly deal: $45M upfront, up to $1.7B on milestones.
ISM001-055 — pulmonary fibrosis
18 mos
Target to preclinical candidate in 18 months at a cost of $150,000 — vs. 4–6 years and tens of millions via traditional methods. Phase IIa results positive — the first AI-discovered drug to reach that stage.
Exscientia + Sumitomo
PMC
DSP-1181 — OCD treatment
<12 mos
First AI-designed molecule to enter human clinical trials. Developed in under 12 months. The traditional equivalent takes 4+ years. Validated AI's ability to compress the discovery loop end-to-end.
Demis Hassabis at Davos 2025:
"One day we hope to be able to say — here's a disease, and then click a button and out pops the design for a drug to address that disease."
According to GlobalData, more than 3,000 drugs have been developed or repurposed using AI, most still in early-stage development. The sector drew $3.3B in venture funding in 2024 alone.
DDN
So what does this mean?
Drug discovery stages that used to take years and cost tens of millions are being compressed to months and hundreds of thousands of dollars. That means treatments for diseases that were previously "uneconomic to research" — rare conditions, neglected diseases — are suddenly viable.
If you're a patient waiting for a treatment that doesn't exist yet, the timeline just shortened dramatically. If you're an investor, AI drug discovery is the most validated commercial use case in healthcare AI, with $3.3B in VC funding in 2024 alone. The Nobel Prize already happened — this isn't speculative.
03 — Regulation & liability: the law hasn't caught up
1,250+
AI-enabled medical devices authorised by the FDA as of July 2025 — up from 950 in Aug 2024. That's roughly 300 new authorisations in under a year, and the pace is accelerating.
BPC
43%
of FDA-approved AI medical devices lack clinical validation data. Only 28% have undergone prospective testing.
FRONT
250+
AI healthcare bills introduced across 34 US states by mid-2025. A patchwork of obligations, not a single rulebook.
BB
The liability black hole: when AI misdiagnoses, who pays?
No one fully knows yet. That's not a metaphor — it's the current legal reality.
🏭
AI Developer
Currently shielded by product liability regimes — but not in every scenario
🏥
Hospital
Most likely locus — if it deployed the tool and failed to monitor
👨‍⚕️
Physician
Retains final responsibility for patient care — but can't explain the black box
⚖️
Courts
See "a lot of unknowns" — the closest precedent is self-driving-car liability cases
ME
Also notable: the FDA itself now uses AI — "Elsa," powered by Anthropic's Claude — to help staff read and summarise regulatory documents. The regulator of AI is using AI to regulate AI. BPC
So what does this mean?
Over 1,250 AI medical devices are FDA-authorised, but 43% lack proper clinical validation. The technology is outpacing the regulation — and when something goes wrong, the legal system doesn't yet know who to hold accountable.
If you're a patient, this means asking questions about how AI tools in your hospital were validated. If you're a healthcare professional, this means understanding that you likely still carry the legal liability even when AI assists your decision. The regulatory framework is a patchwork — not a safety net.
04 — The equity gap: great equaliser or greatest divider?
The two-tier healthcare system forming right now
✓ AI-augmented care
🔬AI detects cancers 6–12 months earlier — often before symptoms appear
⚡Viz.ai cuts stroke triage time — standard now in 900+ hospitals globally
🧬AI genomic analysis personalises cancer treatment to molecular subtype
🌍AI triage makes specialist-level care available in remote or understaffed areas
💊Drug repurposing finds treatments for rare diseases that were previously uneconomic to research
✗ Without access
⚠️Fewer than 30% of AI devices disclose demographic diversity in training data — bias risk is real
FRONT
💰Best AI diagnostic tools concentrated in well-resourced academic medical centres, not community hospitals
🌐AI accuracy drops for underrepresented ethnic groups when models are trained predominantly on majority-population data
🚧Global South largely excluded — the same access divide in education applies to health
📋If AI standard of care becomes legal norm, small practices unable to afford tools face disproportionate liability
ME
From "The Illusion of Safety", a 2025 report to the FDA on AI in healthcare:
"AI has the potential to expand access to care and improve quality — but only if bias mitigation, representative training data, and equitable deployment become non-negotiable requirements, not afterthoughts." PMC2
So what does this mean?
AI could detect your cancer 6–12 months earlier — but only if your hospital has the tools. The gap between AI-augmented and non-augmented care is widening, and it maps directly onto existing inequalities: wealth, geography, and ethnicity.
If you have the ability to choose where you receive care, this is now a factor worth considering. And if you work in healthcare, the push for representative training data and equitable deployment isn't just ethical — it's the difference between AI helping everyone and AI helping only those who already have the most.
94%
AI accuracy for lung nodule detection — vs. 65% for human radiologists in the same study
18 mos
to develop a drug candidate with AI — down from 4–6 years via traditional pipelines
1,250+
AI-enabled medical devices now FDA-authorised — 43% lacking clinical validation data
What does all of this mean for you?
1.Subscribe to Veltrix Collective. We track AI developments across healthcare, work, and daily life — and break down what actually matters for you. One briefing a week, no jargon. Join here →
2.Ask your hospital about AI screening. Next time you're due for a mammogram, lung scan, or cancer screening, ask directly: "Do you use AI-assisted imaging?" Hospitals using tools like Aidoc or Viz.ai are catching cancers months earlier. You have the right to know.
3.Use Claude or ChatGPT to understand your health data. Upload a blood test result, scan report, or prescription list. Ask it to explain what each value means, what's out of range, and what questions to ask your doctor. This isn't a replacement for medical advice — it's preparation for a better conversation.
4.Research clinical trials with Gemini or Claude. If you or a family member has a condition, ask: "What AI-assisted clinical trials are currently recruiting for [condition]?" AI can search ClinicalTrials.gov faster than you can navigate it. Many AI-designed drugs are entering Phase II trials right now.
5.Check your bias exposure. Ask ChatGPT or Claude: "What are the known demographic biases in AI diagnostic tools for [your condition]?" Fewer than 30% of AI devices disclose training data diversity. Understanding the limitations is as important as understanding the benefits.
6.Follow the regulation. Use Gemini's search grounding to track AI healthcare policy developments in your country. The rules are being written right now — 250+ state-level bills in the US alone. What gets decided in 2026 will shape your care for the next decade.
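For the clinical-trials step above, you can also go straight to the source: ClinicalTrials.gov exposes a public JSON API. Here is a minimal sketch, assuming the v2 endpoint and its documented field names; the `build_url` and `fetch_recruiting_trials` helpers are illustrative, not an official client, so verify against the current API reference before relying on them.

```python
# Sketch: query ClinicalTrials.gov's public v2 API for actively recruiting
# trials matching a condition. Endpoint and JSON field names are assumptions
# based on the documented v2 API and may change.
import json
import urllib.parse
import urllib.request

API = "https://clinicaltrials.gov/api/v2/studies"

def build_url(condition: str, limit: int = 5) -> str:
    """Build a query URL for recruiting trials matching a condition."""
    params = urllib.parse.urlencode({
        "query.cond": condition,               # free-text condition search
        "filter.overallStatus": "RECRUITING",  # only actively recruiting trials
        "pageSize": limit,
    })
    return f"{API}?{params}"

def fetch_recruiting_trials(condition: str, limit: int = 5) -> list[dict]:
    """Fetch trial IDs and titles (performs a network request)."""
    with urllib.request.urlopen(build_url(condition, limit)) as resp:
        data = json.load(resp)
    results = []
    for study in data.get("studies", []):
        ident = study["protocolSection"]["identificationModule"]
        results.append({
            "nctId": ident["nctId"],
            "title": ident.get("briefTitle", ""),
        })
    return results

# Example (requires network access):
#   for t in fetch_recruiting_trials("pulmonary fibrosis", limit=3):
#       print(t["nctId"], "-", t["title"])
```

The same query can of course be pasted into an AI assistant as a prompt — the point is that the underlying registry is machine-readable, which is exactly why AI tools can search it faster than a human can browse it.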
The gap between AI-augmented healthcare and everything else is widening every month. Start navigating it this week.
Source references
SCI
Scispot — AI Diagnostics Revolutionising Medical Diagnosis (2025). MIT/MGH: 94% AI vs 65% human for lung nodules. Korea study: 90% vs 78% breast cancer sensitivity. Aidoc deployed in 900+ hospitals.
scispot.com →
RAM
Ramsoft — How Accurate Are AI Diagnostics in Radiology? (2025). 53% workload reduction. 49.8% of interval cancers flagged by AI missed by humans. 54% US hospital AI adoption. 95%+ accuracy for lung cancer and retinal disorders.
ramsoft.com →
NPJ
npj Digital Medicine — AI vs. Physician Diagnostic Accuracy Meta-Analysis (2025). 83 studies. AI matches non-expert physicians overall but performs worse than experts. No significant overall performance difference (p=0.10).
nature.com →
OD
OncoDaily — AI Transforming Cancer Care (2025). Prostate cancer AI: AUROC 0.91 vs 0.86 radiologists, 6.8% more cancers detected. AI mammography: sensitivity matching or exceeding double-read.
oncodaily.com →
ISO
Fortune — Isomorphic Labs prepares human trials (Jul 2025). $600M raised April 2025. Oncology human trials imminent. Eli Lilly $45M + $1.7B milestone deal. Nobel Prize 2024 for AlphaFold.
fortune.com →
PMC
PMC — From Lab to Clinic: AI Drug Discovery Timelines (2025). Insilico ISM001-055: 18 months to preclinical, $150K cost. Exscientia DSP-1181: under 12 months, first AI drug in human trials. $3.3B VC funding 2024.
pmc.ncbi.nlm.nih.gov →
DDN
Drug Discovery News — How AI is Transforming Drug Discovery (Oct 2025). 3,000+ AI-developed drugs in pipeline. $3.3B VC in 2024. FDA draft guidance on AI in drug development. Oxford: 54-gene Alzheimer's analysis in days vs weeks.
drugdiscoverynews.com →
BPC
Bipartisan Policy Center — FDA Oversight of Health AI (Dec 2025). 1,250+ AI devices authorised July 2025. FDA staff down 15%. FDA's own AI tool "Elsa" powered by Anthropic's Claude for internal documents.
bipartisanpolicy.org →
FRONT
Frontiers in Medicine — Decade of AI Medical Device Regulation (Jun 2025). 43% of FDA-approved AI devices lack clinical validation. 28% prospectively tested. Under 30% disclose training data demographics.
frontiersin.org →
ME
Medical Economics — The New Malpractice Frontier (Feb 2026). Liability unclear: developer vs hospital vs physician. Self-driving car liability analogy. Malpractice insurers adding AI policy riders. Small practice equity concerns.
medicaleconomics.com →
BB
blueBriX — The 2026 AI Reset: New Era for Healthcare Policy (Jan 2026). 250+ state AI bills by mid-2025. GE Healthcare: 58 FDA-cleared AI tools. Viz.ai gold standard for stroke triage. Lifecycle-based regulatory model shift.
bluebrix.health →
PMC2
PMC — The Illusion of Safety: Report to FDA on AI Healthcare (Jun 2025). Equity framework requirements. Bias risk in unrepresentative training data. Case for specialised regulatory body with technical expertise.
pmc.ncbi.nlm.nih.gov →
IL
IntuitionLabs — AI Medical Devices: 2025 Status, Regulation & Challenges. $13.7B market 2024, projected $255B by 2033. FDA liability currently falls on clinician/institution. 100 new approvals per year pace.
intuitionlabs.ai →
05 — Stay ahead of the shift
Veltrix Collective
The future of healthcare
is already here.
Weekly briefings on AI tools, adoption trends, and what actually matters — across healthcare, work, and daily life. No hype. Just signal. Join readers navigating the shift.
Weekly, every Tuesday · No spam ·
Privacy policy · Unsubscribe anytime