Your Next Doctor or Practitioner Might Be AI: But Not Quite as You May Think

AI has changed the dynamic between patient, clinician and practitioner. Will we ever see a future of digital AI clinicians and practitioners? Yes, but it might also increase the value of human interaction.

An early signal we’ve been tracking for some time is the role of AI companions in healthcare. The inevitability of ChatGPT-style interactions being implemented in healthcare settings to help people interpret results, explain conditions, diagnose and prescribe treatments makes this primed for Elevate Ninety potential (an innovation that scores 90+ on our innovation foresight index, identifying it as an early signal with the potential to be highly disruptive and a market-shaping force).

Already, people are increasingly turning to ChatGPT for advice on their health and well-being needs, prompting OpenAI to announce plans to develop its own health apps. The NHS, meanwhile, is developing its own large language model (LLM) interface: in June 2025, it announced plans to roll out ‘My Companion’ as a feature of the NHS App by 2028.

A companion, not a replacement

While the centre of gravity in care is tilting toward AI companions that improve patient experience, AI also offers a useful companion on the supply side. Hospitals are adopting ambient AI to free clinicians from note-taking, speeding up documentation, shifting time back to patient conversations and improving clinicians’ ability to diagnose and treat patients. Just look at HCA’s expansion with Commure, which shows how AI will ‘go beyond simple automation to enhance clinical workflows’.

And regulators are normalising AI: the FDA now tracks 1,000+ AI/ML-enabled devices on the market, and UK bodies like NICE are actively shaping evidence standards for AI dermatology triage.

What Trends are Driving Adoption?

If you zoom out for a moment, the rise of AI health companions isn’t happening in a vacuum. It’s being pulled forward by a cluster of macro-forces that are all pointing in the same direction: we simply won’t be able to deliver modern healthcare without an intelligent layer sitting between people, providers and data.

The healthcare system is overstretched and AI fills the gaps.

Demand is rising, clinicians are burning out, and most systems don’t have the workforce capacity to keep up. Intelligent companions become the first line of “explanation and navigation”, translating jargon, guiding next steps, and reducing unnecessary appointments. It’s triage, admin, and basic decision support, but at population scale. This makes LLMs not just useful… but inevitable.

Consumers already made the leap before the system did.

People are now more comfortable asking ChatGPT a sensitive health question than Googling it. And unlike early symptom checkers, LLMs talk back in context, build rapport, remember preferences, and give personalised answers that feel human. Once behaviour changes upstream, institutions have no choice but to follow.

Clinical workflows are finally becoming ‘AI-native’.

Ambient clinical AI is the quiet revolution here. When hospitals see that speech-to-notes automation unlocks an extra hour of face-to-face time per shift and improves diagnostic accuracy because clinicians aren’t multitasking, the business case writes itself. This is why we’re seeing the shift from “AI tools” to “AI teammates”. It’s workflow-level augmentation, not a bolt-on.

Regulators are moving from cautious to constructive.

We’ve crossed the threshold: regulation is no longer the blocker, it’s becoming the scaffolding. The FDA’s 1,000+ cleared AI/ML devices, NICE’s evidence frameworks, and NHS England building standards for LLM safety all signal a world where AI companions are not experimental, they’re expected. Once norms solidify, adoption accelerates.

Health data is fragmenting and companions become the glue.

People now generate more personal health data than ever: wearables, labs, diagnostics, gym platforms, nutrition apps. But that data is useless unless something can interpret it, contextualise it, and provide next-step guidance. This is where LLM companions shine: they become the interface layer, a personal translator between raw data and actionable insight.

The economics are irresistible.

Every major health system is looking for scalable, low-friction ways to reduce cost per patient without compromising care. AI companions are one of the few solutions that scale faster than demand. Once CFOs realise that AI can simultaneously improve satisfaction, reduce admin load, and free up clinician time, resistance evaporates.

Signals We Are Tracking This Week:

Below are the clearest, highest-impact signals from recent sources showing the AI health-companion shift accelerating across sectors, and why they score high on Elevate Ninety’s Foresight Index.

1. AI Companions and LLM Triage CoPilots Are Becoming Health Systems

Sectors: Digital Health & Fitness, Medical/Pharma

Relevant Foresight Index Innovations: AI-Generated Health Companions / Chat-Based Coaching Interfaces, Multi-Device Health Coaching Hub, Unified Lab Results Vault for Consumers

  • Signal: PureHealth’s Pura app rolled out AI health-companion features at population scale, now covering 600k+ users and integrating labs, wearable data and personalised nutrition guidance. This is a national health operator reframing a companion as a core access point, not a wellness add-on.

  • Signal: TruDoc Healthcare launched a new app for the GCC region and India which bundles 24/7 virtual doctor consults with AI capabilities such as a medical test results summariser, prescription ordering, facial scanning for vital signs and wearable data integration.

  • Signal: Thailand’s National Health Security Office launched “Doctor Home”, described as an AI-powered smart physician offering preliminary health screening via a national platform – effectively a state-backed virtual GP front door.

  • Why it matters: High Foresight Index indicators, especially Adoption Potential and Macro-Trend Alignment, as health systems formalise AI as the front door to care. This is massive validation for the “multi-device health OS” model, where LLMs become co-clinicians that reduce workload, pre-screen cases and accelerate routing.

2. AI-Driven Preventative Cardiology Hits Clinical Momentum

Sectors: Digital Health & Fitness, Medical/Pharma, Fitness & Exercise

Relevant Foresight Index Innovations: AI-Enabled Preventative Cardiology Platforms, Continuous Multi-Biomarker Monitoring Platforms, Wearable Blood Pressure Patch, Implantable Continuous Blood Monitors

  • Signal: PMcardio’s AI-ECG platform reported positive randomised trial results: its Queen of Hearts AI-ECG model for STEMI detection showed that AI-enhanced ECGs can reduce false cath-lab activations and speed up diagnosis.

  • Signal: Meanwhile, vTitan launched vCardio, an edge-AI wearable cardiac monitor that provides continuous ECG at home and can detect 20+ arrhythmias with on-device processing.
  • Why it matters: Cardiology is shaping up to be the first mainstream category where consumers rely on a continuous AI companion to detect disease risk before symptoms. This pushes Market Potential and Adoption Potential into Elevate Ninety territory.

3. Seizure-Prediction Wearables Move from R&D to Market

Sectors: Digital Health & Fitness, Medical/Pharma, Mental Wellbeing

Relevant Foresight Index Innovations: AI Seizure Prediction Wearables, Continuous Multi-Biomarker Monitoring Platforms

  • Signal: Neuraxpharm and mjn-neuro announced the launch of EPISERAS®, an AI-powered earpiece for continuous seizure prediction that has secured CE and UKCA approvals. A multi-country real-world data study is now underway, with interim data being prepared for publication.
  • Why it matters: This is one of the clearest examples of a narrow, high-stakes intelligent companion: an always-on, behavioural-context-aware, and clinically validated wearable. Strong Uniqueness and Longevity scores, and early signs of a category that will expand beyond epilepsy into broader neurological prediction markets.

Other signals to take note of:

Safe deployment of large language models (LLMs) is still evolving:

  • Regulators and clinicians are pushing back on “therapy via general-purpose LLMs”. Over the last three months, major media and professional bodies (e.g. NHS clinicians, KCL researchers, the FT, the Guardian) have highlighted unsafe or harmful advice from general-purpose LLMs when used as DIY therapy, and warned that AI companions marketed to young people blur the boundaries between “friend”, “therapist” and “influencer”.

Foresight Index Innovation Highlight

The signals point towards the importance of practitioner-led LLMs as concern grows over the accuracy of information for consumers using LLMs to seek DIY resolutions to health issues. If there’s one innovation that shifts AI from being the practitioner and therapist to being a co-pilot to a qualified expert, it’s the LLM Triage Co-pilot for health and fitness professionals.

LLM Triage Copilot for Health & Fitness Professionals

Unlike general-purpose chatbots pitched directly at the public, these systems sit behind the scenes, supporting coaches, PTs, nutritionists, and allied health professionals with structured triage, risk flagging, program adjustments and evidence-based guidance.

While regulators and practitioners are highlighting the dangers and risks of using LLMs for DIY-led consultations, this backlash doesn’t slow the category. In fact, it shifts it: it moves the centre of innovation from consumer self-help tools to professionally supervised AI, where safety, traceability and scope-of-practice constraints are built in from day one.

The triage copilot is the beneficiary of this shift. It’s the model regulators prefer, clinicians trust, and professional bodies can meaningfully integrate.

Our Foresight Index scores the innovation across our five innovation factors (scores out of 100):

Foresight Index Innovation Scores:

Adoption Potential: 86

Market Potential: 88

Longevity: 86

Innovation Uniqueness: 88

Macro Trend Alignment: 92

Foresight Index Score: 88

Future Elevate Zone Potential: Yes
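
The overall score above is consistent with a simple unweighted average of the five factors, which can be checked directly. This is a minimal sketch; the exact weighting used by the Foresight Index is our assumption:

```python
# Hypothetical check: assumes the Foresight Index Score is the
# unweighted mean of the five published factor scores.
factors = {
    "Adoption Potential": 86,
    "Market Potential": 88,
    "Longevity": 86,
    "Innovation Uniqueness": 88,
    "Macro Trend Alignment": 92,
}

foresight_score = sum(factors.values()) / len(factors)
print(foresight_score)  # 88.0, matching the published score of 88
```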

Our Foresight Point of View

In the short term (1–3 years), AI becomes the first-pass assistant. Expect widespread adoption across gyms, clinics and digital coaching platforms:

  • structured intake and red-flag screening,
  • automated progress checks,
  • instant evidence retrieval and program adjustment suggestions.
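
To make “structured intake and red-flag screening” concrete, here is a minimal, hypothetical sketch of a deterministic screening step a copilot might run before any model output reaches a client. The keyword list, routing labels and function name are illustrative assumptions, not any vendor’s actual implementation:

```python
# Hypothetical first-pass triage step: a deterministic red-flag screen
# that runs BEFORE any LLM involvement, so high-risk intakes are always
# escalated to a human professional regardless of model output.
RED_FLAGS = {"chest pain", "shortness of breath", "fainting", "numbness"}

def screen_intake(intake_text: str) -> dict:
    """Return a triage decision for a free-text client intake form."""
    text = intake_text.lower()
    flags = sorted(f for f in RED_FLAGS if f in text)
    if flags:
        # Scope-of-practice guardrail: the copilot never advises here.
        return {"route": "escalate_to_professional", "flags": flags}
    return {"route": "copilot_assist", "flags": []}

print(screen_intake("Mild knee soreness after long runs"))
# {'route': 'copilot_assist', 'flags': []}
print(screen_intake("Occasional chest pain when climbing stairs"))
# {'route': 'escalate_to_professional', 'flags': ['chest pain']}
```

Real systems would use clinically validated criteria and far richer NLP than keyword matching; the point is the routing pattern, where escalation is hard-coded rather than left to the model.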

Professionals don’t lose control, they gain a safety net. This will become the new baseline expectation across hybrid coaching and health apps.

In the mid term (3–5 years), co-pilots integrate with full health data ecosystems. We’ll see:

  • interoperability with EHRs, wearables, labs and recovery tools
  • risk scoring that adapts dynamically to the client’s real data
  • shared decision-making workflows between practitioners, AI, and clients

The co-pilot becomes the connective tissue across fragmented data, reducing admin, strengthening boundaries, and making hybrid care feel more “continuous” and personalised.

Finally, over the long term (5–10 years), we see the emergence of regulated, sector-specific AI para-professionals. By the end of the decade, LLM co-pilots evolve into licensed digital assistants with regulated scopes of practice. They’ll handle:

  • autonomous low-risk triage
  • ongoing monitoring and behavioural nudges
  • practitioner-level documentation and care summaries
  • escalating complex cases to humans

The human professional remains in charge, but the copilot becomes a constant presence: always-on risk detection, contextual guidance, and quality control.

Ultimately, we foresee a future where the practitioner is augmented, not automated. And the organisations that win are the ones who adopt these tools early enough to build new workflows around them.

The Takeaway

AI health companions are no longer a fringe experiment, they’re becoming the front door to care. Health systems are overstretched, consumers have already adopted LLM-style interactions, and regulators are now building the scaffolding for safe deployment. The real shift isn’t “AI replacing clinicians and practitioners”, but AI acting as a companion layer that interprets data, supports decisions, flags risks, and stitches together fragmented health information. The momentum is moving away from DIY AI therapy and toward professionally supervised copilots that extend the capabilities of practitioners, not override them.

Founders and CEOs Takeaway

AI companions are quickly becoming a table-stakes capability for any organisation operating in health, fitness, or wellness. This isn’t about keeping up with technology; it’s about future-proofing care models and operational efficiency. The winners will be those who adopt AI copilots early, redesign workflows around them, and position their brand as augmented, not automated. If your product or service touches data, triage, personalisation, or guidance, AI integration must move from roadmap to reality.

In a world where health systems are stretched and data is multiplying, the organisations that thrive will be those who treat AI not as a replacement for clinicians, but as the intelligent layer that elevates every decision, every interaction, and every outcome.

Marketers Takeaway

The narrative is shifting from AI as novelty to AI as trust multiplier. Consumers want clarity, empathy and guidance, not overwhelming dashboards. Your opportunity is to communicate AI not as a robot doctor, but as a support layer that improves experience, personalisation and outcomes. Messaging that blends safety, supervision, and partnership will land best. This is also a chance to reposition your brand as “intelligently enabled” without promising automation that users don’t want.

Innovators Takeaway

The highest-value opportunities sit in supervised intelligence, not consumer-facing LLMs pretending to be therapists or clinicians. Build tools that:

  • augment practitioners,
  • integrate fragmented data,
  • enforce scope of practice boundaries,
  • support decision-making, not replace it.

This is where regulators are supportive, where adoption curves are steepest, and where the Elevate Ninety Index consistently scores 85+. If you’re designing AI for health, design for workflow, not just conversation.

Investors Takeaway

The investable space is shifting from general-purpose chatbots to verticalised, compliance-ready AI copilots. The strongest signals are in:

  • LLM triage copilots,
  • AI-assisted preventative cardiology,
  • multi-biomarker monitoring with AI interpretation,
  • regulated AI companions for high-stakes conditions.

These categories show repeatable evidence gains, regulatory momentum, and clear cost-saving value propositions. Expect early winners to become infrastructure companies (not just apps) sitting between data, provider, and patient.

Practitioners Takeaway

AI isn’t here to replace your clinical judgement; it’s here to give you time back, reduce cognitive load, and help you operate at the top of your licence. The shift to AI-native workflows means:

  • fewer admin hours,
  • better documentation,
  • earlier risk detection,
  • more meaningful client time.

And because LLM copilots operate within your professional scope, you stay in full control. Over the next decade, these copilots will quietly become your default co-worker, always-on, always cross-referencing data, and always supporting your decisions. The future practitioner is not replaced; they’re amplified.

Subscribe to Elevate Newsletter

Receive our digestible weekly newsletter covering key search trends, an industry round-up and quick tips, and stay up to date on industry events and campaigns.
