AMA introduces AI governance toolkit for healthcare systems

The American Medical Association has created a new toolkit to guide healthcare systems in establishing a governance framework for implementing and scaling artificial intelligence systems.

AMA leaders say the initiative was prompted by a dramatic increase in physicians' use of AI since 2023, and it aims to address the benefits and risks – including liability and patient safety concerns – of AI and machine learning deployments.

Why it matters
After studying how physicians' use of artificial intelligence has grown over the past two years, AMA developed recommendations addressing the benefits and unforeseen consequences of large language models, such as generative pretrained transformers, and other AI-generated medical advice or content.

The STEPS Forward Governance for Augmented Intelligence toolkit, developed with support from Manatt Health, can help provider organizations identify, assess and prioritize AI usage risks to better ensure patient safety and care equity.

The purpose of the guide is to help practices identify risks and spell out when AI can be used in an overarching AI governance policy that is not burdensome to produce, AMA said in its announcement on Monday.

AMA’s pillars of responsible AI adoption are:

  • Establishing executive accountability and structure.
  • Forming a working group to detail priorities, processes and policies.
  • Assessing current policies.
  • Developing AI policies.
  • Defining project intake, vendor evaluation and assessment processes.
  • Updating standard planning and implementation processes.
  • Establishing an oversight and monitoring process.
  • Supporting AI organizational readiness.

The AMA toolkit provides resources to help providers evaluate existing policies, including guidance on what to include in an AI governance policy and a downloadable model policy that can be modified to align with an organization's existing governance structure, roles, responsibilities and processes.

Dr. Margaret Lozovatsky, AMA's chief medical information officer and vice president of digital health innovations, said in a statement that healthcare AI technology is evolving faster than hospitals can implement tools, and stressed the importance of governance.

Elaborating further, she told Healthcare IT News by email on Tuesday that “Physicians must be full partners throughout the AI lifecycle, from design and governance to integration and oversight, to ensure these tools are clinically valid, ethically sound and aligned with the standard of care and the integrity of the patient-physician relationship.”

While there is excitement about AI's transformative potential to enhance care, streamline operations and improve outcomes, "At the same time, there is concern about AI's potential to worsen bias, increase privacy risks, introduce new liability issues and offer seemingly convincing yet ultimately incorrect conclusions or recommendations that could affect patient care," she said. "The rapid adoption of AI in healthcare must be guided by physician leadership and robust organizational governance to ensure AI technologies are implemented into care settings in a safe, ethical and responsible manner."

The larger trend
AMA has said that providers’ clinical experts are best positioned to determine whether AI applications are high quality, appropriate and valid from a clinical perspective.

And when implementing, managing and scaling clinical AI, organizations must communicate to clinicians and patients how AI-enabled systems or technologies directly impact medical decision-making and treatment recommendations at the point of care.

This past year, AMA released its report, “The 2024 Future of Health: The Emerging Landscape of Augmented Intelligence in Health Care,” which was largely based on the organization’s 2023 AI Physician Sentiment survey of more than 1,000 physicians, interviews with AI experts and a roundtable discussion with specialty society representatives.

The use of AI tools was “not pervasive – 62% of respondents indicated they do not use a listed set of AI tools in their practice today,” said AMA researchers in the study.

The survey had asked which AI use cases doctors currently incorporate into their practices, listing a myriad from automation of insurance pre-authorization, documentation and charting, discharge instructions, care plans and progress notes to patient-facing chatbots, predictive analytics, case prioritization support and more.

However, when AMA repeated the survey the following year, it found a dramatic increase in the number of physicians using these AI tools: 66% of physicians reported using AI systems in 2024, up from 38% the year prior, AMA said.

While physicians had concerns about flawed AI, use of AI in practice nearly doubled, to 66% from 38% in 2023.

“The dramatic drop in non-users (62% to 33%) in just one year is impressive and unusually fast for healthcare technology adoption,” said AMA leaders in the 2024 survey report. “Significantly more physicians are currently using AI for visit documentation, discharge summaries and care plans and medical research and standard of care summaries than in 2023.”

Nearly half (47%) ranked increased oversight as the No. 1 regulatory action needed to increase trust in adopting AI tools.

“One of the most significant concerns raised by physicians regarding the use of AI in clinical practice is concern over potential liability for use of AI that ultimately performs poorly,” AMA noted in its deep dive into the shift in physicians’ AI sentiments.

For its part, AMA advocates for governance policies that help ensure physicians' risks arising from AI are mitigated to the greatest extent possible.

Physicians must engage with AI “that satisfies rigorous, clearly defined standards to meet the goals of the quadruple aim, advance health equity, prioritize patient safety and limit risks to both patients and physicians,” AMA said.

On the record
“Setting up an appropriate governance structure now is more important than it’s ever been because we’ve never seen such quick rates of adoption,” said Lozovatsky in a statement.

"Effective organizational governance is essential to ensure that AI systems support rather than disrupt clinical workflows, embed ongoing clinical oversight, uphold care quality and provide clear mechanisms for accountability," she said Tuesday.