AI in mental health care: Where to draw the line?

(The Center Square) – Artificial intelligence has become a standard tool in most industries, including health care.

Yet, institutions are still working to draw lines around acceptable use within their specific contexts.

For instance, schools may tell students they can use the technology to craft an outline but not to write a paper. For mental health care, one legislator has proposed that the technology should go no further than administrative tasks.

Rep. Melissa Shusterman, D-Paoli, has introduced legislation in the state House that would prohibit mental health professionals from using AI for any tasks beyond administrative work.

“As artificial intelligence (AI) is increasingly being used in mental and behavioral health care settings, the potential risks associated with providing inaccurate, biased, or inconsistent medical recommendations can undermine patient care, resulting in substandard services and possible harm to the patient,” wrote Shusterman in a memo supporting the bill.


As it stands, many professionals use AI to assist with scheduling, accounting, notetaking, and emailing. Today's business software across industries typically includes some level of AI, often operating automatically, like the phrasing and grammar suggestions in an email or word processing app.

Clinical uses, however, offer a brave new world of medical care. Medical schools are incorporating AI tools into their educational framework, heralding its potential uses from diagnosis to monitoring and therapeutics, but states are still working out the specifics of regulation. Blanket prohibitions could derail or delay innovations some hope will fundamentally change our relationship to disease.

For its part, the World Health Organization has outlined its ethical concerns around AI while detailing core principles for its use. In the U.S., thinking on the matter at the federal level has shifted drastically with different administrations, leaving an open field for experimentation.

Shusterman noted that there are no FDA-approved chatbots.

Outside of clinical settings, there has been growing alarm about increased personal use of chatbots for therapeutic purposes.

The personalized nature of the relationship between user and bot has led to mixed results. While some adopters find AI to be a useful tool to talk their thoughts out, instances of self-harm and even what has been dubbed “AI psychosis” have experts wondering if the benefits are worth the risks.


In response to these concerns, ChatGPT parent company OpenAI has said it scans user conversations for signs of trouble. When such signs are spotted, human reviewers assess the case and take appropriate action, whether that means suspending an account or reporting a user to the authorities.

Shusterman’s legislation won’t stop incidents like these among users working directly with AI, but it could stop mental health care professionals from becoming overly reliant on the technology to answer tough questions. Diagnostic criteria are often complex, and symptoms and behaviors overlap across a wide variety of mental disorders.

“Mental health care requires a personalized approach based on human emotion, education and professional experience, as well as standards of ethics,” wrote Shusterman. “AI can be permitted to function as a supplementary administrative tool but not replace the expertise of health care providers.”
