    Talkative, leaky, and not human-like

By healthadmin | April 17, 2026



Vince Lahey of Carefree, Arizona, embraces chatbots. From big-tech products to “shady” ones, he says, they offer “someone with whom you can share more secrets than a therapist.”

He particularly likes the apps for feedback and support, even when that feedback amounts to a reprimand over his run-ins and fights with his ex-wife. “It made me want to share more,” Lahey said. “I don’t care about their perception of me.”

    There are many people like Lahey.

Demand for mental health care is increasing. The number of self-reported days with poor mental health has risen by 25% since the 1990s, according to a study analyzing survey data. And according to the Centers for Disease Control and Prevention, the suicide rate in 2022 was on par with the rate in 2018, which was the highest in nearly 80 years.

Many patients are drawn to nonhuman therapists powered by artificial intelligence, which can seem more appealing than lying on a couch before a stern stranger. Social media is full of videos of people complaining about therapists who “don’t have the time” or show poor judgment, or pleading for ones that simply charge less.

Tom Insel, former director of the National Institute of Mental Health, said most people who need care aren’t getting it, citing research from his former agency. Of those who do, only 40% receive even “minimally acceptable care.”

    “There’s a huge need for quality care,” he says. “In scientific terms, we’re in a world where things are really bad.”

    Insel said that last fall, OpenAI engineers told him that about 5% to 10% of the company’s user base, which at the time was about 800 million people, relied on ChatGPT for mental health support.

    Polls suggest that these AI chatbots could become even more popular among young people. A KFF poll found that nearly 3 in 10 respondents between the ages of 18 and 29 have turned to an AI chatbot for mental or emotional health advice in the past year. Uninsured adults were approximately twice as likely to report using AI tools compared to insured adults. Additionally, nearly 60% of adult respondents who used chatbots for mental health did not follow up with a live professional.

    The app puts you on the couch

The burgeoning app industry gives AI therapists human-like, often unrealistically attractive avatars that serve as sounding boards for people experiencing anxiety, depression, and other symptoms.

    KFF Health News confirmed in March that there were approximately 45 AI therapy apps on Apple’s App Store. Although many services charge high fees, with some annual plans costing $690, they are generally cheaper than talk therapy, which can cost hundreds of dollars an hour without insurance coverage.

In the App Store, “therapy” often functions as a marketing term, with fine print noting that an app cannot diagnose or treat a disease. One app branded OhSofia! AI Therapy Chat has reached six-figure downloads, the company said. Founder Anton Ilyin launched it in December.

“People are seeking therapy,” Ilyin says, and the product’s name promises “therapy chat.” Yet the company warns in its privacy policy that it “does not provide medical advice, diagnosis, treatment, or crisis intervention, nor is it intended to be a substitute for professional medical services.” Because of that disclaimer, executives say they don’t expect confusion.

Some apps promise big results with nothing to back them up. One company promises users “instant help during a panic attack.” Another claims it is “proven to be effective by researchers” and reduces anxiety and stress 2.3 times faster. (It does not say faster than what.)

Vaile Wright, senior director of the American Psychological Association’s Office of Health Care Innovation, said there are few legal or regulatory guardrails on how developers describe their products, or even on whether the products are safe or effective. Even federal patient privacy protections don’t apply, she said.

“Therapy is not a legally protected term,” Wright said. “So basically anyone can do therapy.”

John Torous, a psychiatrist and clinical informaticist at Beth Israel Deaconess Medical Center, said many apps “overrepresent themselves.” “Deceiving people into thinking they’ve received treatment when they actually haven’t has many negative consequences, including delaying actual treatment,” he said.

    States such as Nevada, Illinois and California are trying to sort out the regulatory mess by enacting laws that prohibit apps from describing their chatbots as AI therapists.

    “It’s a profession. People go to school. They get a license to do it,” said Nevada state Rep. Jovan Jackson, co-author of a bill that would prohibit apps from calling themselves mental health professionals.

Undercutting the hype, outside researchers, and in some cases company representatives themselves, have told the FDA and Congress that there is little evidence these products are effective. Studies have yielded contradictory answers, with some suggesting that companion-style chatbots are “consistently bad” at crisis management.

“When it comes to chatbots, we don’t have enough evidence that they work,” says Charlotte Blease, a professor at Sweden’s Uppsala University who specializes in trial design for digital health products.

She attributed the lack of high-quality clinical trials in part to the FDA, which has not laid out how such products should be tested. “FDA does not provide strict advice on what the standards should be.”

    In response, Department of Health and Human Services spokeswoman Emily Hilliard said that “patient safety is the FDA’s top priority” and that AI-based products are subject to agency regulations that require them to demonstrate “reasonable assurance of safety and effectiveness before being sold in the United States.”

Toxic apps

Preston Roche, a psychiatry resident who is active on social media, gets a lot of questions about whether AI makes a good therapist. Having tried ChatGPT himself, he said he was initially “impressed” by how it helped him “test out” negative thoughts using cognitive behavioral therapy techniques.

But Roche said he became disillusioned after seeing social media posts about people developing mental health problems and being encouraged to make harmful decisions. He concluded that the bots are sycophants.

“When you weigh that against the responsibilities of a therapist, it’s completely at odds,” he says.

This propensity for pandering, in which apps built on large language models empathize with, flatter, or even deceive their human interlocutors, is inherent in their design, digital health experts say.

    “These models were developed to answer your questions and prompts and give you what you’re looking for, and they’re basically very good at affirming what you’re feeling and providing psychological support like a good friend would,” said Insel, a former NIMH director.

    But that’s not what a good therapist does. “The point of psychotherapy is primarily to get you to deal with the things you’ve been avoiding,” he says.

While polls show that many users are happy with what they get from ChatGPT and other apps, there have also been high-profile reports of the services offering advice and encouragement for self-harm.

Additionally, at least a dozen lawsuits alleging wrongful death or serious harm have been filed against OpenAI after ChatGPT users died by suicide or were hospitalized. In most of these cases, plaintiffs say they began using the app for a mundane purpose, such as schoolwork, before the conversations took a darker turn. The lawsuits have been consolidated into a class action.

    Google and Character.ai, a Google-funded startup that creates “avatars” that take on specific personas such as athletes, celebrities, fellow researchers and therapists, have settled other wrongful death lawsuits, according to media reports.

OpenAI CEO Sam Altman has said that as many as 1,500 people a week may talk to ChatGPT about suicide before taking their own lives.

In a public Q&A reported by the Wall Street Journal, Altman referred to a specific ChatGPT model introduced in 2024, saying, “We’ve seen problems where using models like 4o can make people in mentally vulnerable situations even worse,” adding, “I don’t think this will be the last time we face challenges like this with a model.”

    An OpenAI spokesperson did not respond to a request for comment.

The company said it is working with mental health experts on safety measures, including referring users to the 988 national suicide and crisis hotline. The lawsuits against OpenAI allege that existing safeguards are not sufficient, however, and some research suggests those safeguards can degrade over long conversations. OpenAI has published its own data suggesting the opposite.

OpenAI has defended itself in court, with arguments ranging from denying that its product caused self-harm in the early stages of one case to claiming that users misused the product by steering it into discussions of suicide. The company says it is also working on improving safety features.

    Smaller apps also rely on OpenAI or other AI models to power their products, executives told KFF Health News. Startup founders and other experts said in interviews that they fear that if companies simply import these models into their services, they could replicate safety flaws present in the original products.

Data risks

A KFF Health News review of the App Store found that the listed age restrictions are minimal: 15 of the approximately 40 apps were listed as downloadable by users as young as 4 years old, and a further 11 by users 12 and older.

Privacy standards are opaque. Although the App Store listings said some apps do not track personally identifiable data or share it with advertisers, the companies’ own privacy policies said otherwise, describing the use of such data and its disclosure to advertisers such as AdMob.

In response to a request for comment, Apple spokesperson Adam Dema sent a link to the company’s App Store policies, which prohibit apps from using health data for advertising and require them to disclose how data is used. Dema did not respond to requests for further comment on how Apple enforces those policies.

Researchers and policy advocates said that sharing psychiatric data with social media companies means patients could be profiled and identified. They could then be targeted by sellers of risky treatments or charged different prices for products depending on their health status.

KFF Health News reached out to multiple app makers about these discrepancies. The two companies that responded said their privacy policies had been drafted in error and promised to change them to reflect their actual stance on advertising. (A third, OhSofia!’s team, said only that the app’s privacy policy lets users “opt out of marketing communications” and that it does not advertise.)

    One executive told KFF Health News there is business pressure to maintain access to data.

    “My general feeling is that a subscription model is far superior to any advertising,” said Tim Rubin, founder of Wellness AI, adding that he would be changing the wording of the app’s privacy policy.

One investor, he said, had advised him not to give up advertising: “That’s essentially what they see as the most valuable thing about having an app like this, and that data.”

    “I think we’re still at the beginning of a revolution that’s coming in the way people seek psychological support and, in some cases, therapy,” Insel said. “And what concerns me is that there is no framework for any of this.”


