    Perpetrators of AI sexual abuse often view their actions as a joke, new study shows

By healthadmin | May 7, 2026


A survey of more than 7,000 people in Australia, the UK, and the US found that 3.2% of respondents reported creating, sharing, and/or threatening to share sexual deepfakes. Men, young adults, non-white respondents, and people with disabilities were more likely to engage in this behavior. In addition, 18% of respondents reported intentionally viewing such images, mostly out of curiosity. The study was published in Computers in Human Behavior.

    Sexual deepfakes are synthetic sexual images, videos, or audio recordings created or altered using AI or other digital tools. These are usually created to make it appear as if a real person is naked, engaging in sexual acts, or saying sexual things, even though that is not actually happening. Sexual deepfakes can use a real person’s face, body, voice, and likeness and combine them with fabricated sexual content.

Many sexual deepfakes are non-consensual, meaning the person depicted did not consent to the creation or sharing of the material. Non-consensual sexual deepfakes can be used for harassment, humiliation, intimidation, revenge, and sexual exploitation. Even when viewers know the content is fake, such material can damage an individual’s reputation, privacy, safety, relationships, and mental health.

    Although the sexual content displayed in such deepfakes is not real, the harm they cause can still be very real if the deepfakes depict a real, identifiable person. For this reason, sexual deepfakes are increasingly being treated as a serious legal and ethical issue in many jurisdictions. In research and policy development, these are typically described as synthetic sexualized media depicting an identifiable person without their consent.

Study author Rebecca Umbach and her colleagues wanted to find out how often people engage in what they call AI-generated image-based sexual abuse. This behavior includes the non-consensual creation of AI-generated intimate images (i.e., sexual deepfakes), the non-consensual sharing of such images, and threats to share them. The authors also examined how many people view such images and how often. More specifically, they were interested in content generated using a variety of platforms, from tools that use AI to digitally remove clothing and generate explicit synthetic content, to more sophisticated deepfake generators and custom-built models.

They conducted an online survey of 7,231 respondents from Australia, the UK, and the US, recruited through Sago, a market research company with its own online panel. Approximately 2,400 people responded from each of the three countries, which the authors said were selected based on evidence of high “deepfake porn” traffic. Roughly 50–51% of participants were women, 12–13% identified as LGBTQ+, and 18–20% reported having a disability.

    The survey directly asked participants whether they engaged in the nonconsensual creation, sharing, or threat of sharing digitally altered sexual images. For example, participants were asked, “How many times, since you turned 18, have you posted, sent, or shown a fake or digitally altered nude/sexual image (photo or video) of someone (over 18) without their permission?”

The survey also asked about participants’ demographics, their relationship to the person depicted in the sexual content (e.g., “former sexual partner,” “family member,” “acquaintance”), and their motivations. Participants were additionally asked whether they had ever intentionally viewed AI-generated nude or sexual photos or videos of celebrities, public figures, influencers, or ordinary people. Those who had were asked why they viewed the images, what made them believe the images were AI-generated, and how they felt when viewing them.

Results showed that 3.2% of participants had engaged in at least one of the three behaviors the study authors considered to be AI-generated image-based sexual abuse; in other words, 3.2% of people reported creating, sharing, or threatening to share a sexual deepfake. This rate varied by country: 6.1% in the UK, 3.5% in Australia, and 2.6% in the US.

In addition, 1.4% of respondents reported having created, shared, or threatened to share digitally altered sexual images made without AI, and 0.5% were unsure whether AI had been involved in the images they manipulated. A further 0.3% of participants reported having threatened to share digitally altered images that did not actually exist.

Further analysis showed that men, younger adults, non-white participants, and participants with disabilities were more likely to engage in these behaviors. (Initially, lower education levels also appeared to predict perpetration, but this relationship disappeared once the researchers controlled for other demographic factors. A similar adjustment eliminated the difference between white and non-white respondents among UK participants.) Participants most often reported creating sexual deepfakes to experiment with the technology or to show off, while sharing was most often described as being done “for fun/as a joke.”

Among perpetrators, 26% of those who shared an image and 22% of those who created one said they wanted to damage the target’s reputation, while 12% of creators and 20% of sharers reported doing it for financial gain. In most cases, perpetrators targeted current or former sexual partners. Interestingly, participants more often reported sharing deepfake sexual images of men (56%) than of women (41%).

    18% of participants reported intentionally viewing sexual deepfake images. Men were 3.6 times more likely than women to intentionally view sexual deepfake images (29% vs. 8%). Similarly, younger adults, LGBTQ+ individuals, non-white participants, and participants with disabilities were more likely to view sexual deepfakes intentionally. The main motive for viewing such images was curiosity, followed by sexual gratification and entertainment.

The study also revealed significant gender differences in emotional responses to this content. Men were significantly more likely to report feeling amused and excited, while women were much more likely to report empathy for the people depicted, sadness, and disgust toward those who created the images.

    “These findings suggest that in addition to preventing the creation of non-consensual AI-generated sexual images, sociotechnical interventions are needed to address the seemingly normalized consumption of these images,” the study authors concluded.

    This study contributes to scientific understanding of the behaviors associated with sexual deepfake images. However, all data used in the study were self-reported, leaving room for reporting bias to influence the findings.

    The paper, “AI-generated image-based sexual abuse: Perpetration and consumption across three geographies,” was authored by Rebecca Umbach, Nicola Henry, Renee Shelby, Gemma Stevens, and Kwynn Gonzalez-Pons.
