Apps That Undress Women’s Photos Surge In Popularity. What Could Go Wrong?

Apps that use artificial intelligence to undress people in photos are surging in popularity, according to new social media research. These programs manipulate existing pictures and videos of real individuals and make them appear nude without consent. Many of these “nudifying” apps only work on women.

A recent study conducted by the social media analytics firm Graphika analyzed 34 companies offering services that produce what researchers call non-consensual intimate imagery (NCII). The firm found that these websites received a whopping 24 million unique visits in September alone.

One such company’s ads offer to “Undress Any Girl You Want.” This virtual undressing industry was almost non-existent a year ago. The availability of open-source AI image diffusion models has simplified the process of altering photos and allowed these new apps and websites to launch quickly. As evidence of the industry’s substantial growth, the volume of referral links on Reddit and X has soared by more than 2,400% in the last year.

“Bolstered by these AI services, synthetic NCII providers now operate as a fully-fledged online industry, leveraging many of the same marketing tactics and monetization tools as established e-commerce companies. This includes advertising on mainstream social media platforms, influencer marketing, deploying customer referral schemes, and the use of online payment technologies,” the researchers write.

Nude photos shared on the web without consent are nothing new. In 2018, a hacker was sentenced to eight months in prison for his part in sharing authentic naked photographs of the actress Jennifer Lawrence and other celebrities publicly on the internet. Lawrence called the photo hacking a “sex crime.”

Now, such photos can be created with a few simple keystrokes. And it is increasingly hard to tell whether the images are authentic or fabricated.

In June, the FBI issued a warning about an increase in the manipulation of photos for sextortion or to create explicit content. “Malicious actors use content manipulation technologies and services to exploit photos and videos—typically captured from an individual’s social media account, open internet, or requested from the victim—into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites,” the FBI website warns. They add that the photos are sometimes used as “sextortion” to gain ransom from the subject of the photo.

Adding to the concern, these programs are being used to nudify minors. AI-generated naked images of more than 20 young girls in Spain were uncovered in September. Most of the pictures had been created using fully clothed photos taken from the girls’ Instagram accounts. After the images were altered using the AI-powered app “ClothOff,” the nude photos were shared in WhatsApp groups.

Last month, a similar incident unfolded in New Jersey, involving students creating nude photos of their classmates.

The ability to undress photos of celebrities, classmates, strangers on the bus, executives, coworkers and children remains only a few clicks away. There is currently no federal law banning the creation of these images of adults, though producing such images of minors is illegal. In November, a child psychiatrist in North Carolina was sentenced to 40 years in prison for child pornography. In addition to other egregious acts, the psychiatrist used AI to digitally alter clothed images of his patients to render them sexually explicit.

As for photos of adults, the apps and the images they create appear to remain legal. According to Time, TikTok and Meta have blocked the search term “undress” to reduce access to the programs. Google has also removed some ads for undressing apps and websites.
