Your Daughter’s Face Could Be Hijacked For ‘Deepfake’ Pornography

Michie

Fourteen-year-old Francesca’s life has changed forever.

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media — sometimes referred to as “deepfake” pornography.

Speaking to the media, Francesca shared how betrayed she felt, saying, “We need to do something about this because it’s not OK, and people are making it seem like it is.” She’s right — something must be done.

The issue of image-based sexual abuse (IBSA) — whether it manifests as nonconsensual AI or “deepfake” content, nonconsensual recording or sharing of explicit imagery, extortion or blackmail based on images, recorded sexual abuse, or its many other manifestations — can feel like something that happens to “others.” It’s a headline we scroll past. It can feel distant from our own lives. But that’s far from the truth.

If anyone has ever taken a video or photo of you and posted it online, even an innocent family photo or professional headshot, that’s all it takes.

You and your loved ones are personally at risk for having your images turned into sexually explicit “synthetic,” “nudified,” or “deepfake” content.

Continued below.
https://thefederalist.com/2023/12/0...-hijacked-for-deepfake-porn/