Seeing is believing: the real and present danger of fake AI images
We’re in a new era of sexual harassment.
In late January, sexually explicit AI-generated images of Taylor Swift began spreading over social media. These fake images were viewed millions of times and likely originated from the infamous message board 4chan as part of a “challenge” to circumvent safeguards and filters, per a new report covered this week in the New York Times.
The Swift images have reignited calls for policymakers to move faster when it comes to establishing laws regarding the distribution — and, now, creation — of non-consensual intimate imagery.

PHOTO: TORU HANAI / THE ASSOCIATED PRESS
Whether or not they look real, AI-generated images of Taylor Swift (above, in a real photo taken in Tokyo on Feb. 7) or of Winnipeg high school students exist, and the harm they do is also very real.
You might recall that back in 2014, a number of celebrities had private nude photos stolen from their iCloud accounts and shared all over the internet.
Public reaction at the time was largely unsympathetic; these celebs were, after all, “asking for it” by being famous and taking these photos in the first place. The cultural entitlement over celebrity bodies was on full display.
Now, 10 years on, computer-literate creeps don’t even have to go to the trouble of hacking into someone’s account and stealing private images in order to humiliate and harass. They can just create them. And that should be very concerning to anyone who uses the internet.
These fake images are not somehow less damaging to Swift because she is currently the most famous woman in the world, and has access to a powerful base of Swifties who can get them removed from social media. Behaviour like this doesn’t exist in a vacuum. What happens to pop stars also happens to teenage girls.
In December, the Winnipeg Police Service was investigating reports of AI-generated nude photos of underage students circulating at Collège Béliveau, a Grade 9-12 high school in Windsor Park.
Imagine, for a moment, what it would feel like to have your likeness used this way. Imagine how powerless and violated you would feel.
I’ve seen a few conversations online about the “believability” of the Taylor Swift images, usually helmed by people for whom deepfake porn harassment is simply a debate topic.
First of all, even if we were talking about a crude cutout of Taylor’s head pasted onto the body of a Playboy centrefold, assembled with scissors and a glue stick and disseminated widely without her consent, it would still be wrong.
Secondly, whether we’re talking about a celebrity, or a Grade 10 student, or a woman from your marketing department, it doesn’t actually matter whether these images “look real” or not. They are real in the sense they exist — and, knowing the internet, will likely exist forever. The harm they do is also very real.
Besides, AI technology will continue to improve — quickly. Coupled with a decline in media literacy, people might stop even questioning what they’re seeing.
The speed and ease with which these images can be created and spread is also alarming; one doesn’t even need to have a mastery of Photoshop anymore.
And yet, despite this rapid acceleration in technology, it seems as if we’re still stuck in 2014 when it comes to the law. Per a recent Canadian Press story about Canadian provinces playing catch-up on this file, Manitoba is one of eight provinces that do have intimate-image laws, but ours don’t address altered images.
That needs to change, and fast.
We cannot afford to have the creation and distribution of sexually explicit AI-generated images dealt with the same way online sexual harassment has traditionally been dealt with, which is to just tell women to “stay off the internet.”
We’re still on the internet either way. Shouldn’t we have some agency over how we show up there?
jen.zoratti@winnipegfreepress.com

Jen Zoratti is a columnist and feature writer working in the Arts & Life department, as well as the author of the weekly newsletter NEXT. A National Newspaper Award finalist for arts and entertainment writing, Jen is a graduate of the Creative Communications program at RRC Polytech and was a music writer before joining the Free Press in 2013. Read more about Jen.