Canada Has Loopholes on Sexualized AI Deepfakes

Canadian laws still have significant gaps when it comes to protecting children from the spread of sexualized AI deepfakes on social media and other online platforms, according to experts.

The observations came after no criminal charges were laid over AI-generated fake nudes of girls from a Winnipeg school that had circulated online.

The loopholes in the laws

According to CBC, officials at Collège Béliveau, a grade 7 to 12 French immersion school in Winnipeg, became aware of the problem in December after students reported that images of the girls were circulating online.

School authorities said the images appeared to have been gathered from public sources and then doctored using AI.

They could not determine how many images were posted online, how many girls were victimized, or who the perpetrators were.

However, after investigating the matter, Winnipeg police concluded that no arrests or criminal charges would be made.

Experts say the relevant laws were written before deepfakes became widespread and make no mention of “altered images.”

This, according to Suzie Dunn, an assistant law professor at Dalhousie University’s Schulich School of Law, exposes a gap in Canadian law and its failure to address sexualized deepfakes.

“Many of the early intimate image laws were created before deepfakes were in public, and so many of them didn’t include the term ‘altered images,’” said Dunn.

Dunn’s research focuses on laws and policies around technology-facilitated sexual violence.

The law is too slow to catch up

Canada does have laws against sharing intimate images without consent, and according to Dunn these laws should now cover deepfakes as well. She admitted, however, that she has never seen them applied that way, partly because incidents like the one at the Winnipeg school are uncommon.

The closest case happened last year in Quebec, where a 61-year-old man was sentenced to more than three years in prison for using AI to create synthetic child pornography videos. Dunn maintained, however, that such laws still do not cover the Winnipeg case.

There could be other factors at play as well. The lack of charges may be because the perpetrators are also minors, requiring police to exercise discretion in their final decision.

Another possibility is a lack of victim cooperation. However, Dunn does not think this would be a “deal breaker in a case with digital evidence like deepfake photos.”

Perhaps it’s time for action

For parents of the children victimized in the incident, the outcome was disappointing. The mother of one of the girls suggested empowering young girls by educating them on the dangers of sharing intimate and altered images “to avoid a repeat of similar cases in the future.”

Meanwhile, parts of Canada still do not formally recognize that sexual violence can take place online.

Kaitlynn Mendes, an associate professor of sociology at Western University and Canada Research Chair in inequality and gender, cited Manitoba as one of the provinces that do not recognize online sexual violence.

This, she said, means young girls will be “ill-equipped” to respond to cases like the Winnipeg incident.

“They won’t understand what rights they have. They won’t understand places where they can go for help or support if or when things go wrong,” she said.

“It’s really heartbreaking to hear about what happened in Manitoba, but hopefully we can learn lessons and use it as an opportunity to start having these conversations and also start pushing for change,” said Mendes, who recently co-authored a report on the subject.

She called on teachers in these provinces to talk to children about such cases, even though the topic is not part of the official curriculum.
