Criminal charges against 2 N.W.T. youth highlight need for AI legislation and education, experts say | CBC News


Recent criminal charges against two youth in the N.W.T. who allegedly used artificial intelligence (AI) to create child sexual abuse and exploitation material have some advocates calling for more education and clear policy around AI use in the territory.  

RCMP said in a news release last week that the two young people allegedly used AI technology to alter social media photos of other youth to make them appear nude, and then shared those images with each other. 

According to court documents, the alleged offences occurred between Feb. 1 and March 18 of this year. 

The two youth were each charged with making, transmitting and possessing child sexual abuse and exploitation material. As part of their release order, they were told not to contact a list of 17 other people, including one another. 

The N.W.T. RCMP said in the news release that its Internet Child Exploitation Unit “actively investigates offences involving [child sexual abuse and exploitation material], including AI-generated content.” 

AI-generated intimate images and Canadian law 

Incidents involving intimate images and deepfakes are on the rise across the country, said Suzie Dunn, an assistant professor of law and the interim director of the Law and Technology Institute at Dalhousie University.

She says the emergence of apps used to create nude deepfake images is increasing the number of “everyday people who are targeted.” 

Yet according to Dunn, Canadian law surrounding the use of AI-generated fake images is still relatively murky. 

Currently, every province except Quebec has civil statutes that allow a person to sue someone else for sharing real intimate images without consent. None of the territories have introduced such statutes, said Dunn. 

As deepfakes became more prevalent, some provinces began to change the language in their civil statutes to include fake or manipulated images, she said.  

Last year, the federal government introduced legislation that would amend the Criminal Code to include deepfakes in the definition of intimate images. Bill C-16 hasn’t passed yet. 

When it comes to the creation, possession and distribution of child sexual abuse and exploitation material (CSAEM) — which depicts children under the age of 18 in a sexual manner — Dunn said that’s a criminal offence even if the images are fake or altered. 

She said that so far there have been two criminal cases in Canada in which people have pleaded guilty to creating and distributing deepfake CSAEM.

Dunn called the recent charges against the two N.W.T. youth “quite an intense criminal provision for children to be charged with.” 

However, she noted that under current Canadian law there are few alternatives, putting police in a “really difficult position” when attempting to hold people accountable for this type of behaviour.  

Morgan Fane is the N.W.T. Crown prosecutor involved in the case. He said that to his knowledge, no case involving the use of generative AI tools to create images depicting youth in a sexual manner has previously come before the courts in the territory. 

‘A big challenge for people in every sector’

As the case in the N.W.T. goes before the courts, Dunn says work must also happen around AI education. 

Nancy MacNeill, a youth mental health and sexual health advocate in the N.W.T., agrees. She says right now, dealing with deepfakes and sexual imagery is like “the Wild West” and it is unclear what resources are accessible for people who have been victimized. 

She says the charges laid against the two N.W.T. youth last week have prompted a lot of questions from other young people concerned about how to respond or protect themselves.

“This is a new technology and controlling it in any capacity is proving to be a big challenge for people in every sector,” she said. 

Rita Mueller, president of the N.W.T. Teachers’ Association, says educators are also looking for guidance. She said the territorial government has so far offered little in the way of policy, guidelines or educational resources for AI use in classrooms.

She also says teachers don’t have adequate resources to teach students about ethical AI use.

“You are asking teachers now to maneuver through this ever-changing, very rapidly growing world of AI applications without having a policy that says, ‘in the Northwest Territories… this is how AI should be used,’” Mueller said.  

In an email to CBC News, RCMP said that Yellowknife’s community policing officer delivered presentations on cybersecurity to schools in February and March. However, AI and the offences that could be associated with it were not the main focus.

Mueller is calling for a “meaningful conversation” between school districts and the territorial government about creating a policy governing AI use.  

She said there is no need to “completely reinvent the wheel,” since the N.W.T. can learn from AI policies already implemented in other jurisdictions. 

“We need to sit down and we need to really figure this out — and quickly,” she said.