26 April 2023
4 minute read

B.C. passes new legislation: Intimate Images Protection Act

Revenge porn and the AI manipulation of intimate images are harmful and invasive forms of harassment and assault. Proliferated by the rapid spread of information (and misinformation) across a borderless internet, they can lead to real-world distress, reputational damage, and even physical harm.

Against this backdrop, governments around the world have been enacting legislation to make sure that victims have tools to seek justice against the purveyors of these illicit materials. The Province of British Columbia has now passed the Intimate Images Protection Act without amendment, applying retroactively to March 6, 2023. Regulations must be put in place before the legislation takes effect. As set out in our earlier article, the purpose of this new legislation is to provide stronger protections to people who have their intimate images shared without their consent, and a faster process for removing those images than is currently available when individuals must seek redress through the B.C. Supreme Court.

According to the Province, incidents of non-consensual sharing of images reported to police across the country increased 80 percent in 2020 compared to the previous five years.

“Intimate images” protected under the legislation include pictures, videos and livestreams, whether or not the individual is identifiable and whether or not the image has been altered in any way (thus capturing digitally altered images and even deepfakes):

  • in which an individual is, or is depicted as, engaged in a sexual act, is nude or nearly nude, or is exposing certain defined intimate body parts; and
  • in which the individual has a reasonable expectation of privacy at the time of recording or distribution, or at the time of simultaneous representation.

Once it comes into force, the legislation will create a new, expedited process resulting in legal decisions and orders designed specifically to stop the distribution of intimate images without consent. Claims can be made to the Civil Resolution Tribunal, and orders can be directed at the “wrongdoer” or at a third-party intermediary, who can be directed to remove, delete or destroy the image and de-index it from any search engines. In addition, the decision maker can make any other order they consider just and reasonable in the circumstances.

For example, the individual depicted in an intimate image can claim relief, without proof of damage, where an image is distributed, or threatened to be distributed, without consent. (We would also note that such activities may be crimes under sections 162.1, 264, 372 and 346 of the Criminal Code, relating respectively to the non-consensual distribution of intimate images, criminal harassment, indecent communications, and extortion.) Existing torts such as defamation, intrusion upon seclusion, breach of confidence, intentional infliction of mental suffering, and appropriation of personality do not always give a victim the necessary tools for redress, given the anonymity and wide reach of the internet.

The legislation also provides greater protection for minors, allowing those of an age to be prescribed by regulation to make an application on their own behalf. This puts extra power in the hands of victims of revenge porn or non-consensual deepfake images, giving them access to speedier justice: the prompt removal of the images can provide far greater relief than a full-length trial of the wrongdoer.

An important provision in the legislation is that any individual who may previously have consented to the distribution of an intimate image has the right to revoke that consent at any time.

B.C. is following in the footsteps of other Canadian provinces with this legislation, including Nova Scotia and Saskatchewan. And, as mentioned, the distribution of intimate images without consent was made a crime under Section 162.1 of the Criminal Code in 2015, though to our knowledge the courts have not yet tested whether that provision applies to altered photographs or deepfakes.
