TSCE #9(F)(2): 2021 Parliamentary Hearings On Pornhub, CSAM, Digital Fingerprinting, Databases

When the issues of internet privacy and child protection intersect, sorting things out can be fairly tricky. It was only a decade ago that “Conservative” Public Safety Minister Vic Toews decided that wanting basic protections for browsing history amounted to coddling pedophiles. Nonetheless, these concerns don’t go away just because someone else is now in office.

Now, it’s Pornhub that is under the public spotlight. It is just one of the sites owned by MindGeek. The porno empire of MindGeek includes (but isn’t limited to):

  • Pornhub
  • RedTube
  • YouPorn
  • Brazzers
  • Digital Playground
  • Men.com
  • Reality Kings
  • Sean Cody
  • WhyNotBi.com

Allegations have come up that actual sexual abuse has been published on this site, as well as revenge porn, and videos featuring minors. All of that is illegal. As for the hearings:

The above videos are clipped from this hearing. The transcript of that day’s hearing is available here.

From January to June 2021, Parliamentary hearings were held in Ottawa on what had happened with Pornhub. It turned out that a very large amount of their content involved non-consenting parties, or minors, or both. After a public outcry in December 2020, and the threatened loss of payment processors like Visa and Mastercard, MindGeek went into serious damage control.

To be clear, the whole pornography uploading industry is disgusting. This is especially true since it’s fairly easy for content involving minors, and non-consensual content, to slip through. That being said, the hearings were interesting for additional reasons.

One notable topic was the software available to scan images and videos, to implement “digital fingerprinting”, and to collaborate with other social media sites. Furthermore, MindGeek explained that it knew exactly who was uploading to its site, and from where.

(February 5, 2021, 13:05)
.
We are also working to ensure that once content is removed, it can never make its way back to our platform or to any platform. The revictimization of individuals when their content is re-uploaded causes profound injury that we are working fiercely to prevent. We are attacking this problem in two ways. First, our people are trained to remove such material upon request. Second, we digitally fingerprint any content removed from our website so that it cannot be re-uploaded to our own platform.
.
For the last two years, we have been building a tool called “SafeGuard” to help fight the distribution of non-consensual intimate images. As I sit before you today, I am pleased to report that this month we will be implementing SafeGuard for all videos uploaded to Pornhub. We will offer SafeGuard for free to our non-adult peers, including Facebook, YouTube and Reddit. We are optimistic that all major social media platforms will implement SafeGuard and contribute to its fingerprint database. Such co-operation will be a major step to limit the spread of non-consensual material on the Internet.

(February 5, 2021, 13:10)
Mrs. Shannon Stubbs:
How do you know?
.
Mr. Feras Antoon:
It’s because every single piece of content is viewed by our human moderators. Number two, it goes through software that we have licensed from YouTube, like CSAI Match, and from Microsoft, like PhotoDNA for pictures. It goes through a software called Vobile.
.
Mrs. Shannon Stubbs:
But then why, for example, do Pornhub’s terms of service say, “we sometimes review Content submitted or contributed by users”?
.
Mr. David Tassillo (Chief Operating Officer, Entreprise MindGeek Canada):
Mrs. Stubbs, I would like to add to what Feras mentioned.
I’m not too sure where it says that in the terms of service, but I can guarantee you that every piece of content, before it’s actually made available on the website, goes through several different filters, some of which my colleague made reference to.
.
Depending on whether it comes up as a photo or as a video, we go through different pieces of software that would compare it to known active cases of CSAM, so we’ll actually do a hash check. We actually don’t send the content itself over; they create a digital key per se that’s compared to a known active database. After that, it’s compared to the other piece of software that Feras mentioned, Vobile, which is a fingerprinting software by which anyone can have their content fingerprinted. Any time MindGeek would find the piece of infringing content, we’d add it to that database to prevent the re-upload.
.
Once it passes the software queue…. If anything fails at the software level, it automatically doesn’t make it up to the site. Once that piece has gone through, we move over to the human moderation section. The human moderators will watch each one of the videos, and if they deem that the video passes, it will be—

Essentially, all of the material, whether it makes it onto the site or not, becomes part of a huge database. Who will have access to it, and for what reasons could content be released?

Third-party software is also involved, including tools provided by YouTube and Microsoft. Will those companies have access to the database? Can the material be stored somewhere else?
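The “hash check” workflow described in the testimony can be sketched roughly as follows. This is a minimal illustration only, using an exact cryptographic hash (SHA-256); the actual tools named in the hearings (PhotoDNA, CSAI Match, MediaWise) use proprietary perceptual hashes so that re-encoded or slightly altered copies still match. The function names and return values here are invented for the sketch.

```python
# Sketch of a hash-based upload screen: hash the file, compare the digest
# against a database of known banned content, and only pass clean uploads
# on to human moderation. Illustrative only; real systems use perceptual
# hashing, not exact SHA-256 matching.
import hashlib

def fingerprint(data: bytes) -> str:
    """The 'digital key' from the testimony: a digest of the content,
    so the content itself never needs to be sent to the database."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes, banned_hashes: set) -> str:
    """Return 'blocked' on a database hit; otherwise queue for humans."""
    if fingerprint(data) in banned_hashes:
        return "blocked"
    return "queued_for_human_review"
```

Note that this design matches one detail of the testimony: only the fingerprint is compared against the shared database, not the content itself.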

The Adult Industry Laborers and Artists Association wrote to Parliament, essentially arguing that the porn industry was better at regulating itself than the Government. They also argued that it was a large sector of the economy on which people relied to provide for their families.

The Sex Workers of Winnipeg Action Coalition actually wrote to Parliament advising AGAINST mandatory identification for using and uploading onto such sites. They argue that it’s too easy for the data to be compiled, saved, and used for nefarious purposes (and cite Clearview AI). In terms of material uploaded without consent, they actually have a point.

The Free Speech Coalition wrote to the Committee and recommended working with sites like Pornhub. They claim that illicit material will just be shared elsewhere if these sites were shut down.

In MindGeek’s written submissions, they spelled out — at least broadly — the technical tools they had to combat illicit material and keep it from being shared:

Our human moderators are supported by a growing suite of technical tools, which fall into two broad categories: those that detect previously identified CSAM and non-consensual content using a fingerprint technology and those that use artificial intelligence to detect unreported CSAM content.

MindGeek’s fingerprinting tools rely on a unique digital fingerprint to match a video or photograph to those already identified in a database of banned content. These tools include YouTube’s CSAI Match, Microsoft’s PhotoDNA, Vobile’s MediaWise, and MindGeek’s own SafeGuard. All items caught by these tools as CSAM or non-consensual are immediately blocked from the website and handled by our second level review team.

That’s quite the list of electronic tools. And keep in mind, Pornhub knows exactly who is uploading to its site. How exactly would this artificial intelligence work, and what would it be programmed to look for?
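For a rough sense of what “fingerprinting” means here, below is a toy version of a perceptual difference hash (“dHash”), a well-known public technique: each pixel is compared to its right-hand neighbour, producing a bit pattern that survives small edits like re-encoding. This is purely illustrative; the tools listed in the submission are proprietary and far more sophisticated, and the images here are stand-in grids of brightness values.

```python
# Toy perceptual fingerprint: compare adjacent pixels in a grayscale grid.
# Two near-identical images produce fingerprints that differ in few bits,
# so a re-uploaded copy can be caught by Hamming distance, not exact match.

def dhash_bits(pixels):
    """Emit 1 where a pixel is brighter than its right neighbour."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Count of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

def is_match(candidate, database, threshold=2):
    """A near-duplicate matches if its fingerprint is within `threshold`
    bits of any fingerprint already in the banned-content database."""
    return any(hamming_distance(candidate, known) <= threshold
               for known in database)
```

The design choice is the key point: because the fingerprint captures relative brightness rather than raw bytes, cropping-resistant and compression-resistant matching becomes possible, which exact hashes cannot do.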

The Parliamentary Report on this subject has also been issued. Now, this smut shouldn’t be around at all. However, if it can’t be removed entirely, these are some decent recommendations to mitigate the problem.

Recommendation 1 concerning liability
That the Government of Canada explore means to hold online platforms liable for any failure to prevent the upload of, or ensure the timely deletion of child sexual abuse material, content depicting non-consensual activity, and any other content uploaded without the knowledge or consent of all parties, including enacting a duty of care, along with financial penalties for non-compliance or failure to fulfil a required duty of care.

Recommendation 2 concerning the duty to verify age and consent
That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

Recommendation 3 concerning consultation
That the Government of Canada consult with survivors, child advocacy centres, victim support agencies, law enforcement, web platforms and sex workers prior to enacting any legislation or regulations relating to the protection of privacy and reputation on online platforms.

Recommendation 4 concerning section 3 of the Mandatory Reporting Act
That the Government of Canada, in collaboration with the provinces, amend section 3 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to make the National Child Exploitation Coordination Centre the designated law enforcement agency for the purpose of reporting under that section and that it ensure that the National Child Exploitation Coordination Centre has the resources it needs to investigate the increased referrals of child sexual abuse materials

Recommendation 5 concerning reporting obligations
That the Government of Canada invest resources to ensure the compliance of access providers, content providers and Internet content hosting services with their reporting obligations under An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service through education and awareness initiatives.

Recommendation 6 concerning section 11 of the Mandatory Reporting Act
That the Government of Canada consider amending section 11 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to extend the period of time to commence prosecution for an offence under this Act.

Recommendation 7 regarding compliance under the Mandatory Reporting Act
That the Government of Canada call upon the Royal Canadian Mounted Police and other police services to ensure the compliance of Internet service providers, as defined in An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, with their reporting obligations under that Act, and that compliance be absolute with no means for providers to opt out

Recommendation 8 concerning requirements for uploaders of content
That the Government of Canada set requirements for uploaders of content to provide proof of valid consent of all persons depicted and that the new regulations include penalties severe enough to act as an effective deterrent.

Recommendation 9 regarding pornographic content and age verification
That the Government of Canada develop clear regulations that require Internet service providers, as defined in An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, to utilize a robust process for age verification of all individuals in uploaded pornographic content, including content generated by individuals, studios or contract partners.

Recommendation 10 concerning proactive enforcement of Canadian laws
That the Government of Canada proactively enforce all Canadian laws regarding child sexual abuse material and the posting of non-consensual material and that in particular, it enforce section 3 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service by requiring all Internet service providers, as defined in the Act, to report child sexual abuse material directly to an officer, constable or other person employed for the preservation and maintenance of the public peace.

Recommendation 11 concerning accessible mechanisms for the removal of online content
That the Government of Canada develop accessible mechanisms that ensure that Canadians victimized by the posting of an image or video online without their consent on sites like Pornhub have the right to have that content removed immediately and to be given the benefit of the doubt with respect to the non-consensual nature of the content, and that the Government of Canada provide all the necessary resources required to put in place these accessible mechanisms.

Recommendation 12 concerning a potential new pattern of sexual violence
That the Government of Canada work with key stakeholder groups such as Canadian sexual assault centres, women’s rights organizations and representatives from LGBTQ2 communities to determine if the posting of non-consensual material depicting sexual violence on sites like Pornhub is reflective of, and contributing to, a new pattern of sexual violence, and that it report its findings, including recommendations for further action, to Parliament.

Recommendation 13 concerning the accountability of websites regarding the downloading and re-uploading of pornographic content
That the Government of Canada hold accountable websites that allow the downloading and re-uploading of pornographic content that erases the identity of the source material, thereby preventing authorities from assessing those accountable for the material.

Recommendation 14 concerning a new legal framework to impose certain obligations on Internet service providers hosting pornographic content
That the Government of Canada create a legal framework that would compel Internet service providers that host pornographic content to:
• implement and use available tools to combat the flagrant and relentless re-uploading of illegal content;
• hire, train and effectively supervise staff to carry out moderation and content removal tasks at an appropriate scale;
• maintain detailed records of user reports and responses that can be audited by authorities;
• be legally accountable for content moderation and removal decisions and the harm to individuals that results when efforts are inadequate; and
• build in and design features that prioritize the best interests and privacy rights of children and vulnerable adults

Admittedly, these are some good proposals. Will anything come of these hearings when the next Parliament sits? We will have to wait and see.

Again, this is not to defend this disgusting industry. However, even with safeguards, there are still plenty of children and non-consenting people who are victimized here. It’s not much of a consolation to say that “it will just go elsewhere” if these sites are shut down.

Even for young adults, what happens in 5 or 10 years when they grow up and realize they’ve made a serious mistake? How easy (or possible) will it be to get this information scrubbed?

(1) https://www.ourcommons.ca/Committees/en/ETHI/StudyActivity?studyActivityId=11088039
(2) https://www.ourcommons.ca/DocumentViewer/en/43-2/ETHI/meeting-19/evidence
(3) https://www.ourcommons.ca/Content/Committee/432/ETHI/Reports/RP11148202/ethirp03/ethirp03-e.pdf
(4) Pornhub Parliamentary Hearings Adult Gender Equality LEAF
(5) Pornhub Parliamentary Hearings Adult Industry Labourers
(6) Pornhub Parliamentary Hearings Christian Legal Fellowship
(7) Pornhub Parliamentary Hearings Free Speech
(8) Pornhub Parliamentary Hearings MindGeek
(9) Pornhub Parliamentary Hearings Non State Torture
(10) Pornhub Parliamentary Hearings Ntl Center For Exploitation
(11) Pornhub Parliamentary Hearings Stop Exploitation
(12) Pornhub Parliamentary Hearings Winnipeg Sex Workers
(13) Pornhub Parliamentary Hearings Your Brain On Porn
(14) https://www.theguardian.com/us-news/2020/dec/10/pornhub-mastercard-visa-rape-child-abuse-images
(15) https://en.wikipedia.org/wiki/MindGeek
