TSCE #9(F)(2): 2021 Parliamentary Hearings On Pornhub, CSAM, Digital Fingerprinting, Databases

When the issues of internet privacy and child protection intersect, sorting things out can be fairly tricky. It was only a decade ago that “Conservative” Public Safety Minister Vic Toews decided that wanting basic protections for browsing history amounted to coddling pedophiles. Nonetheless, these concerns don’t go away just because someone else is now in office.

Now, it’s Pornhub that is under the public spotlight. It is just one of several sites owned by MindGeek, whose porno empire includes (but isn’t limited to):

  • Pornhub
  • RedTube
  • YouPorn
  • Brazzers
  • Digital Playground
  • Men.com
  • Reality Kings
  • Sean Cody
  • WhyNotBi.com

Allegations have come up that actual sexual abuse has been published on these sites, as well as revenge porn and videos featuring minors. All of that is illegal. As for the hearings:

The above videos are clipped from this hearing. The transcript of that day’s hearing is available here.

From January to June 2021, Parliamentary hearings were held in Ottawa into what had happened with Pornhub. It turned out that a very large amount of their content involved non-consenting parties, or minors, or both. After the outrage of December 2020, and under threat of losing payment processors like Visa and Mastercard, the company went into serious damage control.

To be clear, the whole pornography uploading industry is disgusting. This is especially true since it’s fairly easy for content involving minors, or non-consensual content, to slip through. That being said, the hearings were interesting for additional reasons.

One notable topic was the range of software available to scan images and videos, to implement “digital fingerprinting”, and to collaborate with other social media sites. Furthermore, MindGeek explained that they knew exactly who was uploading to their site, and from where.

(February 5, 2021, 13:05)
.
We are also working to ensure that once content is removed, it can never make its way back to our platform or to any platform. The revictimization of individuals when their content is re-uploaded causes profound injury that we are working fiercely to prevent. We are attacking this problem in two ways. First, our people are trained to remove such material upon request. Second, we digitally fingerprint any content removed from our website so that it cannot be re-uploaded to our own platform.
.
For the last two years, we have been building a tool called “SafeGuard” to help fight the distribution of non-consensual intimate images. As I sit before you today, I am pleased to report that this month we will be implementing SafeGuard for all videos uploaded to Pornhub. We will offer SafeGuard for free to our non-adult peers, including Facebook, YouTube and Reddit. We are optimistic that all major social media platforms will implement SafeGuard and contribute to its fingerprint database. Such co-operation will be a major step to limit the spread of non-consensual material on the Internet.

(February 5, 2021, 13:10)
Mrs. Shannon Stubbs:
How do you know?
.
Mr. Feras Antoon:
It’s because every single piece of content is viewed by our human moderators. Number two, it goes through software that we have licensed from YouTube, like CSAI Match, and from Microsoft, like PhotoDNA for pictures. It goes through a software called Vobile.
.
Mrs. Shannon Stubbs:
But then why, for example, do Pornhub’s terms of service say, “we sometimes review Content submitted or contributed by users”?
.
Mr. David Tassillo (Chief Operating Officer, Entreprise MindGeek Canada):
Mrs. Stubbs, I would like to add to what Feras mentioned.
I’m not too sure where it says that in the terms of service, but I can guarantee you that every piece of content, before it’s actually made available on the website, goes through several different filters, some of which my colleague made reference to.
.
Depending on whether it comes up as a photo or as a video, we go through different pieces of software that would compare it to known active cases of CSAM, so we’ll actually do a hash check. We actually don’t send the content itself over; they create a digital key per se that’s compared to a known active database. After that, it’s compared to the other piece of software that Feras mentioned, Vobile, which is a fingerprinting software by which anyone can have their content fingerprinted. Any time MindGeek would find the piece of infringing content, we’d add it to that database to prevent the re-upload.
.
Once it passes the software queue…. If anything fails at the software level, it automatically doesn’t make it up to the site. Once that piece has gone through, we move over to the human moderation section. The human moderators will watch each one of the videos, and if they deem that the video passes, it will be—

Essentially, all of the material, whether it makes it onto the site or not, becomes part of a huge database. Who will have access to it, and for what reasons could content be released?

And third-party software is used, including tools provided by YouTube and Microsoft. Will those companies have access to the database? Can the material be stored somewhere else?
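To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of hash-check pipeline Tassillo describes: the platform reduces an upload to a digital “fingerprint” and compares it against a database of known banned fingerprints, without the content itself ever being transmitted. Everything here is an illustrative assumption, not MindGeek’s actual implementation; in particular, a plain SHA-256 digest is used where real tools like PhotoDNA or CSAI Match use perceptual hashes.

```python
import hashlib

# Hypothetical database of fingerprints of previously banned content.
# Real systems use perceptual hashes that survive re-encoding; plain
# SHA-256, shown here, only matches byte-identical files.
BANNED_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Reduce an upload to a fixed-size digest; only this key, not the
    content itself, needs to be compared against the shared database."""
    return hashlib.sha256(content).hexdigest()

def screen_upload(content: bytes) -> str:
    """First gate of the moderation queue described in the testimony:
    anything matching a known fingerprint fails at the software level
    and never reaches the site; everything else goes to human review."""
    if fingerprint(content) in BANNED_FINGERPRINTS:
        return "blocked"
    return "human_review"

print(screen_upload(b"test"))   # the bytes b"test" hash to the banned entry
print(screen_upload(b"other"))  # no match, so it proceeds to moderators
```

A real deployment would use a perceptual hash so that cropped or re-encoded copies still match, which is precisely why the question of who holds, and who can query, the shared fingerprint database matters.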

The Adult Industry Laborers and Artists Association wrote to Parliament, essentially arguing that the porn industry was better at regulating itself than the Government would be. They also argued that it is a large sector of the economy, which many people rely on to provide for their families.

The Sex Workers of Winnipeg Action Coalition actually wrote to Parliament advising AGAINST mandatory identification for using and uploading onto such sites. They argue that it’s too easy to compile and save the data for nefarious purposes (and they cite Clearview AI). In terms of material uploaded without consent, they actually have a point.

The Free Speech Coalition wrote to the Committee and recommended working with sites like Pornhub. They claim that illicit material will simply be shared elsewhere if these sites are shut down.

In MindGeek’s written submissions, they spelled out — at least broadly — the technical tools they had to combat illicit material and keep it from being shared:

Our human moderators are supported by a growing suite of technical tools, which fall into two broad categories: those that detect previously identified CSAM and non-consensual content using a fingerprint technology and those that use artificial intelligence to detect unreported CSAM content.

MindGeek’s fingerprinting tools rely on a unique digital fingerprint to match a video or photograph to those already identified in a database of banned content. These tools include YouTube’s CASI Match, Microsoft’s Photo DNA, Vobile’s MediaWise, and MindGeek’s own SafeGuard. All items caught by these tools as CSAM or non-consensual are immediately blocked from the website and handled by our second level review team.

That’s quite the list of electronic tools. And keep in mind, Pornhub claims to know exactly who is uploading to its site. How exactly would this artificial intelligence work, and what would it be programmed to look for?
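We can only speculate about the answer, but content classifiers of this kind generally share one structure: a model assigns a confidence that an upload falls into a prohibited category, and thresholds decide whether it is blocked outright, escalated to human review, or passed. A minimal sketch of that routing logic, with the scoring function stubbed out as a hypothetical placeholder (a real system would run a trained neural network, and the threshold values here are invented):

```python
# Illustrative thresholds; actual values would be tuned by the platform.
BLOCK_THRESHOLD = 0.9
REVIEW_THRESHOLD = 0.5

def score_content(image_bytes: bytes) -> float:
    """Hypothetical stand-in for a trained classifier returning the
    model's confidence (0.0 to 1.0) that the content is prohibited."""
    # A real system would run a model here; this stub only exists so
    # the routing logic below can be demonstrated.
    return 0.0

def route(confidence: float) -> str:
    """Route an upload based on classifier confidence."""
    if confidence >= BLOCK_THRESHOLD:
        return "blocked"              # near-certain: never published
    if confidence >= REVIEW_THRESHOLD:
        return "second_level_review"  # ambiguous: escalate to humans
    return "published"

print(route(0.95))  # blocked
print(route(0.7))   # second_level_review
print(route(0.1))   # published
```

Note that the interesting policy questions live outside this sketch: what the model was trained on, where the flagged material is stored, and who audits the thresholds.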

The Parliamentary Report on the subject has since been issued. Now, this smut shouldn’t be around at all. However, if it can’t be eliminated entirely, these are some decent recommendations to at least mitigate the problem.

Recommendation 1 concerning liability
That the Government of Canada explore means to hold online platforms liable for any failure to prevent the upload of, or ensure the timely deletion of child sexual abuse material, content depicting non-consensual activity, and any other content uploaded without the knowledge or consent of all parties, including enacting a duty of care, along with financial penalties for non-compliance or failure to fulfil a required duty of care.

Recommendation 2 concerning the duty to verify age and consent
That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

Recommendation 3 concerning consultation
That the Government of Canada consult with survivors, child advocacy centres, victim support agencies, law enforcement, web platforms and sex workers prior to enacting any legislation or regulations relating to the protection of privacy and reputation on online platforms.

Recommendation 4 concerning section 3 of the Mandatory Reporting Act
That the Government of Canada, in collaboration with the provinces, amend section 3 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to make the National Child Exploitation Coordination Centre the designated law enforcement agency for the purpose of reporting under that section and that it ensure that the National Child Exploitation Coordination Centre has the resources it needs to investigate the increased referrals of child sexual abuse materials

Recommendation 5 concerning reporting obligations
That the Government of Canada invest resources to ensure the compliance of access providers, content providers and Internet content hosting services with their reporting obligations under An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service through education and awareness initiatives.

Recommendation 6 concerning section 11 of the Mandatory Reporting Act
That the Government of Canada consider amending section 11 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to extend the period of time to commence prosecution for an offence under this Act.

Recommendation 7 regarding compliance under the Mandatory Reporting Act
That the Government of Canada call upon the Royal Canadian Mounted Police and other police services to ensure the compliance of Internet service providers, as defined in An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, with their reporting obligations under that Act, and that compliance be absolute with no means for providers to opt out

Recommendation 8 concerning requirements for uploaders of content
That the Government of Canada set requirements for uploaders of content to provide proof of valid consent of all persons depicted and that the new regulations include penalties severe enough to act as an effective deterrent.

Recommendation 9 regarding pornographic content and age verification
That the Government of Canada develop clear regulations that require Internet service providers, as defined in An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, to utilize a robust process for age verification of all individuals in uploaded pornographic content, including content generated by individuals, studios or contract partners.

Recommendation 10 concerning proactive enforcement of Canadian laws
That the Government of Canada proactively enforce all Canadian laws regarding child sexual abuse material and the posting of non-consensual material and that in particular, it enforce section 3 of An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service by requiring all Internet service providers, as defined in the Act, to report child sexual abuse material directly to an officer, constable or other person employed for the preservation and maintenance of the public peace.

Recommendation 11 concerning accessible mechanisms for the removal of online content
That the Government of Canada develop accessible mechanisms that ensure that Canadians victimized by the posting of an image or video online without their consent on sites like Pornhub have the right to have that content removed immediately and to be given the benefit of the doubt with respect to the non-consensual nature of the content, and that the Government of Canada provide all the necessary resources required to put in place these accessible mechanisms.

Recommendation 12 concerning a potential new pattern of sexual violence
That the Government of Canada work with key stakeholder groups such as Canadian sexual assault centres, women’s rights organizations and representatives from LGBTQ2 communities to determine if the posting of non-consensual material depicting sexual violence on sites like Pornhub is reflective of, and contributing to, a new pattern of sexual violence, and that it report its findings, including recommendations for further action, to Parliament.

Recommendation 13 concerning the accountability of websites regarding the downloading and re-uploading of pornographic content
That the Government of Canada hold accountable websites that allow the downloading and re-uploading of pornographic content that erases the identity of the source material, thereby preventing authorities from assessing those accountable for the material.

Recommendation 14 concerning a new legal framework to impose certain obligations on Internet service providers hosting pornographic content
That the Government of Canada create a legal framework that would compel Internet service providers that host pornographic content to:
• implement and use available tools to combat the flagrant and relentless re-uploading of illegal content;
• hire, train and effectively supervise staff to carry out moderation and content removal tasks at an appropriate scale;
• maintain detailed records of user reports and responses that can be audited by authorities;
• be legally accountable for content moderation and removal decisions and the harm to individuals that results when efforts are inadequate; and
• build in and design features that prioritize the best interests and privacy rights of children and vulnerable adults

Admittedly, these are some good proposals. Will anything come of these hearings when the next Parliament sits? I guess we will have to wait and see in the new session.

Again, this is not to defend this disgusting industry. However, even with safeguards in place, there are still plenty of children and non-consenting people being victimized. It’s not much of a consolation to say that “it will just go elsewhere” if these sites are shut down.

Even for young adults, what happens in 5 or 10 years when they grow up and realize they’ve made a serious mistake? How easy (or possible) will it be to get this information scrubbed?

(1) https://www.ourcommons.ca/Committees/en/ETHI/StudyActivity?studyActivityId=11088039
(2) https://www.ourcommons.ca/DocumentViewer/en/43-2/ETHI/meeting-19/evidence
(3) https://www.ourcommons.ca/Content/Committee/432/ETHI/Reports/RP11148202/ethirp03/ethirp03-e.pdf
(4) Pornhub Parliamentary Hearings Adult Gender Equality LEAF
(5) Pornhub Parliamentary Hearings Adult Industry Labourers
(6) Pornhub Parliamentary Hearings Christian Legal Fellowship
(7) Pornhub Parliamentary Hearings Free Speech
(8) Pornhub Parliamentary Hearings MindGeek
(9) Pornhub Parliamentary Hearings Non State Torture
(10) Pornhub Parliamentary Hearings Ntl Center For Exploitation
(11) Pornhub Parliamentary Hearings Stop Exploitation
(12) Pornhub Parliamentary Hearings Winnipeg Sex Workers
(13) Pornhub Parliamentary Hearings Your Brain On Porn
(14) https://www.theguardian.com/us-news/2020/dec/10/pornhub-mastercard-visa-rape-child-abuse-images
(15) https://en.wikipedia.org/wiki/MindGeek

TSCE #9(I): “Mr. Girl”, Pedo Defending Cuties Film Gets YouTube Channel Restored

Free speech and open discourse are generally extremely beneficial to society. However, the selective censoring of them on platforms like YouTube raises some serious questions. Here, YouTube and Twitter don’t seem to have an issue with disturbing content.

1. “Mr. Girl”, Max Karson, Defends Cuties

The first video is Max Karson (a.k.a. “Mr. Girl”) appearing on the Kill Stream with Ethan Ralph. Ralph frequently hosts discussions on topics like pornography, so this isn’t just a one-off. Karson then made his “Cuties” video the next day. While scrubbed from YouTube, it’s still on his site. Several people posted thorough reviews of it, including Adonis Paul and Brittany Venti.

2. Most Likely Sincere, Not Trolling

The suggestion has been made several times that Karson was trolling, and that this whole thing was an act, either for attention or to generate views. While that is possible, the tone and overall content come across as someone who is serious. While satire and comedy (even raunchy material) should be protected as free speech, this doesn’t look like that at all.

3. Karson’s YouTube Channel Gets Restored

Even though the Cuties video was taken down from the YouTube channel, it is still available — in full — on the website, https://maxkarson.com/. There’s also a disgusting “apology” video posted. Additionally, Karson is still able to receive donations via Squarespace and Patreon.

There wouldn’t be as much of an issue if there were uniform standards, either for or against free speech absolutism. However, there seem to be double standards, depending on the subject.

Again, if this was some strange version of satire or parody, what exactly is the punch line? How does this result in humour or comedy?

YouTube has no problem removing content that contradicts the Covid-19 narrative. Guess we have to draw the line somewhere. Canuck Law is just one of many accounts that have been threatened with the loss of their channel over that.

Worth pointing out: Twitter is currently being sued for (allegedly) not removing illegal material involving minors on its website. That is still ongoing in Court.

4. Trafficking, Smuggling, Child Exploitation

Serious issues like smuggling or trafficking are routinely avoided in public discourse. Also important are the links between open borders and human smuggling; between ideology and exploitation; between tolerance and exploitation; between abortion and organ trafficking; or between censorship and complicity. Mainstream media will also never get into the organizations who are pushing these agendas, nor the complicit politicians. These topics don’t exist in isolation, and are interconnected.

TSCE #9(A): Bill C-75 Revisited, The NGOs Pushing Degeneracy, Child Abuse

Bill C-75 has been addressed twice on this site, once for reducing penalties for terrorism offences, and once for reducing penalties for crimes against children. This piece looks more at some of the groups trying to influence the legislation.

1. Trafficking, Smuggling, Child Exploitation

Serious issues like smuggling or trafficking are routinely avoided in public discourse. Also important are the links between open borders and human smuggling; between ideology and exploitation; between tolerance and exploitation; between abortion and organ trafficking; or between censorship and complicity. Mainstream media will also never get into the organizations who are pushing these agendas, nor the complicit politicians. These topics don’t exist in isolation, and are interconnected.

2. Important Links

Parliamentary Study On Bill C-75 (Fall 2018)

Bill C-75 Canadian Centre For Gender Sexual Diversity
Bill C-75 Canadian Civil Liberties Association
Bill C-75 EGALE Canada Human Rights Trust
Bill C-75 Vancouver Rape Relief
Bill C-75 Law Society Of Ontario
Bill C-75 Tom Hooper Et Al
Bill C-75 UNICEF Canada

Bill C-75 Families For Justice Alberta

3. EGALE Canada Human Rights Trust

From around 16:23 in this September 25, 2018 transcript from the Parliamentary Hearings on law and justice. A few points worth noting.

First: while this is cloaked as a social justice issue, there seems to be no concern for the consequences of the changes sought here. Second: what is wrong with the parents of young children wanting their (intersex) children to have normal lives as a recognized gender? Third: there is the claim that gays are discriminated against because the age of consent is higher than for straight couples. Strange how they always want it lowered, and never propose RAISING it overall.

4. Centre For Gender And Sexual Diversity

Following the introduction of C-39, An Act to amend the Criminal Code (unconstitutional provisions) and to make consequential amendments to other Acts, the CCGSD was excited that the government was looking serious at equalizing age of consent legislation. We applaud the government on including this as is critical step forward. The CCGSD has been asking for this critical change since 2008. This is critical to the LGBTQI2+ communities as the criminalization of consensual sexual acts between Canadians should be seen as equal under the law regardless of your sexual orientation or gender identity

What they refer to as “equalizing the age” of consent was the provision to reduce the age of consent for anal sex from 18 to 16. Normal sex has a minimum age of consent of 16 years old, and even that is relatively recent; it used to be 14. The Centre for Gender and Sexual Diversity has deemed it a “priority” to lower the age of consent — since 2008 — instead of asking for a higher universal standard.

They talk about equality for consensual acts between Canadians, but they don’t mention consensual acts between ADULT Canadians. That detail seems to be left out.

1-Bill C-75 fails to address sex work criminalization
The criminalization of sex work has been ruled unconstitutional by the Supreme court and continues to put Canadian sex workers in danger. Local, provincial and federal police services continue to use existing legislation to harass and criminalize folks who should be allowed to do their job with the support and protection of the state.
We strongly recommend that a clear decriminalization of sex work be included in C-75.

There doesn’t seem to be any moral issues with sex work itself, or the dangers or moral issues it causes. Instead, CCGSD takes issue with there being laws against it.

2-Bill C-75 fails to protect intersex children from non-consensual surgery
In June 2017, the CCGSD came out with our Pink Agenda making it clear that we stand in solidarity with Intersex communities and their right to decide what is best for their bodies, and yet today Section 268(3) of the Criminal Code of Canada allows non-consensual surgery by medical practitioners to alter the bodies of infants and children whom they perceive to be ambiguous (i.e. intersex).
We strongly recommend that the repeal of Section 268(3) be included in C-75.

We can’t have parents attempting to correct birth defects the best way they know how, in order to help their children go about their lives. What is wrong with them simply being normal boys or girls?

3-Bill C-75 fails to repeal the ‘bawdy house’ laws or obscenity laws that disproportionately affect queer and trans people
The ‘bawdy house’ laws have continue to criticized by many LGBTQI2+ organizations, including most recently the coalition of LGBTQ2I+ and allied organizations during the debate on C-66, An Act to establish a procedure for expunging certain historically unjust convictions and to make related amendments to other Acts (http://ccgsd-ccdgs.org/c66). These laws continue to be used to criminalize consensual LGBTQI2+ behaviours, and need to be full repealed.
We strongly recommend that the repeal of the ‘bawdy house’ laws be included in C-75

A bizarre argument. While claiming that gays aren’t perverts, the CCGSD also claims that laws against degeneracy disproportionately impact them. Doesn’t that undermine the original assertion?

5. Vancouver Rape Relief — Domestic Violence

The change to reverse onus bail in cases of male violence against women is an encouraging step to help reduce the number of men who immediately re-offend and attack their female intimate partners. It is a positive step because the onus is on the offender to prove why they should be let out on bail if they have a history of domestic violence. This sends a message that violence against women is a serious crime. It is, however, unfortunate that this reverse onus will not apply to those men without a criminal record for domestic violence, which will include convicted persons who received an absolute or a conditional discharge. What we see from our work is getting a conviction is rare; when it does happen often its a man of colour. As a result, we can see the possibility that something like this will disproportionately affect racialized men, while the majority of men who go without being charged and convicted remain unaccountable and undeterred.

Eliminating the mandatory use of preliminary inquiries as it relates to women who have been sexually assaulted is a positive step. We know from our experience accompanying women to court that preliminary inquiries are used by the defence as an attempt to discredit their testimony by pointing out minute discrepancies from their police statements, their preliminary inquiry evidence and their trial testimonies.

Vancouver Rape Relief brings a few interesting arguments into the discussion. First, they are upset that the “reverse onus” provisions of bail won’t apply to men without past convictions for domestic violence. Second, they support eliminating mandatory use of preliminary inquiries, which are an important step of discovery prior to trial. It doesn’t appear that they actually support the idea of due process.

6. Individuals Opposing Degeneracy Laws

Regarding the last video, the crime itself is failing to disclose one’s HIV status to sexual partners. However, it’s frequently misnamed as “criminalizing people with HIV”. Knowing that the other person has this disease is pretty important, regardless of how deadly it might be.

It’s worth pondering: how many of those people who are okay with not disclosing HIV status to sexual partners would be okay with forcing masks and vaccines on people?

7. Does Anyone Care About These Reductions?

  • Section 58: Fraudulent use of citizenship
  • Section 159: Age of consent for anal sex
  • Section 172(1): Corrupting children
  • Section 173(1): Indecent acts
  • Section 180(1): Common nuisance
  • Section 182: Indecent interference or indignity to body
  • Section 210: Keeping common bawdy house
  • Section 211: Transporting to bawdy house
  • Section 242: Not getting help for childbirth
  • Section 243: Concealing the death of a child
  • Section 279.02(1): Material benefit – trafficking
  • Section 279.03(1): Withholding/destroying docs — trafficking
  • Section 279(2): Forcible confinement
  • Section 280(1): Abduction of child under age 16
  • Section 281: Abduction of child under age 14
  • Section 291(1): Bigamy
  • Section 293: Polygamy
  • Section 293.1: Forced marriage
  • Section 293.2: Child marriage
  • Section 295: Solemnizing marriage contrary to law
  • Section 435: Arson, for fraudulent purposes
  • Section 467.11(1): Participating in organized crime

These are not minor or unimportant crimes. In fairness, there are a few submissions that speak out against the hybridization of these offences (making them eligible to be tried summarily). Who came up with these, though? Why are such crimes being shrugged off? Sure, the terrorism offence penalties caused backlash, but not these. It’s almost as if they wanted to divert attention.

As for watering down terrorism offences, where did that idea come from? CIJA, the Centre for Israel and Jewish Affairs, spoke against some of these provisions. But it’s unclear who the brains behind the proposal was.

Now, it should be noted that changes to the MAXIMUM sentence for certain crimes would make law students and paralegals ineligible to work on such cases. While this is not a defense of criminals, everyone should have access to some representation.

Who was Bill C-75 really designed for? It comes across as if a group wanted to destabilize society, and wrote the Bill collaboratively.

TSCE #9(G): Bit Of History – Bill C-30, Toews Gutting Internet Privacy Under Pretense Of Child Protection

On February 14, 2012, then-Public Safety Minister Vic Toews introduced Bill C-30 in the House of Commons. It would have forced internet providers to hand over customer data — without a warrant — to police during investigations. Even law-abiding people had reason to be concerned, given just how broad and sweeping the Bill was. Anyhow, it never got past First Reading.

1. Trafficking, Smuggling, Child Exploitation

Serious issues like smuggling or trafficking are routinely avoided in public discourse. Also important are the links between open borders and human smuggling; between ideology and exploitation; between tolerance and exploitation; between abortion and organ trafficking; or between censorship and complicity. Mainstream media will also never get into the organizations who are pushing these agendas, nor the complicit politicians. These topics don’t exist in isolation, and are interconnected.

2. Content Of Bill C-30

Obligations Concerning Subscriber Information
Provision of subscriber information
16. (1) On written request by a person designated under subsection (3) that includes prescribed identifying information, every telecommunications service provider must provide the person with identifying information in the service provider’s possession or control respecting the name, address, telephone number and electronic mail address of any subscriber to any of the service provider’s telecommunications services and the Internet protocol address and local service provider identifier that are associated with the subscriber’s service and equipment.
.
Purpose of the request
(2) A designated person must ensure that he or she makes a request under subsection (1) only in performing, as the case may be, a duty or function
(a) of the Canadian Security Intelligence Service under the Canadian Security Intelligence Service Act;
(b) of a police service, including any related to the enforcement of any laws of Canada, of a province or of a foreign jurisdiction; or
(c) of the Commissioner of Competition under the Competition Act.
.
Designated persons
(3) The Commissioner of the Royal Canadian Mounted Police, the Director of the Canadian Security Intelligence Service, the Commissioner of Competition and the chief or head of a police service constituted under the laws of a province may designate for the purposes of this section any employee of his or her agency, or a class of such employees, whose duties are related to protecting national security or to law enforcement.
.
Limit on number of designated persons
(4) The number of persons designated under subsection (3) in respect of a particular agency may not exceed the greater of five and the number that is equal to five per cent of the total number of employees of that agency.
Delegation
(5) The Commissioner of the Royal Canadian Mounted Police and the Director of the Canadian Security Intelligence Service may delegate his or her power to designate persons under subsection (3) to, respectively, a member of a prescribed class of senior officers of the Royal Canadian Mounted Police or a member of a prescribed class of senior officials of the Canadian Security Intelligence Service.

Miscellaneous Provisions
Facility and service information
24. (1) A telecommunications service provider must, on the request of a police officer or of an employee of the Royal Canadian Mounted Police or the Canadian Security Intelligence Service,
(a) provide the prescribed information relating to the service provider’s telecommunications facilities;
(b) indicate what telecommunications services the service provider offers to subscribers; and
(c) provide the name, address and telephone number of any telecommunications service providers from whom the service provider obtains or to whom the service provider provides telecommunications services, if the service provider has that information.

Persons engaged in interceptions
28. (1) A telecommunications service provider must, on the request of the Royal Canadian Mounted Police or the Canadian Security Intelligence Service, provide a list of the names of the persons who are employed by or carrying out work for the service provider who may assist in the interception of communications.

34. (1) An inspector may, for a purpose related to verifying compliance with this Act, enter any place owned by, or under the control of, any telecommunications service provider in which the inspector has reasonable grounds to believe there is any document, information, transmission apparatus, telecommunications facility or any other thing to which this Act applies.
.
Powers on entry
(2) The inspector may, for that purpose,
(a) examine any document, information or thing found in the place and open or cause to be opened any container or other thing;
(b) examine or test or cause to be tested any telecommunications facility or transmission apparatus or related equipment found in the place;
(c) use, or cause to be used, any computer system in the place to search and examine any information contained in or available to the system;
(d) reproduce, or cause to be reproduced, any information in the form of a printout, or other intelligible output, and remove the printout, or other output, for examination or copying; or
(e) use, or cause to be used, any copying equipment or means of telecommunication at the place.
.
Duty to assist
(3) The owner or person in charge of the place and every person in the place must give all assistance that is reasonably required to enable the inspector to perform their functions under this section and must provide any documents or information, and access to any data, that are reasonably required for that purpose.
.
Inspector may be accompanied
(4) The inspector may be accompanied by any other person that they believe is necessary to help them perform their functions under this section.

Entry onto private property
36. An inspector and any person accompanying them may enter private property — other than a dwelling-house — and pass through it in order to gain entry to a place referred to in subsection 34(1). For greater certainty, they are not liable for doing so.
.
Use of force
37. In executing a warrant to enter a dwelling-house, an inspector may use force only if the use of force has been specifically authorized in the warrant and they are accompanied by a peace officer.

Does this sound like it’s about protecting kids online? The CPC became notorious for gaslighting Canadians over privacy concerns with the line: “Either you’re with us, or you’re with the child pornographers”. Concerns over this Bill weren’t limited to criminals and child predators. Anyone with any expectation of privacy from internet providers should be alarmed.

Remember the days when “Conservatives” at least pretended to care about personal freedoms, such as privacy and property rights?

Who’s to say that elements of this won’t be (or haven’t already been) slipped into other pieces of legislation? If it were arranged in a more piecemeal fashion, it could pass.

3. Backlash Felt Over Privacy Concerns

Following the predictable public outrage, Toews backed down almost immediately, saying he would entertain amendments to the Bill. At that time, the Conservative Party held a majority in Parliament, so they could have passed it if they wanted to. In the end, Bill C-30 didn’t get past First Reading, and died in that session of Parliament.

TSCE #12(C): Twitter Sued For (Allegedly) Refusing To Remove Child Exploitation Material

Twitter is being sued in U.S. District Court in the Northern District of California. It’s alleged that Twitter refused to take down pornographic material, even after becoming aware that minors were involved and were being exploited. The site, endsexualexploitation.org, posted a copy of the complaint. The names were redacted in the papers to protect the identities of the family.

Just a reminder: at this point, it is just accusations against Twitter.

1. Trafficking, Smuggling, Child Exploitation

Serious issues like smuggling and trafficking are routinely avoided in public discourse. Also important are the links between open borders and human smuggling; between ideology and exploitation; between tolerance and exploitation; between abortion and organ trafficking; and between censorship and complicity. Mainstream media will also never get into the organizations that are pushing these agendas, nor the complicit politicians. These topics don’t exist in isolation; they are interconnected.

2. Important Links

Twitter CP Remained Up Lawsuit Filed Statement Of Claim
Endsexualexploitation.org Website Link
Interview With Epoch Times — American Thought Leaders
Twitter T.O.S.: Child Sexual Exploitation Policies
https://archive.is/PVP1w
Twitter Medical Misinformation Policies
https://archive.is/RLwRi
Twitter Misleading Information Updates
https://archive.is/zoqrD

3. Epoch Times Interviews Plaintiff’s Lawyer

Lisa Haba, lawyer for the victim, gave an interview with Jan Jekielek of Epoch Times a few days ago. This is well worth a watch. They bring up several interesting topics, including using Section 230 as a legal defense.

4. Quotes From The Lawsuit Against Twitter

This is a civil action for damages under the federal Trafficking Victims’ Protection Reauthorization Act (“TVPRA”), 18 U.S.C. §§ 1591 and 1595, Failure to Report Child Sexual Abuse Material, 18 U.S.C. § 2258A, Receipt and Distribution of Child Pornography, 18 U.S.C. §§ 2252A, and related state law claims arising from Defendant’s conduct when it knowingly hosted sexual exploitation material, including child sex abuse material (referred to in some instances as child pornography), and allowed human trafficking and the dissemination of child sexual abuse material to continue on its platform, therefore profiting from the harmful and exploitive material and the traffic it draws.

1. Sex trafficking is a form of slavery that illegally exists in this world—both throughout the United States and globally—and traffickers have been able to operate under cover of the law through online platforms. Likewise, those platforms have profited from the posting and dissemination of trafficking and the exploitative images and videos associated with it.

2. The dissemination of child sexual abuse material (CSAM) has become a global scourge since the explosion of the internet, which allows those that seek to trade in this material to equally operate under cover of the law through online platforms.

3. This lawsuit seeks to shine a light on how Twitter has enabled and profited from CSAM on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity.

4. With over 330 million users, Twitter is one of the largest social media companies in the world. It is also one of the most prolific distributors of material depicting the sexual abuse and exploitation of children.

28. Twitter explains how it makes money from advertising services as follows:
.
We generate most of our advertising revenue by selling our Promoted Products. Currently, our Promoted Products consist of the following:
.
• Promoted Tweets. Promoted Tweets, which are labeled as “promoted,” appear within a timeline, search results or profile pages just like an ordinary Tweet regardless of device, whether it be desktop or mobile. Using our proprietary algorithms and understanding of the interests of each account, we can deliver Promoted Tweets that are intended to be relevant to a particular account. We enable our advertisers to target an audience based on an individual account’s interest graph. Our Promoted Tweets are pay-for-performance or pay-for-impression delivered advertising that are priced through an auction. Our Promoted Tweets include objective-based features that allow advertisers to pay only for the types of engagement selected by the advertisers, such as Tweet engagements (e.g., Retweets, replies and likes), website clicks, mobile application installs or engagements, obtaining new followers, or video views.

65. In 2017, when John Doe was 13-14 years old, he engaged in a dialog with someone he thought was an individual person on the communications application Snapchat. That person or persons represented to John Doe that they were a 16-year-old female and he believed that person went to his school.

66. After conversing, the person or persons (“Traffickers”) interacting with John Doe exchanged nude photos on Snapchat.

67. After he did so the correspondence changed to blackmail. Now the Traffickers wanted more sexually graphic pictures and videos of John Doe, and recruited, enticed, threatened and solicited John Doe by telling him that if he did not provide this material, then the nude pictures of himself that he had already sent would be sent to his parents, coach, pastor, and others in his community.

68. Initially John Doe complied with the Traffickers’ demands. He was told to provide videos of himself performing sexual acts. He was also told to include another person in the videos, to which he complied.

69. Because John Doe was (and still is) a minor and the pictures and videos he was threatened and coerced to produce included graphic sexual depictions of himself, including depictions of him engaging in sexual acts with another minor, the pictures and videos constitute CSAM under the law.

70. The Traffickers also attempted to meet with him in person. Fortunately, an in person meeting never took place.

85. John Doe submitted a picture of his drivers’ license to Twitter proving that he is a minor. He emailed back the same day saying:

91. On January 28, 2020, Twitter sent John Doe an email that read as follows:
.
Hello,
.
Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.
.
If you believe there’s a potential copyright infringement, please start a new report.
.
If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it.
.
Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities. Taking screenshots of the Tweets is often a good idea, and we have more information available for law enforcement about our policies.
.
Thanks,
Twitter

In short, the victim met someone online who was pretending to be someone else, and was tricked into sending nude photos under false pretenses. The teen, who is still a minor today, was then blackmailed into sending more.

Some of this was posted on Twitter. Despite verifying the victim’s age and identity, Twitter refused to remove the content, saying it found no violations of its terms of service. Only after Homeland Security stepped in did Twitter finally comply.

Interestingly, almost half of the complaint against Twitter consists of copies of its own rules, policies, and terms of service. Twitter has rules on the books to prevent exactly this type of thing, but (allegedly) refused to act when the matter was brought to its attention.

The comment about “potential copyright infringement” comes across as a slap in the face. That was clearly never the concern of the child.

Twitter has not filed a response, so we’ll have to see what happens next.

5. Current Twitter Policy On Exploiting Minors

Child sexual exploitation policy
Overview
October 2020
.
We have a zero-tolerance child sexual exploitation policy on Twitter.
.
Twitter has zero tolerance towards any material that features or promotes child sexual exploitation, one of the most serious violations of the Twitter Rules. This may include media, text, illustrated, or computer-generated images. Regardless of the intent, viewing, sharing, or linking to child sexual exploitation material contributes to the re-victimization of the depicted children. This also applies to content that may further contribute to victimization of children through the promotion or glorification of child sexual exploitation. For the purposes of this policy, a minor is any person under the age of 18.

What is in violation of this policy?
Any content that depicts or promotes child sexual exploitation including, but not limited to:
-visual depictions of a child engaging in sexually explicit or sexually suggestive acts;
-illustrated, computer-generated or other forms of realistic depictions of a human child in a sexually explicit context, or engaging in sexually explicit acts;
-sexualized commentaries about or directed at a known or unknown minor; and
-links to third-party sites that host child sexual exploitation material.

The following behaviors are also not permitted:
-sharing fantasies about or promoting engagement in child sexual exploitation;
-expressing a desire to obtain materials that feature child sexual exploitation;
-recruiting, advertising or expressing an interest in a commercial sex act involving a child, or in harboring and/or transporting a child for sexual purposes;
-sending sexually explicit media to a child;
-engaging or trying to engage a child in a sexually explicit conversation;
-trying to obtain sexually explicit media from a child or trying to engage a child in sexual activity through blackmail or other incentives;
-identifying alleged victims of childhood sexual exploitation by name or image; and
-promoting or normalizing sexual attraction to minors as a form of identity or sexual orientation.

At least on paper, Twitter has very strong policies against the sort of behaviour outlined in the California lawsuit. It’s baffling that Twitter wouldn’t immediately remove the content. This isn’t a hill worth dying on for any company.

Twitter can, and does, suspend accounts for insulting pedophiles and making comments about death or castration. Yet, this incident wasn’t against their terms of service.

6. Title 47, CH 5, SUBCHAPTER II Part I § 230

(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1)Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).[1]
(d) Obligations of interactive computer service
A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.

The “Section 230” that is commonly referenced comes from the 1996 Communications Decency Act. This gave platforms, both existing and those that came later, significant legal protections. They were considered platforms, not publishers.

The distinction between platforms and publishers seems small, but is significant. Platforms are eligible for certain benefits and tax breaks, but cannot (except in limited circumstances) be held liable. Publishers, however, can be much more discriminatory about what they allow to be shown.

The wording is such that it gives platforms wiggle room to apply their own take on what material is considered offensive.

It has been suggested that Twitter could rely on its Section 230 protections, but that would not shield it from penalties for criminal actions. The allegations made in this lawsuit are not just civil, but criminal in nature.

While Twitter may not be liable for everything that goes on, this particular incident was brought to their attention. They asked for identification and age verification, received it, and then decided there was no violation of their terms of service. So claiming ignorance would be extremely difficult.

7. Loss On Social Media Anonymity?!

One issue not discussed as much is a potential consequence of legal actions against platforms like Twitter. Will this lead to the loss of anonymous accounts? Might identity verification come as an unintended consequence?

While no decent person wants children, or anyone else, to be taken advantage of, there is a certain security in knowing that online and private life can be kept separate. This is the era of doxing, harassment and stalking, and as such, there are legitimate concerns for many people. This is especially true for those discussing more controversial and politically incorrect topics.

Do we really want things to go the way of Parler, who began demanding Government issued I.D., and then had a “data breach”?

8. Twitter Policies On “Medical Misinformation”

https://twitter.com/TwitterSafety/status/1267986500030955520
https://twitter.com/Policy/status/1278095924330364935
http://archive.is/fHoLx
https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html

This topic is brought up to show how selective Twitter’s commitment is to free speech and to dissenting viewpoints. Even a charitable interpretation would be that there is political bias in how the rules and standards are applied.

Strangely, Twitter takes a more thorough approach to monitoring and removing tweets and accounts for promoting “medical misinformation”. Despite there being many valid questions and concerns about this “pandemic”, far more of that is censored. Odd priorities.

Yet child porn and exploiting minors can remain up?

Who’s Pulling Steven Guilbeault’s Strings? (Part 2: Anti-Free Speech, Privacy)

Last year, Steven Guilbeault (rightfully) took a lot of criticism for the recommendation that media outlets be forced to obtain licenses. He later backtracked somewhat, claiming that news outlets would be exempt. Now, he’s back, pushing hate speech laws.

A disclaimer: it’s entirely possible, even likely, that there are groups pushing for these laws that are not listed publicly. However, everything listed here is documented information.

Worth noting: the original focus of the bill was “hate speech”. Sending pornography or lewd images was just an afterthought. Still, this raises privacy concerns, not just free speech concerns.

See Part 1 for Guilbeault’s ties to the eco-movement.

1. Free Speech Is Under Constant Threat

Check here for the series on free speech. It’s a crucial topic, and is typically intertwined with other categories. Topics include: Digital Cooperation; the IGF, or Internet Governance Forum; ex-Liberal Candidate Richard Lee; the Digital Charter; Dominic LeBlanc’s proposal. There is also collusion, done by UNESCO, more UNESCO, Facebook, Google, and Twitter lobbying.

2. The Media Is Not Loyal To The Public

Truth is essential in society, but the situation in Canada is worse than people imagine. MSM in Canada (and elsewhere) has been largely obedient to the official stories, since they are subsidized to do so, though they deny it. Post Media controls most outlets in Canada, and many “independents” have ties to Koch/Atlas. Real investigative journalism is needed, and some pointers are provided.

3. Important Links

https://twitter.com/s_guilbeault/status/1351219226711912454
https://twitter.com/s_guilbeault/status/1351219225302618117
Office Of The Lobbying Commissioner Of Canada
Canadian Parliament Discusses Online Hate
(Audio) Testimony Into Online Hate
Toronto Sun On Hate Crime Hoax
National Post Shrugs Off Hate Crime Hoax
National Council Of Canadian Muslims Lobbying
Centre For Israel And Jewish Affairs Lobbying
Friends Of Canadian Broadcasting Lobbying
YWCA Receives $760,000 Anti-Hate Grant
Various Initiatives/Grants From Ottawa In Recent Years
Bill C-30, Vic Toews, Online Privacy, Pornography

4. The Downside To “Hate Speech” Laws

To begin with, let’s address the elephant in the room: hate speech laws can be, and often are, used to silence legitimate concerns and criticisms. Worse, they are applied unevenly. When very different groups with different cultures and values are brought together, how society will operate is a fair topic for discussion. What will be expected, what compromises will be made, and how differences will be settled must all be addressed.

Regardless of whether a person prefers a more assimilationist approach, or is more libertarian, hard questions have to be asked. When such questions cannot be asked — because of hate speech laws — it doesn’t erase the concerns, but simply erodes public trust.

Banning valid discussion with false accusations of racism, or false claims of violence, does nothing to advance open discourse. Instead, it’s used to gaslight and prevent necessary discussion.

Is this a call to violence, or to condone violence? Certainly not. But all too often, ideas and violence are wrongly conflated.

5. Hate Crime Hoaxes Undermine Public Trust

Now Toronto Police say the alleged attack on an 11-year-old girl wearing a hijab last week was a hoax. In other words, the hijabi girl and her brother simply made up the story.

We still don’t know enough whether this incident was orchestrated to further entrench the sense of victimhood among Canada’s Muslims or if it was a tale made up by the 11-year-old girl to cover up some other incident.

Khawlah Noman isn’t the first Muslim girl to pull off such a hoax, but she surely must be the youngest to do so.

Another valid question must be asked. Before passing censorship laws to combat hate speech and related crimes, how many incidents actually happened, and how many are hoaxes? Before considering such laws, it’s important to know the full scale of the problem. However, some outlets continue with the narrative, even when hoaxes are exposed.

6. Canadian Parliament On Online Hate

Check this page for information on a Parliamentary study in Canada concerning online hate. Witnesses were called to give more insight into the topic. While there was a lot of reasonable discussion, one problem remains: it’s far too easy to demonize people by CLAIMING that certain topics are hate and violence.

7. National Council Of Canadian Muslims

Subject Matter Details
Legislative Proposal, Bill or Resolution
Canadian Human Rights Act and Online Hate, respecting the repealed section 13 of the CHRA and opening the Act for legislative review.
.
Legislative Proposal, Bill or Resolution, Policies or Program, Regulation
Security & Targeted Communities: Advocating for policies to enhance the security and safety of Canadian Muslim communities and other at-risk communities given the rise in hate crimes, including the Security Infrastructure Program; countering white supremacist groups
.
Policies or Program
Anti-racism: Advocating for policy initiatives in the Department of Canadian Heritage related to combating Islamophobia and discrimination, including the updating of Canada’s Action Plan Against Racism (CAPAR); Supporting various programs to promote diversity and inclusion in Canada.
Religion: Advocating for the protection of freedom of religion in Canada and with respect to the reasonable accommodation of religious observances.

One of the groups lobbying Guilbeault is the National Council of Canadian Muslims. They claim that “white supremacists” are causing a hateful environment, and that more diversity and inclusion is needed. Of course, ask how THEY accommodate minorities, and that’s hate speech.

Also noteworthy: Walied Soliman, Erin O’Toole’s Chief of Staff, is a member of the NCCM. He’s on record as supporting their activities.

8. CIJA, Centre For Israel And Jewish Affairs

Subject Matter Details
Grant, Contribution or Other Financial Benefit
Digital Citizen Contribution Program (DCCP): The objective of the project is to combat online disinformation and hate, specifically, antisemitism and antisemitic conspiracy theories related to COVID-19 where it is spreading: online via social media. Antisemitism cannot be allowed to permeate civil discourse and become mainstream.
-Activities include:
•Collect examples of how antisemitism presents itself in the context of COVID19
•Create website landing page for campaign to highlight the campaign’s purpose and goals
•Prepare social media calendar for the duration of the campaign
•Prepare Facebook ads, prepare toolkit to distribute to partner organizations to promote the campaign
•Program content for campaign, run Facebook ads, and ensure participation from various cultural groups; and
•Report to government and stakeholders on the outcome of the campaign. The Digital Citizen Contribution Program (DCCP) supports the priorities of the Digital Citizen Initiative by providing time-limited financial assistance that will support democracy and social cohesion in Canada in a digital world by enhancing and/or supporting efforts to counter online disinformation and other online harms and threats to our country’s democracy and social cohesion.
-Provide economic support for the charitable and not-for-profit sector through a direct granting program. Donations from Canadians should be incentivized through a temporary enhancement of the charitable giving tax credit, or through a donor matching program, whereby the government matches donations from Canadians.
-Public Security threats to the safety and security of the Jewish community of Canada and the extension of funding of capital costs and staff training for security of communities at risk
-The project ‘United Against Online Hate’ aims to develop a national coalition with numerous targeted communities to actively combat online hate, following recommendations from the study conducted by the House of Commons Standing Committee on Justice and Human Rights. We have been granted $141,000 for the government’s current fiscal year (ending March 31 2021). We were also awarded $31,800 for the year April 1 2021 to March 31 2022.

The page on lobbying information is very long, but well worth a read. A lot of effort has clearly gone into writing and updating this.

9. Friends Of Canadian Broadcasting

Subject Matter Details
.
Legislative Proposal, Bill or Resolution
Canadian Heritage Committee study of online hate and illegal content and promised legislation
Possible amendment to Section 19 of the Income Tax Act respecting the deductibility of digital advertising on non-Canadian platforms
Review of the Broadcasting and Telecommunications Acts with respect to the promotion of Canadian culture and democracy.
.
Policies or Program, Regulation
Broadcasting policy: regulation, funding, licensing, Canadian programming, media concentration and restrictions on foreign ownership, equal enforcement of the Broadcasting Act, application of the Broadcasting Act to non-traditional media, support for public broadcasting, independence of CBC/Radio Canada and other related governance concerns, protecting Canadian content on air and online.

This lobbying actually covers a number of topics, but online hate is one of them.

10. YWCA, Others Get Federal Grants

October 20, 2020 – Toronto, Ontario
.
The Government of Canada is committed to taking action against online hate and preventing the promotion of racism and violence. Today, the Minister for Public Safety and Emergency Preparedness, the Honourable Bill Blair, announced $759,762 to YWCA Canada for their project Block Hate: Building Resilience against Online Hate Speech.

The four-year project will examine hate speech trends across Canada and work with experts to develop online tools and digital literacy training for young Canadians aged 14 to 30 across ten communities.

The YWCA will bring together partners from digital industry, civil society, government, and academia to better understand online hate in Canada, support those targeted by hate speech, inform technical solutions to online hate, hate crime, and radicalization to violence, and increase community resilience.

The YWCA received a grant from the Federal Government, but it is hardly alone in that. Fighting online hate and hate speech appears to be a growth industry.

One also has to ask how such hate speech regulations would be enforced. What information would internet providers or cell phone companies have to provide? What would the process and limits for that be? What privacy protections would be in place?

11. Vic Toews, Online Privacy, Bill C-30

Since the proposal did mention punishing the sharing of images (even as an afterthought), let’s address this. It was in 2012 that “Conservative” Public Safety Minister Vic Toews tried to bring in Bill C-30, which could have forced online providers to hand over private information without a warrant. Toews dismissed privacy concerns, accusing critics of “siding with the child pornographers”. While the Bill died at First Reading, could something like this happen again?

12. What Are Impacts On Free Speech? Privacy?

What will this bill look like, and what are the impacts? Until the legislation is tabled, we won’t know for sure. Even then, amendments are quite likely, as are court challenges.

This shouldn’t have to be repeated, but here it is: being critical of “hate speech” laws for being overreaching does not equate to supporting hate or violence. All too often, false accusations of racism, hate and bigotry are used to silence legitimate concerns and questions.

Vic Toews vilified critics of warrantless searches as “pedophile sympathizers”. Could this iteration lead to critics being smeared as “Nazi supporters”? Will a provision for warrantless searches be slipped in?

It’s also possible that such legislation will be scrapped altogether. After all, Guilbeault supported mandatory media licensing only last year, but backed down under heavy pressure. This is an important story to keep an eye on.