Protect children from porn and predators: Our urgent message to Big Tech

Draft Codes leave major loopholes for child sexual exploitation

In response to an eSafety directive, tech industry bodies have submitted their first draft Codes of Practice. The Codes are intended to spell out how platforms plan to reduce access and exposure to certain categories of harmful online material, known as Class 1A and Class 1B material. We believe that, if approved as they stand, the draft Codes will fall well short of protecting children from predators and the predatory online porn industry.

We gave our feedback on the Codes in a submission to ONLINESAFETY.ORG.AU. We made 17 recommendations that we believe will improve online safety for children:

  1. Codes must include mandatory time limits on responding to complaints.
  2. Providers should make detailed data available on all complaints.
  3. Providers should give end-users access to a mechanism for escalating a complaint to a third party if they are dissatisfied with the provider’s response.
  4. Codes should remove clauses specifying that end-user accounts are terminated only if the end-user intended to cause harm.
  5. Social media platforms must not tolerate violations of laws prohibiting CSAM. Codes should remove the requirement that an end-user must have “repeatedly violated terms and conditions, community standards, and/or acceptable use policies” before action is taken.
  6. Industry should report CSAM regardless of where the abuse is occurring or to whom.
  7. Industry must invest in tools and resources to enable providers to detect and deal with first-generation, existing, and live-streamed CSAM.
  8. Industry must address live-streamed CSAM with available technology and continuing investment in innovation and resources.
  9. Industry codes must explicitly prohibit sexual discussions and other degrading and exploitative treatment of minors.
  10. Industry codes must explicitly prohibit paedophilic networking, including the use of red flag terms known for use in connecting sexual predators and aiding trade in child sexual abuse material.
  11. The 24-hour timeframe for reporting an identified instance of CSAM that poses an immediate threat to the life or health of an adult or child should be changed to “immediately”, or at minimum to a two-hour timeframe.
  12. Industry codes must explicitly prohibit monetisation of children’s content.
  13. Industry codes must explicitly prohibit the promotion of off-site monetised children’s content.
  14. Industry codes must prohibit the use of preteen children (or children whose age is less than the platform’s approved user age) in paid promotions.
  15. Remove the suggestion that requiring a user to declare their date of birth during account registration is sufficient, as this is an ineffective method of verifying age.
  16. Industry must use existing tools to detect behavioural signals and CSAM materials in end-to-end encrypted services.
  17. Industry must invest in tools and resources to enable providers to detect and deal with first-generation, existing, and live-streamed CSAM in end-to-end encrypted services.

We don't want any more empty words from Big Tech about 'zero tolerance' for child exploitation on their platforms. If they want us to believe they care about protecting children, they need to demonstrate it. They can start by presenting robust Industry Codes that actually prioritise children's safety over the interests of the porn industry and of men who want to prey on children.

Click here to read our full submission.

See also

Submission to Social Media and Online Safety inquiry

Twitter + Instagram remove child exploitation material after we exposed them

Children over profit: Big tech needs to protect children from predators, porn - Our Submission on Online Safety (Basic Online Safety Expectations - BOSE)