This week, Australia’s eSafety Commissioner once again called out big tech companies for failing to combat child sexual exploitation. This second report comes just eight months after the commissioner issued legal notices to companies including Twitter, Google, TikTok, Twitch and Discord under Australia’s Online Safety Act, compelling them to answer questions about the measures they have taken to address the problem.

In the report, Commissioner Inman Grant said: “We really cannot hope to have any accountability from the online industry in tackling this issue without meaningful transparency, which is why we designed this notice.” However, the latest report disclosed that some tech companies had failed to comply, and the commissioner expressed deep disappointment in their responses to questions relating to the protection of children and the most egregious forms of online harm.
eSafety Commissioner Has Released Its Second Report Calling Out Tech Companies
As mentioned above, some tech companies failed to comply with those legal notices, prompting the eSafety Commissioner to release a second report calling them out. Unsurprisingly, Twitter faced the harshest criticism, as the company is already embroiled in controversy over its mishandling of information amidst the Israel-Hamas conflict.
In the report, the eSafety Commissioner highlighted that Twitter failed to answer some questions and left others blank. For example, Twitter gave no reasonable response to questions about how long the platform takes to respond to reports of child sexual exploitation, what measures it has in place to detect child sexual exploitation in live streams, and which tools and technologies it uses to detect such material.
Google also did not escape the eSafety Commissioner’s scrutiny, having given generic responses to some of the questions. Inman Grant added: “If Twitter and Google cannot answer key questions about how they tackle child sexual exploitation, they either do not want to answer for how it might be perceived publicly, or they need better systems to scrutinise their own operations. Both scenarios are concerning and suggest they are not living up to their responsibilities and the expectations of the Australian community.”
Key Report Findings

Besides the two companies highlighted above, the second report also covered YouTube, TikTok, Twitch and several other tech companies. Below are some key findings from the report.
- While YouTube, TikTok and Twitch are taking steps to detect child sexual exploitation in live streams, Discord is not, saying the costs are prohibitive. Twitter did not provide the requested information.
- TikTok and Twitch use language analysis technology to detect child sexual exploitation and abuse (CSEA) activity, such as sextortion, across all parts of their services, while Discord does not use any such detection technology.
- Twitter/X uses such tools for public content but not for direct messages. Google uses this technology on YouTube but not in Chat, Gmail, Meet or Messages.
- Google (except its search services) and Discord do not block links to known child sexual exploitation material, despite the availability of databases from expert organisations such as the UK-based Internet Watch Foundation (a minimal sketch of this kind of link-blocking follows this list).
- In the three months after Twitter/X changed ownership in October 2022, its proactive detection of child sexual exploitation material decreased from 90% to 75%.
- On partly community-moderated services such as Discord and Twitch, professional safety staff are not automatically notified when a volunteer moderator identifies child sexual exploitation and abuse material.
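To make the link-blocking point above concrete, here is a minimal sketch, in Python, of how a platform might check outbound links against a hash list of known child sexual abuse URLs, such as the lists expert bodies like the Internet Watch Foundation license to platforms. Everything here is an illustrative assumption: the function names, the SHA-256 scheme and the normalisation rules are not the IWF’s actual format or any platform’s real implementation.

```python
import hashlib
from urllib.parse import urlsplit, urlunsplit

# Illustrative only: expert bodies such as the Internet Watch Foundation
# license lists of known-abuse URLs to platforms. The hashed-set membership
# test below is an assumed scheme for demonstration, not the IWF's format.

def normalise_url(url: str) -> str:
    """Lower-case the scheme and host and strip a trailing slash,
    so trivially different spellings of the same URL compare equal."""
    parts = urlsplit(url.strip())
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.rstrip("/"), parts.query, parts.fragment))

def hash_url(url: str) -> str:
    """Hash the normalised URL so the blocklist never stores plain URLs."""
    return hashlib.sha256(normalise_url(url).encode("utf-8")).hexdigest()

def is_blocked(url: str, blocklist_hashes: set[str]) -> bool:
    """Return True if the URL matches an entry in the hashed blocklist."""
    return hash_url(url) in blocklist_hashes

# Hypothetical usage: in practice the set would be populated from a
# licensed feed, and the check applied when a message or post is submitted.
blocklist = {hash_url("https://example.org/known-abuse-page")}
print(is_blocked("https://example.org/known-abuse-page/", blocklist))  # True
print(is_blocked("https://example.org/harmless-page", blocklist))      # False
```

The point of the sketch is simply that, once such a list is licensed, the matching itself is a cheap set-membership check, which underlines the report’s observation that the databases already exist.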
 
The full report is available on the eSafety Commissioner’s website.
Australia Has Fined Twitter $386,000 for Empty Promises in Combating Child Sexual Exploitation
To underscore the importance of combating child sexual exploitation, Australia has fined Twitter $386,000 for failing to disclose how it tackles child sexual abuse material. Inman Grant noted: “Twitter has publicly stated that tackling child sexual exploitation is the number 1 priority for the company, but it cannot just be empty talk. We need to see words backed up with tangible action.”
When asked why Twitter was singled out rather than other tech companies, the eSafety Commissioner replied: “Their answers revealed troubling shortfalls and inconsistencies. Twitter’s failure to comply was more urgent than other companies.”
Twitter now has 28 days to either request a withdrawal of the notice or pay the fine. The platform has not yet responded, so it remains unclear how the situation will play out.
What More Must Tech Companies Do?
While it is easier said than done, tech companies must do more to stop the rise of online child sexual exploitation.
For instance, the National Center for Missing & Exploited Children (NCMEC) in the United States received 32 million reports of child sexual exploitation and abuse from tech companies in 2022 alone, including 49.4 million images and 37.7 million videos. Moreover, a recent Australian Child Maltreatment Study (ACMS) report found that 28.5 per cent of Australians had experienced sexual abuse before the age of 18. These figures are alarming.
Fortunately for users, if embarrassingly for the tech companies, a global coalition of more than 100 sexual abuse survivors, families and child safety experts has sent a letter to tech executives with recommendations for combating child sexual exploitation. Here are the three main recommendations the letter proposes:
- Tech companies should seek the perspectives of users whose rights and safety have been eroded by their products.
- They should commit to not pursuing products or services that harm users unless stringent safeguards, informed by the perspectives of survivors, are in place first. This includes the rollout of end-to-end encryption.
- Tech companies should create and publish impact assessments that determine who will be negatively affected by product decisions and what mitigation measures are in place to protect their rights and safety.
 
Concluding Statement

Despite public movements and efforts to combat child sexual exploitation, there is a limit to the impact they can have. It is not that social networking companies have made no efforts to tackle these problems in the past; clearly, those efforts have not been effective. Among the most talked-about is Apple’s parental control feature that detects and flags when children send or receive nude photos in a text message. Another challenge for tech companies is the lack of an interoperable platform or initiative that runs across all kinds of devices to ensure such sensitive content is erased from the hardware as well as the cloud.
That said, it is pertinent that parents be educated about the perils the vast internet poses to their children. At the same time, the onus is also on tech companies to democratise such safety tools so they are easily accessible even to those who are not particularly tech savvy. As Stephen Covey once said, what you do has a far greater impact than what you say. If you have any thoughts you would like to convey, you can contact us on our Facebook, Instagram and Twitter pages.
Frequently Asked Questions
What Is the eSafety Commissioner?
eSafety is Australia’s independent regulator for online safety. It is responsible for raising Australians’ awareness of online safety risks and for helping remove harmful content such as cyberbullying material targeting children, online abuse of adults and child sexual exploitation material. It can also take regulatory action against illegal and restricted online content.
What Other Efforts Have People Made to Combat Child Sexual Exploitation?
Unfortunately, the public seems to be doing more to combat child sexual exploitation than the tech companies. For example, ChildFund International, a leading child protection nonprofit, launched the #TakeItDown campaign to build public and legislative support pressuring tech companies to voluntarily search for and take down child sexual abuse material from their platforms. The campaign also educates overwhelmed parents about the problem and explains why parents are not solely responsible for their children’s online safety.
Which Tech Company Submitted the Most Reports of Online Child Sexual Abuse?
In the same NCMEC report, Meta – the owner of Facebook, Instagram and WhatsApp – made around 27 million reports of child sexual exploitation and abuse material in 2022. By contrast, Apple, with its billions of handsets and iPads all connected to iCloud, reported just 234.