Australia’s online safety watchdog said on Monday it had fined X — the social media platform formerly known as Twitter — 610,500 Australian dollars ($385,000) for failing to fully explain how it tackled child sexual exploitation content.
Legal transparency notices
The eSafety Commission issued legal transparency notices early this year to X and other platforms questioning what they were doing to tackle a proliferation of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse.
eSafety Commissioner Julie Inman Grant said X and Google had not complied with the notices because both companies had failed to adequately respond to a number of questions.
The platform, renamed X by its new owner, Elon Musk, was the worst offender, Inman Grant said, providing no answers to some questions, including how many staff had remained on its trust and safety team, which works to prevent harmful and illegal content, since Musk took over.
“If you’ve got a basic H.R. (human resources) system or payroll, you’ll know how many people are on each team,” she added.
X did not immediately respond to a request for comment.
After Musk completed his acquisition of the company in October last year, he drastically cut costs and shed thousands of jobs.
X could challenge the fine in the Australian Federal Court. But the court could impose a fine of up to 780,000 Australian dollars ($493,402) per day since March, when the commission first found the platform had not complied with the transparency notice.
The commission would continue to pressure X through notices to become more transparent, Inman Grant said.
“They can keep stonewalling and we’ll keep fining them,” she said.
The commission issued Google with a formal warning for providing “generic responses to specific questions,” a statement said.
Google regional director Lucinda Longcroft said the company had developed a range of technologies to proactively detect, remove and report child sexual abuse material.
“Protecting children on our platforms is the most important work we do,” Longcroft said in a statement. “Since our earliest days we have invested heavily in the industrywide fight to stop the spread of child sexual abuse material,” she added.