Australia's online safety watchdog has fined the social network X, owned by billionaire Elon Musk since October last year.
The fine, amounting to $386,000, was imposed after the social network refused to cooperate with the regulator in an investigation into its practices for combating child abuse.
In November last year, Elon Musk posted on the platform that removing child sexual exploitation material was X's No. 1 priority. In the comments to that post, he also urged users to report relevant content for the social network to review.
The eSafety Commission, Australia's independent online safety regulator, accused X of empty rhetoric about fighting child exploitation. Its concerns extend beyond Musk's social network to Google, TikTok, Twitch, Discord, Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle. At the same time, the regulator noted that X and Google failed to comply with notices issued to them at the beginning of this year requiring work to protect children from exploitation. With this statement, the watchdog implied that these two companies have been the least responsive and the least willing to correct shortcomings among all the firms named.
Under Australian legislation that came into force in 2021, Internet companies are required to respond to requests from the regulator with information about the methods they use to ensure safety online. Violation of these rules carries a fine, and a company that fails to fulfill its obligations under that penalty faces legal proceedings.
Google received a warning for failing to comply with the watchdog's request for information on how it screens material published on its platforms for child abuse content.
In X's case, the violations were more serious. The Australian regulator said the social network left some of its questions entirely unanswered. The watchdog clarified that the platform did not report how long it takes to respond to reports of abuse against children, did not describe what measures it takes to detect the sexual exploitation of minors in live broadcasts, and did not disclose the tools and technologies it uses to identify child abuse material.
X confirmed to the watchdog that after cutting its workforce by 80%, the platform lost the employees whose work involved compliance with Australian government policy. Before Elon Musk's acquisition, two employees were responsible for ensuring that content published on the social network complied with Australian law.
The watchdog claims that in the three months after the change of ownership, the rate of proactive detection of child exploitation material fell from 90% to 75%. The regulator's report notes, however, that the situation improved in 2023.
eSafety Commissioner Julie Inman Grant said the spread of child abuse material is a growing problem not only in Australia but around the world. She also noted that technology companies have a moral responsibility to protect minors from abuse that is stored, transmitted, and perpetrated through their services.
Julie Inman Grant says that the failure of X and Google to answer key questions about their practices for countering child abuse may indicate either that the companies are unsure how the public would react to their methods, or that these Internet giants need to build better systems for analyzing their own operations.
The eSafety Commissioner also reported that industry codes and standards aimed at ensuring online safety will come into force in Australia next year.
Julie Inman Grant says that in addressing this problem, accountability from the online industry alone is not sufficient to guarantee a positive outcome.
In September, Australian researchers criticized X for disabling a feature that allowed users to report election misinformation. The decision drew particular concern because access to the feature was cut off on the eve of the referendum on expanding the rights of Australia's Indigenous peoples.
As we reported earlier, X is testing three tiers of its premium service.