🗞️ Tech firms are being questioned about their response to violent extremism and terrorism.


Google, Meta, Twitter/X, WhatsApp, Telegram, and Reddit have all received legal notices from Australia's eSafety Commissioner requiring them to provide information on the measures they are taking to shield Australians from violent extremist content and activities, including terrorist threats.

The 2019 terrorist attack in Halle, Germany, and the 2022 attack in Buffalo, New York, made clear that violent extremists can use social media and other online services to spread radicalisation and threaten public safety. This raises concerns about the spread of such material and its role in online radicalisation, both in Australia and internationally. The six companies must respond to a series of in-depth questions about how they are addressing the problem under the Online Safety Act's transparency powers, which allow the online safety regulator to issue the notices.

With a bipartisan push for harsher penalties gaining momentum, social media companies are being warned to be more proactive in removing offensive content and false information. Calls for stricter penalties have grown since Saturday's Sydney shopping centre massacre, after disturbing video of the incident was posted online and false information circulated.

eSafety Commissioner Julie Inman Grant said eSafety continues to receive reports of perpetrator-produced material from terror attacks, including the 2019 terrorist attack in Christchurch, being reshared on mainstream platforms.

“We remain concerned about how extremists weaponise technology like live-streaming, algorithms and recommender systems and other features to promote or share this hugely harmful material,” Ms Inman Grant said.  

“We are also concerned by reports that terrorists and violent extremists are moving to capitalise on the emergence of generative AI and are experimenting with ways this new technology can be misused to cause harm.  

“Earlier this month the UN-backed Tech Against Terrorism reported that it had identified users of an Islamic State forum comparing the attributes of Google’s Gemini, ChatGPT, and Microsoft’s Copilot.

“The tech companies that provide these services have a responsibility to ensure that these features and their services cannot be exploited to perpetrate such harm and that’s why we are sending these notices to get a look under the hood at what they are and are not doing.”


Prime Minister Anthony Albanese said the companies had a social responsibility to stop the spread of the stabbing footage, after a week filled with trauma and anger following the murders at Westfield Bondi Junction.

A journalist asked Prime Minister Anthony Albanese whether stricter regulation of social media firms was being considered in the wake of the shocking vision that circulated widely over the weekend.

Mr Albanese said: “Well, we've taken strong action already. We quadrupled funding for the eSafety Commissioner. I say this, media companies, including social media companies, have a responsibility to act. It shouldn't need the eSafety Commissioner to intervene, to direct companies, in this case X and Meta, to take down violent videos that show people who have lost their lives as a result of what occurred with the perpetrator committing that atrocity on Saturday. The fact that that was circulated is something that had a real detrimental impact.”

“The Minister for Communications has directed those companies to take down that footage. I also make this point that the police made last Saturday, which is that for people who had video footage of last Saturday, their first thought should not have been to post it online. Their first thought should have been to forward it to police to assist their investigations. We all have, because social media makes all of us publishers of content, we all have a responsibility. But the social media companies that make a lot of money out of their business have a social responsibility. And I want to see social media companies start to understand their social responsibility that they have to others as well, because that's where they get their social licence.”

According to a recent OECD report, Telegram is the number one ranked mainstream platform when it comes to the prevalence of terrorist and violent extremist material, with Google’s YouTube ranked second and Twitter/X coming in third. The Meta-owned Facebook and Instagram round out the top five, placing fourth and fifth respectively.  

WhatsApp is ranked eighth, while the Buffalo shooter’s ‘manifesto’ reportedly cited Reddit as a service that played a role in his radicalisation towards violent white supremacist extremism.  

“It’s no coincidence we have chosen these companies to send notices to as there is evidence that their services are exploited by terrorists and violent extremists. We want to know why this is and what they are doing to tackle the issue,” Ms Inman Grant said.

“Transparency and accountability are essential for ensuring the online industry is meeting the community’s expectations by protecting their users from these harms. Also, understanding proactive steps being taken by platforms to effectively combat TVEC [terrorist and violent extremist content] is in the public and national interest.  

“That’s why transparency is a key pillar of the Global Internet Forum to Counter Terrorism and the Christchurch Call, global initiatives that many of these companies are signed up to. And yet we do not know the answer to many of these basic questions.    

“And, disappointingly, none of these companies have chosen to provide this information through the existing voluntary framework – developed in conjunction with industry – provided by the OECD. This shows why regulation, and mandatory notices, are needed to truly understand the true scope of challenges, and opportunities.”

As part of these notices, eSafety will also be asking Telegram and Reddit about measures they have in place to detect and remove child sexual exploitation and abuse.  

The six companies will have 49 days to provide responses to the eSafety Commissioner.

Got a News Tip?

Contact our editor via encrypted Proton Mail, X direct message, LinkedIn, or email. You can also message him securely on Signal using his username, Miko Santos.

More on The Evening Post AU

  • Get Evening Post Wrap - for nightly bite-sized news around Australia and the world.

  • Podwires Daily - for news about audio trends and podcasts.

  • Podwires Asia - for reporting on podcasting and audio trends in South East Asia.

  • There’s a Glitch - for updates on tech news and scam and fraud trends.

  • The Freeman Chronicle Podcast - features expert interviews on current political and social issues in Australia and worldwide.

  • That Podcast Exchange - insightful conversations with people at the top of their game, deconstructing the tools, tactics, and tricks that can help you achieve your goals as a podcast manager.