CANBERRA, Australia -- Australia's Parliament passed legislation on Thursday that could imprison social media executives if their platforms stream real violence such as the New Zealand mosque shootings.
Critics warn that the laws, among the most restrictive on online communication in the democratic world, could have unforeseen consequences, including media censorship and reduced investment in Australia.
The conservative government introduced the bills in response to the March 15 attacks in Christchurch in which an Australian white supremacist apparently used a helmet-mounted camera to broadcast live on Facebook as he shot worshippers in the two mosques.
Australia's government rushed the legislation through in the last two days that Parliament sits before elections expected in May, dispensing with the usual procedure of having a committee scrutinize its content first.
"Together we must act to ensure that perpetrators and their accomplices cannot leverage online platforms for the purpose of spreading their violent and extreme propaganda -- these platforms should not be weaponized for evil," Attorney General Christian Porter told Parliament while introducing the bill.
The opposition's spokesman on the attorney general portfolio, Mark Dreyfus, committed his centre-left Labor Party to support the bill despite misgivings. If Labor wins the election, the law would be reviewed by a parliamentary committee.
The law makes it a crime for social media platforms not to remove "abhorrent violent material" quickly. The crime is punishable by three years in prison and a fine of 10.5 million Australian dollars ($7.5 million), or 10% of the platform's annual turnover, whichever is larger.
Abhorrent violent material is defined as material depicting acts of terrorism, murder, attempted murder, torture, rape and kidnapping. The material must be recorded by the perpetrator or an accomplice for the law to apply. Platforms anywhere in the world face fines of up to AU$840,000 ($597,500) if they fail to notify the Australian Federal Police when they are aware their service is streaming "abhorrent violent conduct" occurring in Australia.
Dreyfus described the bill as "clumsy and flawed," and the timetable to pass it as "ridiculous." Labor first saw the legislation late Monday.
The bill could potentially undermine Australia's security co-operation with the United States by requiring U.S. internet providers to share content data with Australian Federal Police in breach of U.S. law, Dreyfus said.
The Digital Industry Group Inc. -- an association representing the digital industry in Australia including Facebook, Google and Twitter -- said taking down abhorrent content was a "highly complex problem" that required consultation with a range of experts, something the government had not done.
"This law, which was conceived and passed in five days without any meaningful consultation, does nothing to address hate speech, which was the fundamental motivation for the tragic Christchurch terrorist attacks," the group's managing director Sunita Bose said in a statement.
"This creates a strict internet intermediary liability regime that is out of step with the notice-and-takedown regimes in Europe and the United States, and is therefore bad for internet users as it encourages companies to proactively surveil the vast volumes of user-generated content being uploaded at any given minute," Bose added.
Arthur Moses, president of the Australian Law Council, the nation's top lawyers group, said the law could lead to media censorship and prevent whistleblowers from using social media to shine a light on atrocities because of social media companies' fear of prosecution.
"Media freedom and whistleblowing of atrocities here and overseas have been put at risk by the ill-informed livestream laws passed by the Federal Parliament," Moses said.
The penalties would be "bad for certainty and bad for business," which could scare off online business investment in Australia, Moses said.
Australian Industry Group chief executive Innes Willox, a leading business advocate, said more time was required to ensure the law did not unnecessarily impinge on existing fundamental media rights and freedoms.
Scott Farquhar, co-founder of the Sydney-based software company Atlassian, predicted job losses in the technology industry.
"As of today, any person working at any company (globally) that allows users to upload videos or images could go to jail," Farquhar tweeted. "Guilty until proven innocent."
Fergus Hanson, head of the International Cyber Policy Center at the Australian Strategic Policy Institute, saw problems in the legislation's definitions, including how long a company had to "expeditiously" remove offensive material.
The Christchurch massacre was livestreamed on Facebook for 17 minutes without interruption before the company reacted. Facebook said it removed 1.5 million videos of the shootings during the first 24 hours afterward.
The attack was filmed by Brenton Harrison Tarrant, 28, whose video and writings included anti-Muslim views and detailed how he planned the attack. Tarrant is scheduled to appear in court Friday and will face 50 murder and 38 attempted murder charges, according to New Zealand police.
Executives of Facebook, Google, Twitter, internet service providers and Australian phone companies met Prime Minister Scott Morrison and three ministers last week to discuss social media regulation. Communications Minister Mitch Fifield said Facebook "did not present any immediate solutions to the issues arising out of the horror that occurred in Christchurch."
Facebook did not immediately respond to a request for comment on Thursday. CEO Mark Zuckerberg used an op-ed in The Washington Post last week to invite a more active role by governments and regulators in dealing with harmful online content.
"The rules governing the internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people's lives," Zuckerberg wrote. "It's time to update these rules to define clear responsibilities for people, companies and governments going forward."
Morrison wants to take the Australian law to the Group of 20 forum as a model for holding social media companies to account.
New Zealand's Justice Minister Andrew Little said his government had also committed to reviewing the role of social media and the obligations of the companies that provide the platforms. He said he had asked officials to look at the effectiveness of current hate speech laws and whether there were gaps that needed to be filled.
Little said he saw no irony in the fact that people were watching hearings on a bill imposing new gun restrictions in real time on Facebook, the same platform the shooter used to broadcast the massacre.
"There's a world of difference, I think, between the exercise of a democratic function and a democratic institution like a national parliament, and some of the more toxic stuff that you see put out by individuals," he said.
----------
Associated Press writer Nick Perry in Wellington, New Zealand, contributed to this report.