In a report released on Thursday, U.K. lawmakers accused social media firms including Facebook, Twitter and YouTube of “consciously failing” to stop their sites from being used to promote terrorism and recruit extremists.
The Commons home affairs select committee, which is made up of British members of parliament (MPs), urged the technology giants to do more to remove extremist content and claimed that U.S. platforms have become the “vehicle of choice in spreading propaganda.”
“These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands,” the report said.
“If they continue to fail to tackle this issue and allow their platforms to become the ‘Wild West’ of the internet, then it will erode their reputation as responsible operators,” the report added.
The lawmakers’ accusations come after a number of unsuccessful attempts by British authorities to get Twitter posts and YouTube videos by radical Muslim preacher Anjem Choudary taken offline. Choudary was found guilty by a U.K. court last week of supporting Islamic State.
Social media companies are making moves to fight extremist material. A Twitter spokesperson pointed out that the company had suspended 235,000 accounts related to the promotion of terrorism since February.
Google told the MPs that it has a “trusted flagger” program that lets approved users highlight content they have concerns about, which YouTube staff then review.
Google claimed a 90 percent accuracy rate for trusted flaggers, the report said. Although Facebook and Twitter did not have similar schemes, the two companies reportedly told MPs that they “did have arrangements with government agencies,” according to the report.
“We take our role in combatting the spread of extremist material very seriously. We remove content that incites violence, terminate accounts run by terrorist organisations and respond to legal requests to remove content that breaks UK law. We’ll continue to work with Government and law enforcement authorities to explore what more can be done to tackle radicalisation,” a YouTube spokesperson told CNBC in an email.
The social network deals “swiftly and robustly” with reports of terrorism-related content, said Simon Milner, director of policy for Facebook in the U.K.
“In the rare instances that we identify accounts or material as terrorist, we’ll also look for and remove relevant associated accounts and content,” Milner said.
“Online extremism can only be tackled with a strong partnership between policymakers, civil society, academia and companies. For years we have been working closely with experts to support counter speech initiatives, encouraging people to use Facebook and other online platforms to condemn terrorist activity and to offer moderate voices in response to extremist ones,” Milner said.
However, the lawmakers said the companies’ methods of rooting out extremist content are insufficient.
“It is therefore alarming that these companies have teams of only a few hundred employees to monitor networks of billions of accounts and that Twitter does not even proactively report extremist content to law enforcement agencies,” MPs said.
(Adapted from CNBC)
Categories: Regulations & Legal