A New Definition of Terrorist Organizations

Facebook unveiled a series of changes to limit hate speech and extremism on its site, as scrutiny grows over how the social network may be radicalizing people.

The company began its announcements by saying it would expand its definition of terrorist organizations, adding that it planned to deploy artificial intelligence to better spot and block live videos of shootings. Hours later, in a letter to the chairman of a House panel, Facebook said it would block links from the fringe sites 8chan and 4chan from being posted on its platform. Later still, it detailed how it would build an oversight board of at least 11 members to review content decisions.

Facebook, based in Silicon Valley, announced the changes a day before the Senate Commerce Committee questioned the company, Google and Twitter on Capitol Hill about how they handle violent content. Online extremism has become an increasingly heated subject among lawmakers, with the House Judiciary Committee holding a hearing in April on the rise of white nationalism and the role tech platforms have played in spreading hate speech. A bipartisan group of lawmakers also sent a letter to Twitter, Facebook and YouTube about the presence of international terrorist organizations on the sites and how those groups foment hate.

Facebook in particular has been under intense pressure to limit the spread of hateful messages, pictures and videos through its site and apps. As the world’s largest social network, with more than two billion users, and the owner of the photo-sharing site Instagram and the messaging service WhatsApp, Facebook has the scale and reach for violent content to spread quickly and globally.