Facebook's Middle East operation is ramping up efforts to weed out extremist content and fake news disseminated by users, responding to global pressure on technology giants to better regulate their channels.

"For us, there's absolutely no space for terrorism on our platform," said Jonathan Labin, Facebook's managing director for the Middle East and North Africa. "It's something we're taking very, very seriously."

More than a year after the world's four biggest social media players – Facebook, Twitter, YouTube and Microsoft – signed up to a voluntary European Commission code of conduct pledging to remove 'hate speech' within 24 hours of a complaint, they are facing renewed pressure to identify and eliminate such content faster. In the UK, Prime Minister Theresa May has proposed a two-hour deadline for removing extremist content, telling the United Nations in a September speech that she had seen "too many innocent people murdered" by terrorists radicalised online.

Facebook has already announced plans this year to recruit 3,000 staff to review content – a portion of whom are to be based in its regional headquarters in Dubai. The company is also working to deploy more sophisticated algorithms and artificial intelligence (AI) to alert its teams to potentially harmful content.

"Our [stance against] extremist content is very clear in our community standards; it's very clear in everything we do and say. And when it comes to what we do, we do a lot," Mr Labin told <em>The National</em> as Facebook marked the opening of a new headquarters for the region in Dubai Knowledge Village, with an expanded capacity of around 150 staff, up from about 40 in the previous office.

"It's a mix of things – we use technology and people," he continued. "On the one hand, machine learning and AI play an important role, but it's also about human review of content.
"It's not something we'd say we're perfect on yet, but we're trying to tackle it through technology, through people and through cooperation with third parties, because these are challenges you can't figure out on your own. You need to work with civil society, policymakers and other tech companies to try to solve these issues."

Facebook has worked with global anti-extremism organisations, including Hedayah, a think tank funded by the UAE and the US, and has an expanding in-house policy team that specialises in countering violent extremism. In June, Richard Allan, Facebook's vice president of public policy for EMEA, said that "over the last two months, on average, we deleted around 66,000 posts reported as hate speech per week – that's around 288,000 posts a month globally". This figure includes posts that may have been reported for hate speech but deleted for other reasons, he added.

Cracking down on so-called 'fake news' is a newer battle for Facebook. "Again, it's a topic we're taking extremely seriously," said Mr Labin. "When we think about [our approach to it], we look at what we stand for as a company. One of the things we want to do is to help build a community. And one of the core pillars of the community we want to build is an informed community. So misinformation is against everything we stand for."

'Fake news', the buzz phrase popularised during last year's US presidential election, covers a wide spectrum of issues, ranging from inaccurate news stories and misleading advertising to false and dishonest information spread by people, companies or governments with a damaging agenda. Mr Labin said Facebook seeks to strike a balance between curbing the spread of fake news and supporting genuine debate and freedom of speech. "We try to look at what is really false news and false information, and what is opinion, as we want to protect different opinions," he said.
He detailed a three-point strategy to tackle false news: shutting down fake accounts, which he said are often responsible for disseminating misinformation; making it easier for Facebook users to report fake news, and monitoring their feedback; and limiting the reach of fake news sources to make it difficult for them to advertise on Facebook. "Often misinformation is not politically driven but monetarily driven, so we go after those monetary incentives to [misinform]," he explained.

Tackling fake news is also part of the worldwide Facebook Journalism Project, which launched in the UAE on October 10. The initiative will see Facebook partner with news and entertainment outlets to help them reach bigger audiences, monetise their work and protect journalists' profiles. On Tuesday, Facebook released global guidelines on how publishers can adapt to the company's efforts to fight fake news and other low-quality content.

<strong>Facebook hopes to bring immersive video service 'Watch' to MENA</strong>

Facebook intends to "move fast" in rolling out its immersive video service Watch in the Middle East following current trials in the US, according to Mr Labin. Watch is a new platform for viewing TV shows via Facebook; its roster includes live and recorded shows and events in which viewers can interact directly with the cast and presenters. Mr Labin told <em>The National</em> the company hopes to roll out the service globally, including in MENA, as soon as it is ready.

He also said Facebook plans to develop its Messenger for Business service in the Middle East to enable regional businesses to communicate with their customers via Messenger. Thousands of MENA businesses have already used the service – Mercedes-Benz customers, for example, have booked test drives or enquired about new models via the app.
Announcing the opening of the new regional HQ in Dubai on Thursday, Mr Labin said: "Across Saudi Arabia, the UAE, Egypt and many more countries in the region, we are seeing users, content creators and brands use Facebook in new ways. We are only 1 per cent finished in our journey here."