Facebook & Friends: UK report reveals how social media is used to undermine global democracy
Facebook has come under fire in the US and the UK for allowing the manipulation of its platform by shady political interests intent on fomenting anti-immigrant sentiment, racism and intolerance. On Tuesday, Facebook disclosed that it had identified a campaign aimed at influencing the US midterm elections.
Reporting on the “jolting” revelation by Facebook about the midterm manipulation, the New York Times said that the social media platform had not “definitively” linked the campaign to Russia but had found that some of the tools and techniques used by the accounts “were similar to those used by the Internet Research Agency, the Kremlin-linked group that was at the centre of an indictment this year alleging interference in the 2016 presidential election”. The disclosure was made to US lawmakers during private briefings on Capitol Hill and in a public Facebook post.

Meanwhile, a UK House of Commons Digital, Culture, Media and Sport committee (DCMS) interim report on an inquiry into disinformation and fake news prior to the Brexit referendum in 2016 has warned that the targeting of “hyper-partisan” views to instil fear or fuel prejudice in voters “places democracy at risk”. The report also revealed evidence that Russia had manipulated social media in 48 countries in 2017.

In April 2018, the committee heard that Facebook had carried about 15,000 posts containing “hate speech” prior to the political violence that swept through Myanmar in 2016/17, forcing around 650,000 Rohingya Muslim refugees to flee to Bangladesh.

In its 89-page “Disinformation and ‘fake news’: Interim Report”, released at the weekend, the DCMS described the “global phenomenon of foreign countries wanting to influence public opinion through disinformation” as “an active threat”. The 18-month inquiry held 20 oral evidence sessions, including two informal background sessions, heard from 61 witnesses and asked over 3,500 questions. It received over 150 written submissions and numerous pieces of background evidence, as well as undertaking substantial exchanges of correspondence with organisations and individuals.
“A report from the University of Oxford published in July 2018 identified evidence of formally organised social media manipulation campaigns in 48 countries, up from 28 countries last year.”

This evidence had led the committee “to the role of Russia specifically, in supporting organisations that create and disseminate disinformation, false and hyper-partisan content, with the purpose of undermining public confidence and of destabilising democratic states”, said the report.

The committee had also heard evidence of “a co-ordinated, long-standing campaign by the Russian Government to influence UK elections and referenda, and similar evidence of foreign interference is being investigated by the US Congress in respect of the 2016 US Presidential Election.

“Thanks to these hearings we know that, during the Presidential Election, the Russians ran over 3,000 adverts on Facebook and Instagram to promote 120 Facebook pages in a campaign that reached 126 million Americans. In further evidence from Facebook given to our Committee, we know that the Russians used sophisticated targeting techniques and created customised audiences to amplify extreme voices in the campaign, particularly those on sensitive topics such as race relations and immigration.”

Facebook, according to Tuesday’s New York Times report, stated that it had discovered activity “around issues like a sequel to last year’s deadly ‘Unite the Right’ white supremacist rally in Charlottesville, Va.
Activity was also detected around #AbolishICE, a left-wing campaign on social media that seeks to end the Immigration and Customs Enforcement agency.”

“At this point in our investigation, we do not have enough technical evidence to state definitively who is behind it,” Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said, adding: “But we can say that these accounts engaged in some similar activity and have connected with known I.R.A accounts.”

The UK inquiry described “disinformation” as “unconventional warfare, using technology to disrupt, to magnify, and to distort.

“According to research from 89up, the communications agency, Russia Today (RT) and Sputnik published 261 media articles on the EU Referendum, with an anti-EU sentiment, between 1 January 2016 and 23 June 2016. Their report also showed that RT and Sputnik had more reach on Twitter for anti-EU content than either Vote Leave or Leave.EU during the Referendum campaign. A joint research project by Swansea University and the University of California, Berkeley, also identified 156,252 Russian accounts tweeting about #Brexit; these accounts posted over 45,000 Brexit messages in the last 48 hours of the campaign.”

Committee members said that they had first learned of the enormity of the issue from Clint Watts, a senior fellow at the Center for Cyber and Homeland Security at George Washington University, while visiting New York in February 2018.

Bill Browder, CEO and co-founder of Hermitage Capital Management, had told the committee that “the purpose of Russian disinformation and Russian propaganda is to plant a seed of doubt in everybody’s mind. If they can create that kind of confusion, they have accomplished their objectives”.

Edward Lucas, writer and security-policy expert, informed the committee that while Russia’s population was about one-seventh that of the West, “its GDP is about one-fourteenth. But it still has the capacity to do us harm.
It poses a military threat in the Baltic states, where geography and Nato’s weaknesses make it hard to muster a strong conventional defence. It has a proven ability to confuse, distract and distort decision-making, both by targeted attacks on elites, and exerting broader influence on public opinion.”

Jeff Silvester, Chief Operating Officer of AggregateIQ (AIQ), a Canadian marketing firm with links to Cambridge Analytica, confirmed to the UK inquiry that there had been an 80% overlap in the audiences used in campaigns run by both SCL (the British parent of Cambridge Analytica) and by AIQ. Silvester also confirmed that during the US presidential primaries, AIQ had advertised using specific custom audiences built from names and email addresses.

Aleksandr “Dr Spectre” Kogan, the Moldovan-born data scientist who developed the app that allowed Cambridge Analytica to collect the personal data of up to 87 million Facebook users, told the committee that he had worked at the University of St Petersburg, Russia, in the summer of 2013.

“As a result of that initial work, Dr Kogan was involved in a research group at the same university, studying the issue of cyber-bullying, between 2014 and 2016. Dr Kogan carried out this work at the same time as he was working with Cambridge Analytica. When asked about the financing of the research group, Dr Kogan told us he thought that the Russian Government gave a block grant to the university,” said the interim report.

Since Kogan gave evidence to the inquiry, the UK’s Information Commissioner, Elizabeth Denham, has been investigating him. Denham and her deputy, James Dipple-Johnstone, recently met with law enforcement agencies in the US. Dipple-Johnstone confirmed that “some of the systems linked to the investigation were accessed from IP addresses that resolve to Russia and other areas of the CIS (Commonwealth of Independent States)”.
The committee noted that it was “of concern that people in Russia could have benefited from the work that Dr Kogan carried out in the UK, in connection with his work for Cambridge Analytica. We look forward to reading the ICO’s findings on this issue in due course.”

With regard to Facebook, the committee said that it had attempted, from October 2017 to June 2018, to gain information on the extent of Russian interference in UK political campaigns. “Time and again, Facebook chose to avoid answering our written and oral questions, to the point of obfuscation. Facebook finally agreed, in January 2018, to expand its US investigation into alleged Russian interference in the EU Referendum. However, it downplayed the extent of the problem, and told us that the St Petersburg-based Internet Research Agency (IRA) had bought only three adverts, for $0.97, in the days before the Brexit vote. This did not include unpaid posts, and Facebook did not broaden its investigation beyond those IRA ‘troll farms’ identified during the US presidential election investigation.”

According to evidence that Facebook had submitted to Congress, and later released publicly, Russian anti-immigrant adverts had been placed in October 2015 targeting the UK, as well as Germany and France. “These amounted to 5,514.85 roubles (around £66). We asked Facebook to confirm the total amount of political advertising paid for by Russian agencies targeting Facebook users in the UK from October 2015 to date.” Facebook replied in June 2018 that it had not found “any systematic targeting of the UK by the IRA in the Referendum period (15 April to 23 June 2016), only the minimal activity we reported to the committee already”.
When the committee met with Facebook in Washington DC, Simon Milner, Facebook’s then Policy Director for the UK, Middle East and Africa, said: “Unlike the US election, we have still not been furnished with any intelligence reports from the UK authorities to suggest that there was direct Russian interference using Facebook in the Brexit Referendum.” However, it was pressure from the US Senate, not intelligence from US authorities, that forced Facebook to conduct research into the US election, said the report.

“We deem Mr Milner’s comments to the Committee to have been disingenuous and typical of Facebook’s handling of our questions. There has been a continual reluctance on the part of Facebook to conduct its own research on whether its organisation has been used by Russia to influence others. Facebook knows its system better than anyone else, and should not be passively reacting to outside concerns before carrying out its own research and taking action.”

In January 2018, British Prime Minister Theresa May announced the establishment of a dedicated national security communications unit “to be charged with combating fake news and disinformation by state actors and by others”.

The recommendations in the report include a demand for Facebook founder Mark Zuckerberg to appear before the committee; full auditing and scrutiny of tech companies, including their security mechanisms and algorithms; and new powers for the UK’s Electoral Commission. The report also recommends an overhaul of current legislation governing political adverts during elections and calls for the strengthening of the Information Commissioner’s Office. It suggests a levy on companies operating in the UK, as well as recommending a ban on micro-targeted political advertising.

“In this rapidly changing digital world, our existing legal framework is no longer fit for purpose,” said the authors of the report.
The report quotes Tristan Harris, co-founder and executive director at the Center for Humane Technology – “an organisation seeking to realign technology with the best interests of its users” – who informed the inquiry about “the many billions of people who interact with social media”.

According to Harris: “There are more than 2 billion people who use Facebook, which is about the number of conventional followers of Christianity. There are about 1.8 billion users of YouTube, which is about the number of conventional followers of Islam. People check their phones about 150 times a day in the developed world.” That equates to roughly once every 6.4 minutes in a 16-hour waking day.

“This is a profound change in the way in which we access information and news, one which has occurred without conscious appreciation by most of us,” said the report.

Since the inquiry commenced in January 2017, “the Electoral Commission has reported on serious breaches by Vote Leave and other campaign groups during the 2016 EU Referendum; the Information Commissioner’s Office has found serious data breaches by Facebook and Cambridge Analytica, amongst others; the Department for Digital, Culture, Media and Sport (DDCMS) has launched the Cairncross Review into press sustainability in the digital age; and, following a Green Paper in May 2018, the government has announced its intention to publish a White Paper later this year into making the internet and social media safer.”

The report also calls for a full investigation into millionaire Bristol businessman Arron Banks, who bankrolled Leave.EU, as well as his relationship and dealings with Russian officials. The committee also called for an investigation into the source of Banks’ donations, which are believed to have been derived from diamond mines in South Africa and Lesotho. DM