Commission asks global online giants for more progress on transparency ahead of EU elections

Brussels, 1 March 2019 – With electoral campaigns for the European elections about to start, the EU Commission is raising concerns about fake news and misinformation related to the vote, particularly on online platforms.

“We urge Facebook, Google and Twitter to do more across all Member States to help ensure the integrity of the European Parliament elections in May 2019,” reads the statement issued by the Commission, which stresses that platforms must “strengthen their cooperation with fact-checkers and academic researchers to detect disinformation campaigns and make fact-checked content more visible and widespread”.

The global search engine Google and the two main social media platforms, Facebook and Twitter, are signatories of the ‘Code of Practice against disinformation’ and have to report monthly on their actions ahead of the European Parliament elections in May 2019.

But, so far, little progress has been made on the scrutiny of ad placements, the transparency of political advertising, the closure of fake accounts and marking systems for automated bots.

Vice-President for the Digital Single Market Andrus Ansip, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King, and Commissioner for the Digital Economy and Society Mariya Gabriel said in a joint statement:

“The online platforms, which signed the Code of Practice, are rolling out their policies in Europe to support the integrity of elections. This includes better scrutiny of advertisement placements, transparency tools for political advertising, and measures to identify and block inauthentic behaviour on their services. However, we need to see more progress on the commitments made by online platforms to fight disinformation. Platforms have not provided enough details showing that new policies and tools are being deployed in a timely manner and with sufficient resources across all EU Member States. The reports provide too little information on the actual results of the measures already taken”.

The main findings of the second monthly report are:

Facebook has not reported on the results of the activities undertaken in January with respect to the scrutiny of ad placements. It had earlier announced that a pan-EU archive for political and issue advertising would be available in March 2019. The report provides an update on cases of interference from third countries in EU Member States, but does not report on the number of fake accounts removed due to malicious activities targeting specifically the European Union.

Google provided data on actions taken during January to improve scrutiny of ad placements in the EU, divided per Member State. However, the metrics supplied are not specific enough and do not clarify the extent to which the actions were taken to address disinformation or for other reasons (e.g. misleading advertising). Google published a new policy for ‘election ads’ on 29 January, and will start publishing a Political Ads Transparency Report as soon as advertisers begin to run such ads. Google has not provided evidence of concrete implementation of its policies on integrity of services for the month of January.

Twitter did not provide any metrics on its commitments to improve the scrutiny of ad placements. On political ads transparency, contrary to what was announced in the implementation report in January, Twitter postponed the decision until the February report. On integrity of services, Twitter added five new account sets, comprising numerous accounts in third countries, to its Archive of Potential Foreign Operations, which are publicly available and searchable, but did not report on metrics to measure progress.

The five main goals of the Code are:

1) Disrupt advertising revenue for accounts and websites misrepresenting information and provide advertisers with adequate safety tools and information about websites purveying disinformation.

2) Enable public disclosure of political advertising and make efforts towards disclosing issue-based advertising.

3) Have a clear and publicly available policy on identity and online bots and take measures to close fake accounts.

4) Offer information and tools to help people make informed decisions, and facilitate access to diverse perspectives about topics of public interest, while giving prominence to reliable sources.

5) Provide privacy-compliant access to data to researchers to track and better understand the spread and impact of disinformation.

The Code of Practice is part of the Recommendation included in the election package announced by President Juncker in his 2018 State of the Union Address to ensure free, fair and secure European Parliament elections. The measures include the possibility of imposing sanctions for the illegal use of personal data to deliberately influence the outcome of the European elections.

Justine de Braeme