The “Macron Leaks” operation

Introduction:

Since the turn of the 21st century, information has become a key resource in an increasingly digitalised environment, transforming the impact of non-state actors. Adversaries can use propaganda and disinformation to assault political will, manipulate public opinion and fragment democracies by eroding their socio-political institutions (Global Security Initiative 2022). Because citizens’ trust rests on well-functioning democratic institutions, it is necessary to diminish such threats. One increasingly threatening form of political fraud is electoral intervention, which has been made easier by the exponential growth of digital platforms and the correlative increased risk of information manipulation (Jean-Baptiste 2019: 1). This paper aims to provide a better understanding of how disinformation can be used to undermine political processes such as elections. By examining the disinformation campaign of the “Macron leaks” operation, it tackles the question: “How did Russian intelligence and American alt-right actors launch a disinformation campaign to undermine the 2017 French elections?” This will be accomplished by 1) providing an overview and definitions of the relevant theories and concepts, 2) explaining the case and its relevance, and 3) linking the theories and concepts to the case. By accumulating more knowledge about the processes threatening citizens’ trust in democracy and its institutions, we can design better responses and policies to tackle such threats. Developing a better understanding of the disinformation process and its impact on geopolitical events and human behaviour is therefore vital to maintaining security and safeguarding a flourishing democracy.


Theoretical Framework:

The aim of this paper is to accumulate knowledge on the “soft” side of cyber operations, which concerns the influence of information transmitted through cyberspace – targeting the psyche and the cognitive dimension of the information environment (Ducheine 2022). The paper zooms in on the disinformation aspect of influence operations, which is used to manipulate the cognitive dimension of a social psyche in a strategic way to achieve political objectives set by the involved actors. The theory will be applied to a specific case – the “Macron leaks” operation – to explain how Russian intelligence and American alt-right actors launched a disinformation campaign to change the outcome of the 2017 French elections by targeting the cognitive dimension of French citizens.

The underlying concept used in this paper is the “influence operation”, conceptualised as an organised attempt “to achieve a specific effect among a target audience. In such instances, a variety of actors—ranging from advertisers to activists to opportunists—employ a diverse set of tactics, techniques, and procedures to affect the decision making, beliefs, and opinions of a target audience” (Thomas et al. 2020). In this paper, the “tactics, techniques, and procedures” used to affect decision-making will be operationalised as “information manipulation” – “a theory which focuses on the creation of a deceptive message and on the motivations of the sender” (IGI Global 2022). The actors involved in the “Macron leaks” operation manipulated information to affect the voters’ behaviour and thereby aimed to alter the outcome of the 2017 French elections.

The influence operation can be further classified as “cognitive hacking”, part of a cyberattack “that seeks to manipulate the perception of people by exploiting their psychological vulnerabilities” (Wigmore 2022). The “Macron leaks” disinformation campaign was launched to manipulate the voters’ perception of Macron by targeting their psychological vulnerabilities.

To achieve the desired outcome via cognitive hacking, the actors involved in the “Macron leaks” operation launched a “disinformation process” aimed at influencing citizens “through intentionally crafted information to achieve political or strategic aims” (Ducheine 2022). This was the case in the “Macron leaks” operation, as its goals were political, intending to change the outcome of the elections. Furthermore, the process revolved around “disinformation”, defined as “false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth” (Merriam-Webster 2022). The false information can appear in the form of rumours, fake news, and forged documents. First, a “rumour” can be defined as “a story or piece of information that may or may not be true, but that people are talking about” (Collins 2022). Secondly, “fake news” refers to “false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke” (Cambridge Dictionary 2022). In the “Macron leaks” operation, the actors incorporated the first two of these three manoeuvres of disinformation to influence public opinion and consequently modify the voters’ behaviour.

The disinformation process starts with preparation. The preparation begins with an intent, which in this case was to undermine Macron’s candidacy through an information operation. This is followed by preparing a “narrative”, defined as the “cognitive lenses through which people view, interpret, and make meaning of the social world” (Ducheine 2022). In influence operations, the narrative is created strategically to undermine the understanding and decision-making of the targeted people (ibid). In the “Macron leaks” operation, a narrative built around the association of Islam with extremism was used to make voters perceive Macron through manipulated lenses that would discourage them from voting for him (Jean-Baptiste 2019: 6). The next step in the preparation is operationalising the narrative through “framing”. To frame is to “select some aspects of a perceived reality and make them more salient in a communicating text” (Pijpers 2022). In this case, the narrative was framed by selecting information that exposed Macron in a negative light and thus manipulated voters into not voting for him. The manipulation was further achieved by exploiting “heuristics”: “heuristic processing, although a constructive process, is based on the limited consideration of information” (Forgas 2001). The limited consideration of manipulated information enables the actors to invoke biased judgements and undermine the targets’ “ability to process” (Pijpers 2022).

Furthermore, the narrative can be operationalised for two purposes. Firstly, to fit “identity-grievance campaigns”, which operate on the following assumption: “The seeking and belief of information that matches one’s pre-existing beliefs/comes from an ideologically-aligned source” (Ducheine 2022). The narrative in these campaigns is thus framed to match the pre-existing beliefs of the targeted audience. In the case of the “Macron leaks”, the narrative was framed to portray Macron as an “aristocrat who despises the common man, a rich banker, a globalist puppet, a supporter of Islamic extremism and uncontrolled immigration” (Jean-Baptiste 2019: 6). Portraying Macron as anti-socialist and anti-nationalist fosters antipathy towards him among voters with socialist or nationalist beliefs.

Secondly, through “incidental exposure campaigns”, the actors increase a “foreign audience’s everyday incidental exposure to false or misleading information through international state-controlled broadcast media, online news websites, blogs, or social media” (Ducheine 2022). Such routine exposure to disinformation can result in unmotivated audiences accepting and internalising it (ibid). This process was used in the disinformation campaign: an anti-Macron tweet by Marion Maréchal-Le Pen was shared more than two hundred times in half an hour, including by presidential candidates Marine Le Pen (her aunt) and François Fillon (Jean-Baptiste 2019: 6). Such campaigns function by activating fast, automatic processing that relies on heuristic thinking, emotions and habits, where limited consideration of the given information can lead to its acceptance.

Moreover, if manipulated narratives are shared by many people, they can accumulate even more support through the “bandwagon effect”, a “psychological phenomenon in which people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override” (Investopedia 2020). In the “Macron leaks” operation, the actors created fake French identities and sock-puppet social-media accounts, and hijacked Twitter hashtags, social-media posts, and comments sections on news sites with memes attacking Macron (Jean-Baptiste 2019: 6).

Lastly, the actors carrying out the processes mentioned above can be labelled “internet trolls”, who “deliberately try to offend, cause trouble or directly attack people by posting derogatory comments on Facebook posts, blogs, under YouTube videos, on forums and other social media, such as Twitter and Instagram” (Endsleigh 2022). The trolls involved in the “Macron leaks” operation were active across digital platforms, pretending to have found scandalous stories that would discourage people from voting for Macron (Jean-Baptiste 2019: 28).

The phenomenon described above, in which a large number of messages are broadcast rapidly, repetitively, and continuously over multiple channels of news and social media without regard for truth or consistency, can be linked to “firehose of falsehood” propaganda (Kennedy 2020). Firehosing inundates the audience with such a plethora of wild claims that it becomes exhausting to continually disprove them, which was the aim of the actors involved in the “Macron leaks” disinformation campaign (ibid).


Research strategy: Case-study approach

The closeness of a case study to real-life situations and its considerable wealth of detail make it a suitable method for acquiring an in-depth understanding of a subject (Flyvbjerg 2006: 223). I therefore chose a case-study method focused on a specific context, as it allows me to generate a complex understanding of how a soft cyber operation plays out in a real-life situation. Analysing the cognitive dimension in a context-dependent setting allows me to produce detail-rich findings that can later be compared with other relevant cases.

The case study used in this paper is the disinformation campaign of the “Macron leaks” operation, which can be classified as an electoral interference attempt in the 2017 French presidential election. There was a coordinated effort to undermine Macron’s candidacy via a three-dimensional information operation: 1) a disinformation campaign involving rumours and fake news; 2) a hack targeting the computers of his campaign staff; 3) a leak of 15 GB of stolen data (Jean-Baptiste 2019: 3). This paper zooms in on the first dimension – the disinformation campaign – and analyses its processes by linking it to concepts and theories of cognitive cyber operations. It can nonetheless be noted that 2) the hack and 3) the leak added to the credibility of the incidental exposure campaigns by demonstrating the ability to deface the party and campaign.


Analysis & Synthesis:

The disinformation campaign against Macron began with rumours and insinuations, escalating in January and February 2017 (idem: 4). The timing was no coincidence: it was around the time Macron became a front-runner in the polls, as his biggest rival, François Fillon of the Republicans, was weakened by a political-financial scandal (ibid). As Macron surpassed him in the polls and appeared to have a viable shot at the presidency, he became the target of more aggressive and more organised attacks from two sources: Kremlin media (RT and Sputnik) and the American alt-right (ibid). The actions of these actors in the disinformation campaign are analysed in the following sections.


Russian intelligence

According to Mahjoubi, the digital manager of Macron’s campaign team, RT France and Sputnik were the first sources of fake news related to Macron (ibid). Sputnik is the property of Rossiya Segodnya, the Russian government’s news agency, whose objective is to “secure the national interests of the Russian Federation in the informational sphere” (ibid). In February 2017, the Atlantic Council’s Digital Forensic Research Lab analysed Sputnik France’s coverage and found a distinct bias against Macron, presenting him as a “US agent” (ibid). Three days after the Sputnik publications, Macron humorously denied the accusations (idem: 5). Sputnik’s reporting under the direction of the Russian government displays the implementation of an identity-grievance campaign: it explicitly targeted the audience with anti-Western ideology by associating Macron with the United States. Through this tactic, people who already held pro-Russian, anti-Western beliefs were more likely to believe such accusations about Macron, as they came from an ideologically aligned source – Russian-sponsored news. Since the information published by Sputnik was not accurate, it constituted fake news used for the strategic purpose of discouraging like-minded people from voting for Macron.

Fake news, however, was not the only tactic. RT and Sputnik also engaged in information manipulation: they expressed a strong bias, left out important information and hid behind selective quotations from frequently partisan figures that suited their viewpoint (ibid). The quotations were a strong tactic as they provided an illusion of truth and journalistic integrity (ibid). An example of this was their interview with Nicolas Dhuicq, a pro-Russian, right-wing French member of parliament. They used his narrative on Macron because it matched their ideological stance, then operationalised it into a frame by selecting the most sensational quotations. They thereby published unbalanced reports giving only one side of the story while leaving the other silent. Lastly, these actors also manipulated information by taking a photo out of context and linking it to another event (idem: 6). For example, Sputnik published a photo of a group of people wearing En Marche! T-shirts (ibid). However, these people were the En Marche! team in Moscow, not French press correspondents in Moscow. By using information manipulation as a tactic to alter the cognitive dimension of the voters, the actors were able to expose only their preferred narrative, making individuals more vulnerable to manipulation.


Summing up the role of the Russian intelligence: first, the actors used fake news, implemented through incidental exposure campaigns fuelled by the Russian government, to foster antipathy towards Macron among people of the same ideological stance, making them less prone to vote for him. Secondly, by manipulating information and presenting only the preferred frames, they increased the voters’ exposure to their narrative, making them more likely to absorb and believe it.


American alt-right

The Russian intelligence was not, however, the only player involved in the disinformation campaign; certain attacks against Macron originated from the West. The journalist Josh Harkinson referred to these actors as “Marine Le Pen’s ‘Foreign Legion’ of American Alt-Right Trolls” (ibid). Harkinson also stated that these trolls used fake French identities and sock-puppet social-media accounts, and hijacked Twitter hashtags, social-media posts, and comments sections on news sites with memes portraying Macron as a stooge of Jewish financiers who would sell out the working class and capitulate to Muslim terrorists (ibid). The purpose of these trolls was to prevent people from voting for Macron by flooding the internet with anti-Macron content. People exposed to this content can be influenced as follows: if the trolls use many accounts that people perceive as distinct identities, a bandwagon effect can emerge, whereby people believe something because many others appear to believe it and share it on digital platforms – regardless of their prior beliefs, which they may ignore or override.

Moreover, the identity-grievance campaigns observed with the Russian intelligence can also be noticed among the American alt-right actors. The information spread revolved around labelling Macron an aristocrat who despises the commoner, a wealthy banker, a globalist puppet, and a supporter of Islamic extremism and uncontrolled immigration (ibid). All these characteristics are associated with the left wing, meaning that the actors aimed to dissuade potential right-wing voters from voting for Macron by associating him with the opposing ideology. As the information came from ideologically aligned sources, right-wing voters were more likely to believe it and act accordingly.

Another example of disinformation built on the “Islam” narrative. It involved a March 2 article designed to appear as if it came from the Belgian newspaper Le Soir, headlined “Emmanuel Macron, Saudi Arabia’s preferred candidate in the French presidential election” (ibid). The article appeared on a cloned website that imitated the design and layout of Le Soir almost perfectly but used a different URL, lesoir.info instead of lesoir.be (ibid). It was circulated by Marion Maréchal-Le Pen, a parliamentarian and niece of Marine Le Pen, who indignantly asked: “30% of the Macron campaign financed by Saudi Arabia? We demand transparency!” (ibid). Her tweet was shared more than two hundred times in half an hour. The rapid spread of the tweet produced the effects of an incidental exposure campaign, as the targeted audience was intensely exposed to false information on social-media platforms. Incidental exposure, combined with fast processing and heuristic thinking, can lead voters to give the information only limited consideration, accept it, and hence be more likely to dislike Macron.

Lastly, there was the “#MacronGate” rumour. Two hours before the final televised debate between Macron and Le Pen, on Wednesday, May 3, at 7:00 p.m., a user with a Latvian IP address posted two fake documents on 4chan (idem: 9). The documents suggested that Macron had a company registered in Nevis, a small Caribbean island, and a secret offshore bank account at the First Caribbean Bank, based in the Cayman Islands (ibid). As a rumour spreads among people, it too can produce a bandwagon effect, making people more likely to believe it when a large number of others share it. That is why it was an effective tactic for the American alt-right actors, who used it to influence people’s decisions and manipulate them into not voting for Macron.

To summarise the role of the American alt-right actors: they developed the disinformation campaign by using trolls, ideologically framed narratives, and rumours to manipulate people’s cognition and avert them from voting for Macron.


Conclusion & Reflection:

By analysing the conduct of the actors involved in the disinformation campaign of the “Macron leaks” operation, we can conclude that their tactics and procedures took the shape of influence operations. These were aimed at the cognitive dimension of the targeted audience to undermine their decisions and avert them from voting for Macron. The influence operations relied on heuristics and fast processing, which enabled the use of the following tactics: fake news, rumours, incidental exposure and identity-grievance campaigns, all shaped around narratives framed to suit the actors’ interests and thereby undermine the 2017 French elections.

Considering that this research was based on a single case study with two actors involved, the findings may vary for different actors using a different range of methods; hence the transferability of this research may be limited. It can also be argued that these methods do not always accomplish what they are perceived to be efficient at: since the disinformation campaign failed, in the sense that the result of the election did not coincide with the attackers’ aim, the methods did not prove sufficiently effective.

Bibliography:

Cambridge Dictionary (2022). “fake news”, https://dictionary.cambridge.org/dictionary/english/fake-news. Consulted on May 7 2022.

Collins (2022). “rumour”, https://www.collinsdictionary.com/dictionary/english/rumour. Consulted on May 7 2022.

Ducheine, P. (2022). 5512DWTF6Y Digital Warfare: The Future of Cyber Threats in the Twenty-first Century. University of Amsterdam.

Endsleigh (2022). “What is internet trolling?”, https://www.endsleigh.co.uk/blog/post/what-is-internet-trolling/. Consulted on May 7 2022.

Forgas, J.P. (2001). “Social Cognition and Affect, Psychology of”, International Encyclopedia of the Social & Behavioral Sciences.

IGI Global (2022). “What is Information Manipulation Theory 2”, https://www.igi-global.com/dictionary/too-good-to-be-true/74696. Consulted on May 7 2022.

Investopedia (2020). “Bandwagon Effect”, https://www.investopedia.com/terms/b/bandwagon-effect.asp. Consulted on May 7 2022.

Jean-Baptiste, J.V. (2019). “The “Macron Leaks” Operation: A Post-Mortem”, Atlantic Council.

Kennedy, J. (2020). “The Firehose of Falsehood and How to Survive the Flood”, https://aninjusticemag.com/the-firehose-of-falsehood-and-how-to-survive-the-flood-2b291c5c2ffb. Consulted on May 8 2022.

Merriam-Webster (2022). “disinformation”, https://www.merriam-webster.com/dictionary/disinformation. Consulted on May 7 2022.

Pijpers, P. (2022). 5512DWTF6Y Digital Warfare: The Future of Cyber Threats in the Twenty-first Century. University of Amsterdam.

Thomas, E., Thompson, N. and Wanless, A. (2020). “The Challenges of Countering Influence Operations”, https://carnegieendowment.org/2020/06/10/challenges-of-countering-influence-operations-pub-82031. Consulted on May 7 2022.

Wigmore, I. (2022). “cognitive hacking”, https://www.techtarget.com/whatis/definition/cognitive-hacking. Consulted on May 7 2022.