
AI-powered Manufactured Realities In The Context Of US Elections – IMPRI Impact And Policy Research Institute


Satyam Tripathi

In the digital age, social media platforms have become fertile ground for foreign interference in democratic processes. By exploiting vulnerabilities in online platforms and leveraging the influence of social media influencers, malicious actors can manipulate public opinion, spread disinformation, and undermine democratic institutions.

The rise of Artificial Intelligence is acting as a catalyst in the realm of manufactured realities, particularly within political campaigns. This phenomenon is evident not only among rival domestic parties in the United States but is also being exploited by foreign state and non-state actors, undermining the electoral process of a sovereign state. Penetration of society through digital media effectively renders the traditional borders of an independent country redundant.

Digital Media and US Elections

When it comes to social media usage, 78 percent of US adults in the 18-29 year bracket use Instagram, and 74 percent of this group use at least five of the social media apps asked about in the Pew Research Center survey[i]. This age group also comprises new and first-time voters, making it more susceptible to political mobilization and manipulation.

A notable attempt to manipulate an election using information technology occurred during the 2016 U.S. elections[ii], when Russian hackers targeted Hillary Clinton’s campaign. Russian military intelligence agents used spoofed Google security notifications to hack into volunteers’ inboxes, extracting thousands of emails. The release of these emails in the election’s lead-up generated negative news cycles for Clinton and distracted voters. The indictment that followed the 2016 US elections revealed a larger information warfare effort[iii]: infecting the electorate with propagandist, divisive content and reaching targeted groups to sway the results.

Generative AI tools can create realistic-looking fake images, videos, and text, making it difficult to distinguish genuine from fabricated information. AI algorithms can target specific demographics with tailored propaganda, spreading misinformation and amplifying divisive narratives. For example, Trump recently posted AI-generated images of women wearing t-shirts bearing “Swifties for Trump” in an attempt to exert mass psychological influence on pop star Taylor Swift’s fans in his favor.

Social media influencers are being used to reach wider audiences in a manner that, aided by AI, insinuates narratives among their followers. In early 2024, RT executives covertly recruited American influencers to support their malign influence campaign, using a front company to hide their involvement[iv]. The US Department of Justice recently seized 32 internet domains linked to Russia. According to the US attorney general, Russia is using state media to turn unwitting American influencers into conduits for spreading propaganda targeting American consumers of social media content, with the motive of eliciting a desired response or behavior through subconscious psychological influencing.

AI & Societal Behavioral Influencing

The foundation of AI is machine learning: the process of training a system, or computer algorithms, to learn and recognize combinations and patterns in the data fed to it, performing functions and yielding results through experience rather than through a fixed set of instructions written by humans. Central to the process is a feedback mechanism based on reinforcement: a positive or desired outcome is reinforced by the algorithm, while an undesired or negative outcome is discouraged, much like trial and error. This allows the algorithms to map patterns over time by feeding on data, becoming more accurate through reinforcement learning.
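This trial-and-error loop can be sketched with a classic toy example, the epsilon-greedy multi-armed bandit (purely illustrative; the action values and parameters below are invented, and this is not any specific platform's system): the algorithm learns which action yields the most reward from reinforcement alone, with no hand-coded rules.

```python
import random

def epsilon_greedy_bandit(true_rewards, steps=10_000, epsilon=0.1, seed=0):
    """Learn each action's value purely from noisy reward feedback."""
    rng = random.Random(seed)
    n = len(true_rewards)
    estimates = [0.0] * n   # learned value of each action
    counts = [0] * n
    for _ in range(steps):
        # Explore occasionally; otherwise exploit the best-known action.
        if rng.random() < epsilon:
            action = rng.randrange(n)
        else:
            action = max(range(n), key=lambda i: estimates[i])
        # Noisy reward plays the role of positive/negative reinforcement.
        reward = true_rewards[action] + rng.gauss(0, 0.1)
        counts[action] += 1
        # Update the running-mean estimate for the chosen action.
        estimates[action] += (reward - estimates[action]) / counts[action]
    return estimates

est = epsilon_greedy_bandit([0.2, 0.5, 0.9])
best = max(range(3), key=lambda i: est[i])
```

After enough steps the estimates converge toward the true reward of each action, so the system "discovers" the most rewarding behavior without ever being told which one it is — the computational analogue of shaping behavior through reinforcement.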

Both machine learning and B.F. Skinner’s theory of operant conditioning share the common goal of shaping behavior through the application of rewards and punishments. While machine learning algorithms learn and adapt autonomously, Skinner’s principles involve deliberate manipulation of the environment to shape the behavior of organisms, particularly humans.

In Walden Two, written in 1945 during World War II, Skinner for the first time conceived of a utopian society based on the principles of reinforcement[v]. He believed that by understanding the science of the causes of behavior, we can not only predict but also manipulate and, to an extent, even control human behavior[vi]. His theory of operant conditioning was based on providing positive or negative reinforcement in response to a test subject’s behavior, yielding a desired behavior over time. Behavioral engineering, consequently, was Skinner’s attempt to apply operant conditioning techniques to human affairs.

While Skinner’s behavioral engineering sought to shape individual behavior through direct manipulation, Shoshana Zuboff’s concept of instrumentarianism reveals a more insidious form of control – the mass manipulation of behavior through the surveillance and exploitation of personal data. Zuboff, a professor at Harvard Business School, proposed the term ‘instrumentarianism’ in her groundbreaking 2019 work, The Age of Surveillance Capitalism. The ability of an entity to commodify and feed on individual experiences in the form of extracted data makes it an instrumentarian power, one that uses economic reasoning to process these aggregated individual data for predictive, targeted advertising[vii].

According to Zuboff, the ability to produce behavioral data, and thereby modify behavior, is achieved through analytics and techniques such as digital nudges and behavioral engineering, resulting in the rise of instrumentarianism. These intelligent systems, trained on vast datasets of human behavior, can identify and profile patterns, preferences, and vulnerabilities. Armed with this knowledge, AI can design digital environments that subtly influence our choices, often without our conscious awareness. AI-driven systems can model user preferences with striking accuracy, raising concerns about the potential manipulation of individual behavior.
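The feedback loop behind such profiling can be made concrete with a deliberately simplified toy model (the topic names, click probabilities, and greedy rule are all illustrative assumptions, not any real platform’s algorithm): a system that recommends whatever content has historically produced the most engagement builds a behavioral profile of the user and then concentrates their exposure accordingly.

```python
import random

def engagement_loop(user_bias, topics, rounds=1000, seed=0):
    """Toy feedback loop: recommend the topic with the highest
    observed click-through rate, then update the user's profile."""
    rng = random.Random(seed)
    clicks = {t: 1 for t in topics}  # engagement profile (uniform start)
    shown = {t: 1 for t in topics}
    for _ in range(rounds):
        # Greedily recommend the topic with the best empirical click rate.
        topic = max(topics, key=lambda t: clicks[t] / shown[t])
        shown[topic] += 1
        # The user clicks with a probability reflecting their own bias.
        if rng.random() < user_bias[topic]:
            clicks[topic] += 1
    return shown

# Hypothetical user slightly more receptive to "divisive" content.
exposure = engagement_loop({"divisive": 0.6, "neutral": 0.5},
                           ["divisive", "neutral"])
```

Even this crude loop illustrates the mechanism Zuboff describes: the system never asks what the user wants, it only measures what the user does, and each recommendation both exploits and reinforces the inferred preference.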

The result is the emergence of an instrumentarian society, galvanized by AI technologies, in which, according to Zuboff, “people are herded as machines, based on social confluence, in which group pressure and computational certainty replace politics and democracy.”

PsyOps and Compromised Sovereignty

Skinner’s utopian vision, once a theoretical concept during World War II, now finds practical application in modern psychological operations (PsyOps). This doctrine, employed by militaries worldwide, aims to influence the behavior of target populations, often by manipulating their perceptions and beliefs.

The evolution from aerial leaflet drops, a targeted wartime tactic, to a constant barrage of AI-enabled information and disinformation reflects a new era of low-intensity conflict, in which the lines between civilian and combatant are increasingly blurred and traditional notions of war and peace are challenged. The battleground has been brought home, augmented by the rise of digital media penetrating society and the subconscious insinuation of narratives.

These narratives and propaganda spill over to the extent of undermining the world’s oldest democracy and its sovereignty by manipulating public opinion in elections. AI-based voter analytics tools can be used to distort the democratic mandate of the people, pushing agendas that advance external state or non-state actors’ interests in shaping and influencing voting behavior.

This is evident from Hamas’s recent engineering of public perception against the state of Israel on the global stage, which has only intensified since the October 7 attack. Shaping public perception against Israel has led to a rise in antisemitic tendencies, as well as to portraying Hamas and regional non-state actors such as Hezbollah as legitimate political entities and freedom fighters[viii].

A Chinese Communist Party network called Spamouflage is weaponizing the Israel-Gaza war narrative, using generative AI technologies to shape global opinion against the US and portray it as a war-driven state that uses taxpayers’ money to fund and provoke other countries into warfare[ix].

This situation has significant repercussions for audiences in Arab states as well as within U.S. political circles. According to a Pew Research Center[x] survey conducted in March 2024, 43 percent of Democrats and Democratic leaners opposed U.S. military aid to Israel. By September 2024, this percentage had risen to 53 percent, as reported by Statista[xi].

There has also been a notable shift in opinions among Republicans and Republican leaners. In the March 2024 Pew Research survey, about 50 percent of Republicans supported military aid to Israel. However, in Statista’s September 2024 survey, only about 32 percent favored maintaining the same level of military aid. 

In May 2024, President Biden drew attention to growing antisemitism at a Holocaust remembrance ceremony[xii], while former President Trump called out antisemitic propaganda in colleges and universities[xiii], and Republicans went on to accuse Kamala Harris of embracing antisemitic protestors. Such a significant shift in a country’s opinions through synthetic means and manufactured realities signals a backsliding of democratic tendencies.

The rapidity and effectiveness of digital influence attacks are concerning due to their broader implications for social cohesion and for the political sovereignty of independent nations, as in interference in US elections, ultimately posing a compound security threat[xiv] to the nation-state with wider geopolitical repercussions.

Conclusion

The case of the world’s oldest democracy and its election process becoming a playground for manufactured realities presents a disturbing image in itself, with wider ripple effects for the world. Liberty gave people freedom of expression, but no orchestrated mechanism exists to prevent the artificial manipulation of that expression through individual perception, a vulnerability exploited by external state and non-state actors as easily and swiftly as creating an Instagram post or reel.

AI will surely be a disruptive, if not destructive, force in the times to come. The most challenging aspect of these social media-based PsyOps is the blurred distinction between soldiers and civilians. There is a need for counter-narratives and the promotion of agendas that bolster national interests.

In conclusion, the convergence of AI and social media has created a powerful tool for manipulating public opinion and undermining democratic processes. The ability to generate hyperrealistic content, target specific demographics, and spread disinformation at an unprecedented scale poses a significant threat to the integrity of elections and the stability of democratic societies.

As AI continues to evolve, it is imperative to develop robust countermeasures to mitigate the risks associated with its misuse. This includes enhancing digital and media literacy and investing in AI-powered tools to detect and debunk misinformation. Additionally, policymakers must work with technology companies to establish ethical guidelines and regulations for the development of algorithms and the deployment of AI technologies, and to create mechanisms that prevent human biases from being translated into these systems.

The future of democracy in this digital age and the emergence of dual-use technologies hinges on our ability to adapt to the evolving landscape of digital warfare. By fostering critical thinking, promoting transparency, and strengthening democratic institutions, we can safeguard our societies from the insidious influence of manufactured realities.


[i] Gottfried, J. (2024, January 31). Americans’ Social Media Use. Pew Research Center. https://www.pewresearch.org/internet/2024/01/31/americans-social-media-use/

[ii] Abrams, A. (2019, April 18). Here’s What We Know So Far About Russia’s 2016 Meddling. TIME. https://time.com/5565991/russia-influence-2016-election/

[iii] Abrams, A. (2019, April 18). Here’s What We Know So Far About Russia’s 2016 Meddling. TIME. https://time.com/5565991/russia-influence-2016-election/

[iv] Treasury Takes Action as Part of a U.S. Government Response to Russia’s Foreign Malign Influence Operations. (2024, September 4). U.S. Department of the Treasury. https://home.treasury.gov/news/press-releases/jy2559

[v] Capshew, J. H. (1993). Engineering Behavior: Project Pigeon, World War II, and the Conditioning of B. F. Skinner. Technology and Culture, 34(4), 835–857. https://doi.org/10.2307/3106417

[vi] Skinner, B. F. (n.d.). Science and Human Behavior.

[vii] Weiskopf, R. (2020). Book Review: The Age of Surveillance Capitalism. The Fight for a Human Future at the New Frontier of Power. Organization, 27(6), 975–978. https://doi.org/10.1177/1350508419842708

[viii] Perception Warfare as Both Threat and Opportunity in Israel’s Post-October 7 Existential War. (n.d.). Jerusalem Center for Security and Foreign Affairs. Retrieved October 31, 2024, from https://jcpa.org/article/perception-warfare-as-both-a-threat-and-opportunity-in-israels-post-october-7-existential-war/

[ix] Pro-CCP network ‘Spamouflage’ weaponizes the Gaza conflict to spread anti-US sentiment. (n.d.). ISD. Retrieved October 31, 2024, from https://www.isdglobal.org/digital_dispatches/pro-ccp-network-spamouflage-weaponizes-gaza-conflict-to-spread-anti-us-sentiment/

[x] Silver, L., Alper, B. A., Keeter, S., Lippert, J., & Mohamed, B. (2024, March 21). Views of the U.S. role in the Israel-Hamas war. Pew Research Center. https://www.pewresearch.org/2024/03/21/views-of-the-u-s-role-in-the-israel-hamas-war/

[xi] Opinion Israeli military aid by party U.S. 2024. (n.d.). Statista. Retrieved October 31, 2024, from https://www.statista.com/statistics/1459435/opinion-israeli-military-aid-party-us/

[xii] Biden condemns ‘ferocious surge’ of antisemitism in US at Holocaust remembrance ceremony. (2024, May 8). Hindustan Times. https://www.hindustantimes.com/world-news/us-news/biden-condemns-ferocious-surge-of-antisemitism-in-us-at-holocaust-remembrance-ceremony-101715105639718.html

[xiii] Ulmer, A., & Ulmer, A. (2024, September 5). Trump says US colleges could lose accreditation over “antisemitic propaganda” if he’s elected. Reuters. https://www.reuters.com/world/us/trump-says-us-colleges-could-lose-accreditation-over-antisemitic-propaganda-if-2024-09-05/

[xiv] Forest, J. J. F. (2021). Political Warfare and Propaganda: An Introduction. CrimRxiv. https://doi.org/10.21428/cb6ab371.21e5b7e6