The boomerang-axis: An insight into the expropriation of personal data and democracy in the 2016 US general election

Photo by Andrew Neel on Unsplash

In 2016 the political and democratic world as we knew it once more turned on its axis. Leaving behind a familiar democratic age, a new politics of identity and distribution under the Trump administration entered the corridors of the White House, owing much of its success to the pervasive organisation of information. In this data-driven election, every post, search and status shared online across the US fed the techno-political matrix that paved the road to Trump's success.

Indeed, the organisation of our social media feeds, built upon illicit data hoarding by private companies, and the entrenchment of techno-monopoly classes compounded social immobility in the US election. In configuring a new techno-capitalist class, the election suggested that your personal data, yes yours, served as a live resource for surveillance capitalism to mine before you ever reached the ballot box.

For whilst we, in our Facebook, Apple and Instagram bubbles, generate the information that such companies exploit and monetise, we don't actually own it, or even realise its value for algorithmic policing. This was the small print, the “terms and conditions” that we didn’t read.

As the United States heads towards the next election, we must reconsider our psychological and shared personal relationship with what McKenzie Wark, in her book “Capital Is Dead”, describes as the ‘meta-technological’ industry, one which uses our information as a raw material, ‘designed to observe, measure, record, control and predict’ what we can, or should, do.

Photo by René DeAnda on Unsplash

Whilst ‘surveillance capitalism’ appears to anchor itself apart from classic economic modernity, my interpretation is that it is rather a hybrid of consumer psychology and digital information, given its two central methodologies: affective conditioning and data surveillance.

Allow me to explain: ‘there is a whole political economy that runs on asymmetries of information as a form of control’, using our data as the mantelpiece on which to place new models of political governance and privacy exploitation. Information has thus become a material for instrumental organisation, creating ‘predictive models which further subordinate all activity to the same information political economy’. Quite frankly: if you are getting your media for free, then you are the product.

And yet, we as consumers are drawn into the great information matrix, dutifully selling our personalities into a universal surplus of data, none the wiser to the hidden reality taking shape behind our very screens. Lured in by the offer of a seemingly ‘free service’, we hand over our personal details, life stories and very whereabouts to a faceless data-collecting machine.

Perhaps to best understand the vicissitudes of techno-capitalism, we should first observe the mechanisms of consumption, one of the main drivers of the techno-capitalist ethos. A fundamental concept here is what Marx himself called ‘commodity fetishism’, which describes the social character of “demand” as well as the material relations hidden within a commodity’s production. “A commodity appears at first sight an extremely obvious, trivial thing,” writes Marx in volume 1 of Das Kapital, “but its analysis brings out that it is a very strange thing, abounding in metaphysical subtleties and theological niceties.” These ‘theological niceties’ translate into intangible qualities of self-representation and social value, ‘metaphysical subtleties’ and hyperreal brilliances which latch onto our wants and pre-condition our sense of reality. In psychology, this is called ‘affective conditioning’: the objectification of our desires onto one singular commodity.

Whilst hidden data analysis is used to interpret our personalities, the same hedonism of online consumption, perhaps more than in any earlier industry, is as prevalent as ever. ‘A good deal of snake oil still goes into persuading ad buyers that advertisers have magical means of persuasion that will galvanise people’s attention, lodge the brand in memory, and mobilize people’s desires toward actually buying the product’. Same old, same old.

In branching out to online formats, the culture industry and commercial businesses thrive on personal data to pre-empt their gains and losses. With a much larger platform stretching into the cyber cosmos, our organised attention to glossy and expensive technology is not only conducted by information corporates and undercover artificial intelligence services but is crucial to the very existence of this techno-capitalist era. Ingrained into our conditioning, we are programmed to assume material forms as part of our cultural way of life; our dependencies on products, services, subscriptions and (not so) free trials are endless and incalculable.

But there is a dark side to such services: our personal data has become the undercover commodity gold-dust.

Photo by Luke Chesser on Unsplash

The techno industry reaches out to all of us at a time of its own choosing. Its alarming subconscious configurations grab our attention, so that we stop in our tracks, pause and unlock our phones instantly at the sound of an alert or notification bleep. On social media, we are led to believe that infinite cyber possibilities lie at our fingertips, or fingerprints, but in a parallel reversal of power, it is we who succumb to the tech industry’s calling.

In Netflix’s own documentary ‘The Great Hack’ (2019), the prevailing worry among postmodern technological thinkers isn’t only that corporates are expropriating our personal data, injecting it into the global marketplace as prey amongst corporate predators, but that they are ‘tracking our next move’.

To the ordinarily sound mind, this may well seem absurd, even freakishly dystopian: surely no one, least of all a computer, can predict our next move? After all, wouldn’t such a logic displace our belief in rational human autonomy? Yet despite this wholly sensible counter-contention, as Netflix’s documentary exposes, monitoring mechanisms, largely based on predictive algorithms, create the technological infrastructure in which it is possible to build a digital model of ourselves.

By monitoring the content we digitally interact with: the photos that we click on and comment on, and the seconds, yes seconds, that we spend viewing them, the condition of our psyche is recorded and soon materialises in the form of a cyber-portfolio. Whether we are taking a ‘personality quiz’ during our lunch hour or regularly following a trending hashtag, the fruits of digital content build a deep epistemology of our human senses, personalities and behaviours.
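To make the idea concrete, here is a minimal, purely hypothetical sketch of how such a ‘cyber-portfolio’ might be assembled from engagement signals. The event types, topics and weights are my own illustrative assumptions, not any platform’s actual system.

```python
# Purely illustrative sketch: folding engagement signals into a crude
# per-topic interest profile. All names and weights are hypothetical.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Interaction:
    topic: str            # e.g. "factory closures", "celebrity news"
    action: str           # "view", "click", "comment" or "share"
    seconds_viewed: float  # dwell time on the item


# Hypothetical weights: stronger actions count for more than a passing glance.
ACTION_WEIGHTS = {"view": 0.2, "click": 1.0, "comment": 2.0, "share": 3.0}


def build_profile(interactions):
    """Aggregate raw engagement events into a per-topic score (the 'cyber-portfolio')."""
    profile = defaultdict(float)
    for event in interactions:
        weight = ACTION_WEIGHTS.get(event.action, 0.0)
        # Dwell time nudges the score even when the user never clicks.
        profile[event.topic] += weight + 0.1 * event.seconds_viewed
    return dict(profile)


if __name__ == "__main__":
    history = [
        Interaction("factory closures", "view", 12.0),
        Interaction("factory closures", "comment", 4.0),
        Interaction("celebrity news", "click", 2.0),
    ]
    # Prints something like {'factory closures': 3.8, 'celebrity news': 1.2}
    print(build_profile(history))
```

Even this toy version shows the asymmetry: the user experiences idle scrolling, while the system accumulates a persistent, queryable record of what moves them.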

But where does it all go? Or, perhaps, to whom is this data sold? Advertising companies are certainly a large sector that monetises personal information for its own profit, but advertising alone cannot account for where all of this information ends up.

Photo by Kon Karampelas on Unsplash

In the aftermath of the 2016 election, information about a previously little-known corporation, Cambridge Analytica, came to light. It appeared to operate between political campaigns in America and many social media sites, the biggest being Facebook. Funded by unnamed political hedge funds, billionaires and giant monopolies, a new class with a profound vision for the next president of the United States, as well as for the overall direction of society, gained an unfathomable insight into data-collection methods hidden from the public eye. By expropriating data from Facebook and other social-media giants, rich and conclusive chains of personal information began to trace a labyrinth of voter predictability.

Let’s imagine that, in the run-up to the 2016 election, a Facebook user had shared a concern about the election or about a personal incident that had affected their life (a family death, a factory closure or a nearby ecological disaster). Digital content crafted into the form of a news story or a click-bait article would then be displayed ever more prominently on their feed, stimulating and directing rivers of emotional flow; the objective being to appeal to the user’s known experiences in order to influence their voting behaviour from the outside.

In this sense, information returns in the shape of a boomerang: the user dispenses personal information about their concerns and it comes back to them in a new hyper-real form. By devising content tailored to those concerns, this new technological apparatus could determine not only the content visible to its users, regardless of how reliable it was, but also predict how the user was likely to react, or, more specifically, which way they would vote.
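Again as a purely illustrative sketch, and not a description of any real targeting system, the ‘boomerang’ can be pictured as a function that takes the profile built above, selects the piece of content aimed at the user’s strongest concern, and attaches a naive prediction of how likely they are to react. The articles and scoring here are invented for the example.

```python
# Purely illustrative sketch of the 'boomerang': personal data goes out as a
# profile and comes back as tailored content. Articles, names and scores are
# hypothetical, not a real targeting system.

ARTICLES = {
    "factory closures": "They shut your factory down - here's who to blame",
    "celebrity news": "Ten celebrity feuds you missed this week",
}


def boomerang(profile, articles=ARTICLES):
    """Return the article aimed at the user's strongest concern, plus a naive
    0-1 'predicted engagement' score derived from that concern's weight."""
    if not profile:
        return None, 0.0
    top_topic = max(profile, key=profile.get)
    # Crude squashing of the raw score into the 0-1 range.
    likelihood = profile[top_topic] / (1.0 + profile[top_topic])
    return articles.get(top_topic), round(likelihood, 2)


if __name__ == "__main__":
    profile = {"factory closures": 3.8, "celebrity news": 1.2}
    # Prints ("They shut your factory down - here's who to blame", 0.79)
    print(boomerang(profile))
```

The point of the toy example is the loop itself: what the user gave away as context comes back as persuasion, ranked by how strongly it is expected to land.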

The veins of social media have run wild and seem, now, to have taken a route of their own. 

‘…There were meaningful, systemic changes happening around the world because of these platforms. They were positive! We were naive about the flip side of these coins,’

Tim Kendall, ex-president of Pinterest, on the growing and changing powers of social media.

Another Netflix documentary a year later, ‘The Social Dilemma’ (2020), once again brought the manipulation of the 2016 election to light. This time, however, other dark avenues of digitalization were also examined, such as the dangerous impact social media has on youth culture and teenagers.

Now that the secret of Cambridge Analytica and its associates has long been exposed, the foreseeable future of social media envisions a daunting reality. The question is no longer limited to how specific content steers our actions and behaviour; it now asks who is responsible, and how we, as users, can ensure the privacy of our own data. With the second-term election a matter of weeks away, America has moved on, but it hasn’t forgotten that dark eclipse of party democracy and electoral fairness.

But what point is to be taken from all this? 

Well, in my interpretation, the answer shouldn’t be a matter of simply reducing our social media usage or screen time. Such a myopic perspective fails to grasp the real question here: the challenge to authority. For whilst a digital detox is welcome, social media remains an integral and useful part of our lives, bringing us closer to, or at least more active within, our friends’ and families’ lives, and why should that stop?

In the run-up to this election, Facebook has kept a low profile, and it’s hardly surprising why. The real roots of antagonism lie not in ‘how much’ or ‘how little’ time we spend on social media, but in the systems of governance which surround its popularity and the psyche-analytics which keep us in our place.

Surely, technological systems must be held traceable and accountable; their primary function should be to uphold a pluralist connection between peoples and communities. Indeed, the techno-digital age took off in the 1960s, when, in an acid-induced utopian dreamscape, many hippie entrepreneurs felt that a creative, personalised and war-free world was only possible in a cyber-active existence. Once harmonising, creative and fair, there is no reason why those same intentions cannot be reborn in today’s social media.

It was not the users of Facebook who dismantled core tenets of liberal democracy in 2016, but the monopolisers and corporate alliances who drove democracy out of the spheres of statecraft and empirical truthfulness. If we ever hope to recover and regenerate a healthy, truth-based democracy, then we must look with a sceptical gaze at the media sources that surround us. Only once misguidance and predestined calculations of voting behaviour are eliminated will it be possible for the democratic axis to turn back to its original state: one where self-sovereignty, common education and influence work in tandem to produce a fair and transparent electoral process.
