Future Politics explores the transformation of politics and sheds light on the future of power, liberty, democracy, and justice.
If there is a family obsessed with the future, it must be the Susskinds. The patriarch, Richard, is a renowned expert on the future of the legal profession; the eldest son, Daniel, is a well-known economist and academic who focuses on the future of work; and the youngest, Jamie, has centred his work on the future of politics. I have already reviewed Daniel’s A World Without Work; this time, it’s the turn of Jamie and his formidable Future Politics.
I read (and thoroughly enjoyed) this book a few months ago and had been planning to write a review of it for a while, but for some reason it was one of those things I kept parking for later.
Then the US election happened at the beginning of this month, with its polarisation and excessive partisanship, Trump’s post-truth rejection of the results, and the whole post-election circus (its peak being Giuliani’s farcical press conference in the Four Seasons Total Landscaping parking lot, between the crematorium and the porn shop). The fact that 88% of Trump voters believe the election was stolen, despite no reliable evidence having been presented, tells me there is a problem with the American political system. This problem is certainly present in other democracies around the world too, and part of it has its origins in technology and the way it is transforming our lives.
This is what Jamie Susskind’s Future Politics is about. It offers many vital insights that help us understand where our politics is headed and what challenges our political systems will face in the coming years and decades. It seemed like the right time to finally write this overdue book review.
Like his brother Daniel’s, Jamie’s work is well researched. His theses and theories are sprinkled with plenty of anecdotes and stories that enrich the narrative and teach the reader new and exciting things. If you are an avid reader of works on AI, technology, and the future of work, you will be familiar with some of these stories, but there will surely be some new and surprising ones too.
The book has six parts. In the first, the author introduces the concept of the “digital lifeworld,” the digital space where we will increasingly live our lives instead of the physical, non-digital one. This lifeworld is amply referenced in the rest of the book, so it is good that the author spends some time explaining what it is, how our lives are increasingly quantified and monitored, and how technology’s powers keep growing.
The next four parts focus on the key topics of politics: the future of power, liberty, democracy, and justice. This is where the real beef of the book is, so we’ll delve into these parts in more detail below.
The book ends with a final part on the future of politics, which deals with transparency and a new separation of powers and serves as a conclusion of sorts.
Politics is about power
Politics is about power above anything else. Many people enter politics to serve the public, but many more enter it in search of power and influence. In our modern societies, we have granted the State a monopoly on the legitimate use of force. Still, as anybody who has read Machiavelli or The Godfather knows, power is not exerted only through force or violence; there are many subtler ways to exercise power over people.
As one of Jamie’s chapter titles indicates, “Code is Power.” Nowadays, our interactions and dealings with companies, the government, and other people almost always pass through a program with a machine-learning or AI algorithm inside, so we are increasingly at the mercy of what is written in the code of those systems.
An oft-cited example is that of the self-driving car that, about to crash, has to choose between killing a child crossing the street and hitting a wall, possibly killing the car’s driver and passengers. The decision has to be ingrained in the software’s code, but who decides what the best course of action is in situations like this? Who decides who should live and who should die?
As Lessig puts it, “this is the stuff of politics”:
“Code codifies values, and yet, oddly, most people speak as if code were just a question of engineering. Or as if code is best left to the market. Or best left unaddressed by government.”
As Foucault argued when discussing the Panopticon, and as Orwell disturbingly portrayed in 1984, scrutiny and observation are also ways to exert power.
During the Cold War, around one in six people in East Germany were Stasi informers, creating probably the biggest spy state that has ever existed. Everybody was spying and informing on everybody else. The Stasi’s problem was that they didn’t have enough resources to analyse the enormous amounts of data they collected on their citizens, so much of the information provided by informants went unnoticed and unregistered. Compare that with today’s world, where human informers are no longer needed: private companies and governments amass vast troves of data about citizens and have the technology and computing power to analyse it all. Nowadays, we have surveillance technology the Stasi could only dream of!
Today Facebook, Google, and other companies, but also the NSA, the CIA, and other government agencies, know much more about our everyday lives than any other institution in history, and it doesn’t seem like this trend will be bucked any time soon.
As Susskind argues in Future Politics, society will become more and more scrutable as we willingly share more and more about our lives. This scrutiny will be increasingly intimate, as we share more about where we eat, who our friends are, or what we like to do in our free time.
What is also happening is that our behaviours are becoming increasingly predictable by algorithms. As Harari states in Homo Deus, Facebook’s algorithms will come to know our tastes and inner desires better than we know them ourselves, and we will end up delegating most of our decisions to an AI. Will governments use AI to predict crimes, and will they arrest people before they commit a crime because the system predicted they were about to, as in Minority Report?
We will rely on algorithms to decide whether we get a mortgage, go to prison, or have access to social services. As Susskind says, we will be “who the algorithms say we are, whether we like it or not.”
An interesting concept the author presents in connection with power and other themes is perception-control: controlling what people know and think, the views they hold about the world, and thus their desires.
In today’s world, dominated by the Internet and social media, how we perceive the outside world depends a lot on the filtering done by others: what information is presented to us, in what format, with what context, and so on. Filtering is a powerful means of perception-control: if you control the flow of information in society, you can influence how society thinks.
As Manuel Castells said, “the way we feel and think, determines the way we act”. Those who control the means of perception will control the way we feel, think, and act. The Internet was supposed to usher in a new era of freedom of expression, diversity of thought, and open-mindedness, but as we have seen in the Social Dilemma documentary, the opposite is true: each of us is immersed in a different echo chamber in which we only receive the news the social media algorithm thinks we will like and engage with.
The news my neighbours, my friends, and I consume can be completely different, fuelling the polarisation and radicalisation of society. This is one of the reasons why, today, a vast majority of Republicans think the election was stolen.
The future of liberty

Liberty is one of the most important pillars sustaining liberal democracies, but there are different conceptions of it; not everybody understands liberty in the same way. Susskind reviews the various political currents and how they traditionally view the idea of liberty, then applies them to the future of politics and the digital lifeworld.
The end result is interesting. For example, he believes that for Digital Libertarians, “freedom in the future will mean freedom from technology,” as technology is a means of power that can be exerted over people. Digital Liberals will be those who think “technology should be engineered to ensure the maximum possible individual liberty for all.” He also offers definitions for Digital Confederalism, Republicanism, and Paternalism/Moralism.
Regardless of the theoretical definitions, what is clear is that, through digital means, governments and corporations will have a growing number of resources and possibilities to exert power over individuals: information and perception-control, the capacity to study and analyse enormous quantities of data, or technologies like facial recognition. The liberty and freedom of individuals end where the power of institutions and corporations begins, so the more powerful these become, the less liberty is left for us, the people.
The future of democracy

As Churchill famously quipped, “democracy is the worst form of Government except for all those other forms that have been tried from time to time.” People in the West think democracy is the best way to govern a society. Many people in Belarus, Hong Kong, or Thailand, who are protesting en masse for their democratic rights at the risk of punishment and incarceration, seem to agree.
Democracy as we know it is facing various threats and risks, though. Susskind takes us through them with his usual erudition, anecdotes, and examples.
The first threat to democracy is the aforementioned perception-control. The way we perceive the world, think about it, and therefore act upon it is more subject to control than ever, often by the very institutions we are supposed to hold accountable through the democratic process. The world is increasingly determined by what we see through digital lenses, and whoever controls those lenses can control what we think and how we act, including how we vote.
The second threat to democracy outlined by Susskind is the fragmented reality we live in. We are in the era of post-truth politics, epitomised by Trump, but he isn’t the only one thriving on it or promoting it. We all get our information from different sources.
Susskind quotes the US Senator and Ambassador to the UN Daniel Patrick Moynihan, who said, “you are entitled to your own opinion, but you are not entitled to your own facts.” The problem is that rival factions now claim their own facts. We have seen this play out with climate change and the use of masks, probably at the cost of thousands of avoidable deaths.
If there is no room for a shared, common understanding of what constitutes truth or fact, understanding between different factions and parties becomes impossible, and democracy becomes unworkable.
The next two threats are linked. One is the anonymity the Internet allows and fosters, enabling people to behave horribly and treat each other atrociously online, without any of the social penalties such behaviour would incur in traditional, non-anonymous social settings.
Last but not least, the Internet is being invaded by an army of bots, sometimes with laudable aims, but mostly with malicious intent. Susskind cites a 2017 study estimating that around 48 million Twitter accounts, or about 10% of all accounts at the time, were bots. It seems many people spend hours angrily arguing with inanimate systems online. With the great advances in AI, capable of generating ever more human-like automatic responses, bots’ capacity to credibly pass as human beings will increase, and with it, their presence on the Internet. As Susskind puts it, “can Deliberative Democracy survive in a system where deliberation itself is no longer the preserve of human beings?”
Susskind closes this worrying part on the future of democracy with what he calls the “epistemic epidemic.” Historically, many philosophers and thinkers have promoted the epistemic superiority of the democratic system over all other political systems: they would argue that it is objectively better for the governed than any other system we know. But can we still say this is true given the threats listed above? Can we talk about liberty, equality, or justice under these circumstances?
It is a bleak picture, but some of the issues listed above are technical and can thus be solved. Time will tell whether we are capable of doing so, but the future of our democracy literally depends on it, so hopefully we will manage.
The future of justice

When talking about justice, the author introduces the concept of “algorithmic injustice,” which is what happens “where an application of an algorithm yields consequences that are unjust.” As explained above, algorithms will be, or already are, being used to decide who receives a mortgage or credit, how much an insurance premium costs, who gets a job, and who is sentenced to prison.
As we rely more and more on technology to help us make decisions, there is more scope for those algorithms to fail and create injustices. As Susskind says, social engineering and software engineering will become hard to distinguish, as social justice will increasingly rest on a combination of algorithms, code, and data.
We cannot talk about justice without dealing with inequality. As already discussed elsewhere, the digital lifeworld is an inequality accelerator. Capital earns greater returns than labour and becomes more concentrated every year. Digital capital owners are amassing ever greater quantities of capital while employing shrinking numbers of people. Capital is being concentrated in the hands of a small number of companies, themselves in the hands of a small number of people. As Susskind puts it, this is “a perfect storm of ever-growing inequality between a tiny rich elite and a poor majority,” what he aptly calls the “Wealth Cyclone.”
Tech companies have network effects and operate in a winner-takes-all manner. They are not competing for market share, but for entire markets, and with the amount of data they have and their increasingly powerful AI, they are getting ever more competitive.
What can be done about it? The author offers some solutions, from more taxation to fund the famous UBI, to nationalising some digital resources, to moving towards a sharing economy, a commons economy, or other interesting ideas. Still, it is not clear that any of them would succeed. Governments are starting to look at how to regulate tech companies, which have become too powerful not only economically but also politically, but there doesn’t seem to be an easy way to do it without pernicious consequences for consumers or the economy itself.
Fractious and polarized elections like the one we have witnessed in the US will become more common in the rest of the world if we don’t do anything about it. Fortunately, it seems that the US institutions are robust and are working reasonably well, and there will be a peaceful transfer of power (this should be a given, but I wasn’t so sure about it only a few weeks ago).
Jamie Susskind’s Future Politics shows that even though the digital lifeworld offers many great things and immense possibilities for better politics, there are also innumerable risks and threats we will have to navigate. The now-ubiquitous digital tools, AI, and technology in general have severe implications for power, liberty, democracy, and justice. Susskind’s book is an excellent guide to navigating those risks and offers some creative solutions to them.