
Book Review: Schaake, M. (2024), The Tech Coup, Princeton University Press

  • Peter Lorange
  • Sep 8
  • 8 min read

Introduction

This book offers a frightening look at our modern tech-obsessed world. It also suggests how democracies can regain control over technology to build a better future before it is too late.

 

The book addresses the systematic erosion of democracies at the hands of technology companies. But the author suggests that we might have what we need to fight back, built around a clear agenda for public policy. However, for this to happen, new coalitions of people need to get involved: politicians, public sector leaders and communities of ordinary citizens. Once that happens, regaining control over the technologies that run our lives might finally be within reach.

 

Which technologies are we talking about? AI (artificial intelligence) is an obvious one. Yet AI may not be the only game changer. A series of other new technologies can also change our lives irreversibly: biotech and neurotech developments, for example, driven by venture capital investments, are racing for global dominance.

 

The author, Schaake, now at Stanford (and from 2009 to 2019 a member of the European Parliament), leans heavily towards governmental regulation, perhaps not surprisingly, given her background. While some of her suggestions may be relatively hard to implement, she does come up with a number of constructive recommendations for rebalancing our democracies’ control over big tech. For Schaake, much comes down to more effective, proactive control through legislation.

 

Let us now review this highly relevant book:

 

The Challenge

A gradual erosion of democracy is being driven by the growing power of technology companies, not only the well-known US companies (such as Google, Microsoft, Meta and Amazon) but also tech firms from other countries, such as the Israel-based NSO Group, which developed and sells the spyware system Pegasus. This industry seems excessively powerful, and there is little to no accountability. There are simply very few legislative guard-rails that apply! This imbalance seems to be further escalated by the fact that many corporate tech leaders may deeply believe that they serve their users better than governments can serve their citizens. This is perhaps no wonder: after all, the legacy of 30 years of US technology policy seems to be one of deferential treatment, indeed abdication of responsibility.

 

Data centers are springing up in many places to support AI and cloud computing. The large computers they house depend heavily on abundant electricity and water for cooling. The enormous amounts of electricity and water consumed, together with the typically rather secretive behavior of data center operators, tend to increase public skepticism and decrease trust.

 

Acceleration of Digital Transformation

Digital transformation appears to have accelerated significantly in the aftermath of COVID-19. Technology companies have taken on even larger roles! How is the public interest being protected? Powerful data analysis companies have emerged. These private firms attempt to single out individuals who might have committed crimes, entered a country illegally, and so on. Yet these systems cannot guarantee accuracy in identifying potential culprits. A key problem for defendants who have been “identified” through these types of systems is that it is usually difficult to defend oneself after being singled out by such software.

 

Kenya and Ukraine

The author was in Kenya as an election observer, sent by the EU in 2017. Safran, a France-based technology company, had supplied a voting system and an advanced digital registration system to Kenya, but these systems broke down early in the voting process. The messy situation that resulted led to a wide array of problems with the voting. Paradoxically, technology set Kenya back significantly, instead of contributing to progress.

 

The war in Ukraine has largely been fought with technologically advanced support. Elon Musk’s SpaceX, for instance, provided Ukraine with support for tracking enemy troop locations and movements. But while Musk initially offered this for free, it did not take long before he began billing the US government for this “service”. Defense tech has become a very attractive business segment!

 

Private firms seem to be leveraging their technologies for power consolidation. Tech CEOs are becoming generals in geopolitical battles all over the world. Many are even taking on the roles of quasi diplomats.

 

Some of the bad players

The London-based Cambridge Analytica stands out. This company acquired Facebook data on 50 million people, which it used in various political campaigns, most notably Trump’s presidential campaign and the UK’s vote to exit the EU. While it seems clear that Facebook knew what was going on, it has never admitted to this. And when the US Senate held its infamous hearing on the issue, it became clear that few, if any, of the senators understood what was actually going on. Facebook’s CEO/owner, Mark Zuckerberg, “escaped” from his hearing without any “scars”.

 

So-called “framing” seems to be a common tactic for getting around potentially restrictive regulations: firms present the potential benefits of their innovations in favorable terms, thereby making it more difficult for regulators to act. In addition to creating such narratives, tech firms have also built massive lobbying operations so as to be relatively certain that the industry can regulate itself, a totally unrealistic proposition, according to the author.

 

Several of the tech giants have established so-called Oversight Boards to cope with contentious decisions, such as whether to pursue certain research activities. In reality, however, these boards seem to be heavily restricted, as in the case of the Oversight Board of Meta (formerly Facebook). Yet the aim of convincing governments that things are under control seems to be achieved: relying on typically weak internal constraints rather than being guided by external legislative ones seems to work!

 

This lack of externally set limits may be especially serious when it comes to AI. OpenAI’s release of the ChatGPT products is perhaps the clearest example of why self-regulation might not work. Whether “answers” from ChatGPT are correct can often be questioned, and its outputs have regrettably provided grounds for conspiracy theories and racial slurs. Speed to market seems more important for OpenAI than thorough testing.

 

Some industry leaders have called for regulation of AI (Musk, Wozniak, …), but these calls seem rather weak. As the author puts it, “absolute corporate power corrupts absolutely” (p. 171).

 

Reclaiming the primacy of governance

Proper regulation could be the answer, according to the author. The EU seems to be in the lead here, but the so-called GAFA (Google, Amazon, Facebook, Apple) are pushing back. National security has also been raised as an issue, for instance in the US when it comes to the operation of the Chinese-owned TikTok. The US appears ready to forbid TikTok from operating in the US, out of fear that personal information will be leaked back to the Chinese government. Similar actions apply to restricting Chinese-owned tech giants such as Huawei and ZTE. Huawei, for instance, was “shut out” as a potential provider of 5G telecommunications technology by many countries (in Europe and the US), even though the company’s offering at the time appeared the most attractive. Again, defense-related concerns were raised.

 

The author calls for more leadership when it comes to regulating technology. She hails the EU for being in the lead here. Conversely, the US seems to be falling behind, with an approach that is much more ad hoc and defense-related.

 

A major problem with regulating technology is that legislation is written for today’s technology, while new technological approaches tend to emerge by the time that legislation takes effect. So the disruptive technological revolution goes on: a true paradox!

 

The author suggests four areas where legislation may be called for.

-       Spyware, such as the Israeli NSO Group’s Pegasus. Its use should simply be banned.

-       Data brokers. The collection and sale of personal data should also be outlawed.

-       Facial recognition systems. Leading firms active in this category are Clearview AI and Worldcoin (co-founded by OpenAI’s CEO, Sam Altman). Here too, regulation seems to be needed to protect us: these systems appear far too arbitrary to be considered reliable.

-       Cryptocurrencies. These seem to be speculative and highly volatile. What is their economic or social value added? Anonymity, which typically shields unlawful activities, may not be a good enough reason to keep cryptocurrencies around. Again, perhaps a ban might be called for.

 

The author calls for more transparency, and highlights five areas of concern in particular:

-       Identify artificial intelligence. Where is it coming from, and who is behind it? More transparency here might clarify the true motivations behind such initiatives.

-       Transparency of investments and bids. Are non-democratic investment entities, such as the Saudi “independent” investment fund, behind various high-tech firms? An example may be Musk’s X (previously Twitter), in which the Saudis are large owners. Likewise, when it comes to bidding for land to build new data centers, it seems important that the public know which high-tech firm stands behind a particular bid.

-       Extend public accountability. Transparency is a precondition for journalists, parliamentarians and members of civil society organizations to do their work.

-       Leverage purchasing power. The US government is by far the world’s largest purchaser of software. It, as well as other governments, could use this purchasing leverage to impose restrictions on the high-tech firms that supply it.

-       Create technological expert services. Many parliamentarians lack the specific expertise needed to draft realistic legislation. Teams of independent legal experts could be appointed to support them.

 

Moving now to building accountability mechanisms, the author makes these suggestions:

-       Identify systemically important technology institutions. The EU has identified 17 very large online platforms and two very significant search engines. Such entities are treated as if they are “too big to fail”, and thus handled as systemically more important than other entities in these spaces. Presumably, this might imply that somewhat leaner legislation would apply, but how? The author does not explain.

-       Courts for cyber incidents. Independent processes may be essential for establishing the key facts behind cyber-attacks, hence this approach.

-       A joint democratic focus. This might call for establishing a new institution structured along lines similar to the International Atomic Energy Agency. Clarity of mission, procedures for collaboration among member states, and a delineation of enforcement options would all be called for.

 

Finally, for reinvigorating the digital space for the public sphere, the author calls for the following initiatives:

-       Build a public entity. Along the lines of entities such as the Personal Democracy Forum in NYC or the Initiative for Digital Public Infrastructure.

-       Turn the regulation-enforcement model upside down. To better cope with potentially adverse effects of new technology developments, the relationship between lawmakers and enforcement entities may have to change. Present-day legislation tends to focus on today’s situation, as we have discussed, without finding ways to cope with new tech developments. Perhaps such developments might be not only anticipated but handled proactively. But how? The author does not tell us.

-       Knowledge to the people. When critical information is locked inside corporations, citizens may simply be deprived of knowledge and understanding. This touches on the core of the book: how to build critical knowledge, and hopefully more understanding, among the broader public. Unfortunately, this knowledge base has eroded recently, according to the author. High-tech firms are the culprits!

 

To live in a world dictated by tech companies and their senior executives may not be an acceptable reality for many of us. Strengthened democratic government is needed!

 

After having read this book, this reviewer is partly shocked, partly bewildered. The shock comes from the fact that many of the adverse developments discussed in the book appear truly alarming! The bewilderment comes from the reviewer’s questioning of the realism of many of the author’s suggested solutions. We may still have a way to go before we can realistically deal with the present set of successful high-tech firms. The book, though, represents a clear step in the right direction. As such, it is a must read.
