Resistance and renewal
The most interesting puzzle in our times is that we so willingly sleepwalk through the process of reconstituting the conditions of human existence (Winner, 1986).
Re-constituting the present
This book has considered various aspects of the real-world matrix in order to know it more fully, to deepen our understanding of what it is and what it means. The previous chapters provided substantive rationales and various proposals for taking informed action. This chapter discusses some of the innovations and responses now under active consideration. Table 5 summarises various propositions and proposals that have surfaced in this space. The specifics of each will evolve, and more detailed treatments will no doubt follow. Yet even this limited sample provides clear evidence of an increasingly credible shared agenda. To the extent that it is worked out, developed, valued and resourced, it can become a valuable source of actions and strategies that lead away from a high-tech dystopia toward more desirable futures.
Table 5: Suggested actions

| Source | Suggested actions |
| --- | --- |
| Cook (2020) | Remember that ‘tech cannot fix itself.’ Understand what went wrong inside Silicon Valley (SV). Understand its psychological deficiencies and the full implications of the values it has chosen to follow. Monitor its (lack of) emotional intelligence and its structural biases. Promote healthier psychological norms and revise its ethical foundations. |
| Snowden (2019) | Question the widespread use of illegal surveillance. Challenge its legitimacy and that of those employing it. Enact new laws to prevent it recurring. To avoid a nightmare future, individuals need to take back ownership of their own data. |
| Doctorow (2020) | Recognise ‘fake news’ as an existential threat to social integration and the well-being of society as a whole. Rather than be distracted by arguments about surveillance per se, re-focus on the raft of issues that arise from the unrestrained re-growth of monopolies. Reduce or eliminate these using anti-trust and related regulations. Ensure that everyone’s digital rights are respected. |
| Morozov (2018) | Introduce legislation to force companies to pay for the data they extract. Improve citizens’ rights to access data obtained from public sources (such as CCTV). Combine data protection with a proactive social and political agenda. Use the ‘data debate’ to re-think other utilities and services (such as welfare, unions and bureaucracy). |
| Howard (2020) | Establish the principle that ‘public life belongs to the public.’ Require companies to routinely contribute such data to archives, libraries and similar public institutions. Explore new opportunities for civic, as opposed to commercial, engagement. |
| Cadwalladr (2020) | Regulate in relation to four main categories. 1. Safety: no product should be sold or shipped until it is demonstrably safe and free from obvious bias. 2. Privacy: treat all private data as a human right, not an asset. 3. Honesty: remove the oligopolistic power now exercised by companies such as Facebook and Google, especially as it affects ad networks. 4. Competition: strengthen and enact the relevant anti-trust laws that encourage entrepreneurship and innovation. |
| Tarnoff & Weigel (2019) | Don’t see IT issues as separate. Human beings have co-evolved with technologies over time; the focus should therefore be on ‘humanity/technology co-evolution.’ Society is not served well by having technologies imposed (or sold) from above. Society as a whole should be involved in deciding how to live with technology. IT companies should follow specific rules that retain democracy as a guiding principle. |
| Lavelle (2018) | Invert the operating principles of Facebook, Google and others so that users opt in rather than having to search for escape routes. Provide access to clearly documented and user-friendly tools for managing their data. Introduce calibrated fines to deal with the knowing misuse of data. Give users the option of retaining all their data for a fee. |
| Sample / Berners-Lee (2019) | In 2019 Tim Berners-Lee, inventor of the World Wide Web, drafted a ‘contract for the web.’ It sought to protect human privacy, provide access to individuals’ data and establish a right not to have the latter processed. It argued for community consultation prior to products being launched, and for the web to be safe and remain open for all users. Berners-Lee has also created Solid, a more person-centred data system. |
| Deibert (2020) | New laws to restrain how tech companies gather, process and handle personal information. Companies required to open up algorithms and related systems to external scrutiny and public-interest auditing. Legal protection of workers’ rights in the ‘gig’ economy. Repeal Section 230 of the 1996 Communications Decency Act. Apply ‘Retreat,’ ‘Reform’ and ‘Reset’ procedures grounded in strong underlying principles. |
| Eggers (2018) | Update the Universal Declaration of Human Rights with two new amendments. 1. Assert that all surveillance is inherently abhorrent and should be undertaken only by law enforcement with judicial oversight. 2. Resist placing everything online; ensure that human beings can continue to live real analogue lives offline as much as possible. |
Zuboff’s (2019) magisterial critique led her to articulate two fundamental needs of vital importance to all human beings: the need to recover the future tense and the need for sanctuary. Both are clearly of great significance to futurists and foresight practitioners. In relation to the former she frames her decision to spend seven years working on this book as an act of will that constitutes part of her own personal claim to the future. She states:
Will is the organ with which we summon our futures into existence…The freedom of the will is the bone structure that carries the moral flesh of every promise…These are necessary for the possibility of civilisation as a ‘moral milieu’…(They are) the basis of contracts…collective decisions to make our vision real (Zuboff, 2019, pp. 331-333).
The notion of ‘civilisation as a moral milieu’ is a powerful and compelling one. By contrast, the conditions and agreements demanded by Google, for example, require centuries of human legal practice to be set aside in favour of what she calls ‘Uncontracts’ (Zuboff, 2019). These are forced ‘agreements’ created by the ‘positivist calculations of automated machine processes.’ In place of human qualities such as dialogue, problem solving and empathy, the ‘Uncontract’ leads back to ‘the blankness of perpetual compliance’ referred to above (Zuboff, 2019, pp. 334-336). The ‘right to sanctuary’ is also of primary significance (Zuboff, 2019). It is among the most ancient of human rights and thus of vital and enduring value. But it is far from impregnable when ‘physical places, including our homes are increasingly saturated with informational violations as our lives are rendered as behaviour and expropriated as surplus’ (Zuboff, 2019). Moreover, the power of Big Other ‘outruns society and law in a self-authorised destruction of (this right) as it overwhelms considerations of justice with its tactical mastery of shock and awe’ (Zuboff, 2019). What is required, therefore, are ‘new forms of countervailing authority and power’ (Zuboff, 2019). In place of a swelling ‘social void’ this depth critique envisages both ‘direct challenges’ to the power of Surveillance Capitalism and a commitment to ‘new forms of creative action’ (Zuboff, 2019, pp. 479-486). Zuboff (2019) also advances a number of broad suggestions about what, in her view, needs to be done to rein in Surveillance Capitalism (SC). In summary they include:
- Naming and establishing our bearings, re-awakening our astonishment and sharing a sense of righteous dignity.
- Giving voice to our collective outrage and refusal of the diminished futures on offer.
- Becoming alert to the historical contingency of SC by calling attention to ordinary values and expectations that existed before it began its campaign of psychic numbing.
- Establishing new centres of countervailing civic power equipped with laws that reject the fundamental legitimacy of SC’s declarations and interrupt its most basic operations (Zuboff, 2019, pp. 395-421).
A new regulatory regime equipped with adequate laws will clearly take time and effort to achieve. Of the three key suggestions that Zuboff makes at least two are based on historical precedents:
First, interrupt and outlaw surveillance capitalism’s data supplies and revenue flows. This means, at the front end, outlawing the secret theft of private experience. At the back end, we can disrupt revenues by outlawing markets that trade in human futures knowing that their imperatives are fundamentally anti-democratic… Second, research over the past decade suggests that when ‘users’ are informed of surveillance capitalism’s backstage operations, they want protection, and they want alternatives. We need laws and regulation designed to advantage companies that want to break with surveillance capitalism… Third, lawmakers will need to support new forms of collective action, just as nearly a century ago workers won legal protection for their rights to organise, to bargain collectively and to strike. Lawmakers need citizen support, and citizens need the leadership of their elected officials (Zuboff, 2019b).
Katy Cook’s exploration of the psychology of Silicon Valley identified similar points of clarity and reached similar conclusions. She confirmed that we are facing an ‘unprecedented transition’ (Cook, 2020). Related to this is a strong belief that ‘tech cannot fix itself.’ For her, ‘the notion that more tech is the answer to bad tech is psychologically curious, irrational and self-serving; yet it happens constantly, not only within the tech industry, but within society’ (Cook, 2020). She adds that ‘our increased reliance on technical solutions is rooted in a cultural narrative that purports the boundless power of technology’ (Cook, 2020, p. 233). Clearly the embedded symbolic power of such cultural narratives also needs to be accounted for and moderated. What might be called the ‘dual nature’ of technology also helps clarify why the values, beliefs and practices that drive its use in these forms won’t be corrected by its promoters and developers. A staff writer for The Atlantic who attended the 2020 consumer electronics show in Las Vegas concluded that all the ‘solutions’ on offer involved the use of yet more technology. Given that most existing forms have known faults and costs, she emerged with a strong sense that this high-tech industry was less concerned with solving real problems than ‘capitalising on the anxieties of the affluent.’ As such it clearly fits a wider pattern (Mull, 2020). To be at all useful, initiatives must originate elsewhere. Hence Cook’s (2020) insistence on:
- Understanding what went wrong in the first place.
- Understanding the psychology and values driving the industry … (in the belief) that the world can be a better place; and,
- Working to ensure the industry moves forward with better values and healthier psychological norms (which, in turn) requires a revisioning of the tech industry’s ethical foundations.
Snowden’s (2019) account originated within the privileged spaces of the intelligence community. He saw how, under the pressure of the 9/11 attacks and a renewed sense of threat, the character of that ‘intelligence’ gained new and problematic features (Snowden, 2019). This is where events in Silicon Valley connect back directly to themes, narratives, values and priorities in the wider culture of the US. It is a nation with a long track record of sponsoring ideologies, trends and, indeed, technologies without paying a great deal of attention to the likely consequences. Snowden (2019) is far from alone in wanting us to ‘reclaim our data’ and, in so doing, take active steps to avoid the kind of diminished future that his own experiences have led him to fear. As noted, Doctorow (2020) has a closer, more fine-grained view of the structures, processes and products of the IT revolution, and he sees ‘fake news’ as a particularly serious existential crisis. His main concern is to bring back anti-trust regulation in order to reduce or eliminate the extremes of monopoly power.
Turning the tide?
Steps are slowly being taken that seek to challenge and limit the power of the Internet Oligarchs. They are driven by actors in several countries working on behalf of governments and civil society. For example, during 2019 the French data watchdog fined Google €50m ‘for failing to provide users with transparent and understandable information on its data use policies’ (Hern, 2019). The European Union (EU) has flexed its regulatory muscles on several occasions in relation to privacy, taxation and monopolistic behaviour, especially via the General Data Protection Regulation (GDPR) (Wikipedia, 2020). The UK has begun the process of establishing the critical infrastructure needed to enforce a new raft of regulations, including a dedicated Digital Markets Unit (DMU) within the Competition and Markets Authority (CMA) with the power to levy serious fines upon companies that fail to abide by the new rules. Even the USA, which has been so slow to react, has shown signs of following suit. In October 2020 the US Justice Department sued Google for illegally monopolising the online search market, and in December the US Federal Trade Commission sued Facebook for breaking anti-trust laws, threatening to break it up into smaller units (Canon, 2020). Only time will tell if Congress will have the courage to repeal the infamous Section 230 of the Communications Decency Act of 1996 mentioned above. In the absence of strong and coordinated regulatory efforts, however, attempts by individual nations to enforce a comprehensive international tax regime upon the oligarchs have proved ineffectual thus far.
During 2020 the Australian government took several small but significant steps. It confronted Google and Facebook and forced them to compensate news organisations for the loss of their advertising income and the illegal use of their material (Spears, 2020). Concerns were also expressed about how children and young people in particular are exposed to both the opportunities and the very real dangers of the online world, with cyber-bullying of particular concern (Ham, 2020). Very young children are especially vulnerable since they have no defence against the digital incursions that occur through children’s TV programs, games, YouTube and so on. In late 2020 a report surfaced that ‘always on’ digital assistants in the home were attracting the attention of the very young, who were unwittingly providing family information to the remote listeners (Tapper, 2020). In response the Australian government announced that it would create an ‘online harms bill’ to augment other measures such as its existing ‘e-safety’ site. The very real threat of the direct exploitation of children and young people for criminal purposes also led to increased support for the Australian Federal Police (AFP). This formed part of a larger AUD$1.66 billion cyber-security package provided to the AFP to help the nation defend itself from the growing threat of cybercrime and cyberwar (Galloway, 2020). Tangible results did not take long to appear.
In mid-2021 the AFP, in collaboration with the FBI, revealed an undercover sting operation known as ‘Ironside’ that severely disrupted prominent drug cartels, uncovered large quantities of illegal drugs and money, and led to multiple arrests both in Australia and overseas. Instead of being frustrated by the co-option of encryption technology by criminals, law enforcement had turned it to positive use by clandestinely making the AnOm app available to them. Messaging between criminal networks previously considered ‘secure’ proved to be anything but. The operation not only led to many arrests but also demonstrated that law enforcement would, henceforth, be there in the background using the very latest tech themselves. It was a watershed moment. While what Peter Hartcher calls the ‘cat and mouse game’ will certainly continue, criminal organisations everywhere were placed on notice that they were no longer as safe as they had assumed (Hartcher, 2021).
Taken at face value, such practical responses on the part of various Western governments may appear to support the notion that the ‘tide’ is indeed turning. Yet 2020 was not merely another year. The Covid-19 pandemic was a classic ‘wild card’ familiar to futurists and foresight practitioners. As is well known, it impacted humanity with all the force of an unstoppable biological hurricane. Under the pressure of necessity large numbers of people were driven online. Almost everyone learned how to use Zoom, but few grasped how increased dependence on an already dysfunctional system would place them at greater long-term risk. In the midst of a torrent of unwelcome change it is all too easy to lose one’s bearings. All of which evokes a playbook and a text that is decidedly less optimistic. As Klein (2017) explains in her analysis of ‘disaster capitalism,’ it is during just such times of shock and disruption, while public attention is diverted, that powerful entities quietly but actively pursue their own specific interests.
As Covid-19 proceeded, physical money almost disappeared, replaced by digital alternatives such as card and ‘contactless’ payments. Few were disposed to consider the longer-term costs of a cash-starved society, but they are considerable, especially for informal uses and the poor (Kale, 2020). They include greater anxiety for, and increasing exploitation of, unbanked people; fewer options for women fleeing abusive relationships; and reduced funding for charities that previously relied on physical money for their cash flow. Overall, the wider public becomes more fully locked into a private banking system from which it has no escape and decreasing autonomy (Kale, 2020). Many organisations dispensed with offices, requiring decision-makers and other employees to work from home and meet ‘virtually.’ Once again, the products and services offered by the Internet giants took centre stage and few involuntary ‘customers’ had the time or opportunity to think beyond the moment. Journalist Anna Krien (2020), however, took a close look at the online ‘distance learning’ arrangements adopted by many schools during the pandemic. She found disturbing links between schools and companies like Apple and Microsoft, whose dedicated delivery platforms and content were widely taken up by schools and parents alike. During school visits she expressed her growing concerns, but to little avail. Since these companies had been quietly courting schools for years, it was easy for them to slip all too readily into using commercially designed packages rather than those created by educators according to educational criteria (Krien, 2020).
Ronald Deibert (2020) and the Citizen Lab at the University of Toronto have considered these and similar questions. In their view too much attention has been focused on micro-issues, such as the uses and misuses of particular apps, while ‘an entire landscape has been shifting beneath our feet.’ In relation to the pandemic, specifically, they suggest that:
This explosion of pandemic-era applications will invariably amplify the defects of the mobile marketing and location tracking industry – a sector made up mostly of bottom-feeder companies whose business model relies on collecting billions of user-generated data points, later sold and repackaged to advertisers, law enforcement, the military, customs and border agencies, and private security services (not to mention bounty hunters and other dubious characters). A shocking number of entrepreneurs and policy makers are nonetheless turning to this cesspool of parasitic firms – poorly regulated and highly prone to abuses – as a proposed pandemic solution… The entire ecosystem presents a bonanza for petty criminals, ransomware opportunists, spyware firms and highly sophisticated nation-state spies alike (Deibert, 2020).
Moreover, such concerns are unlikely to recede once the pandemic is over. Indeed:
Some argue that this COVID-19-era innovation cycle will pass once there is a vaccine. But the more we embrace and habituate to these new applications, the deeper their tentacles reach into our everyday lives and the harder it will be to walk it all back. The “new normal” that will emerge after COVID-19 is not a one-off, bespoke contact-tracing app. Rather, it is a world that normalizes remote surveillance tools such as Proctorio, where private homes are transformed into ubiquitously monitored workplaces and where shady biometric start-ups and data analytics companies feed off the footloose biosurveillance economy (Deibert, 2020).
This raises the very real question of just how societies already weakened by the virus and its multi-faceted aftermath will be able to gather the will, imagination, resources and organisational capacity to somehow ‘disembed’ themselves from these very same devices and systems. As mentioned in a previous chapter, there is one country where a very different dynamic has been underway for some time. For reasons best known to itself, the Chinese government has already exceeded the predations and incursions of the Western Internet Oligarchs into civil society and is proceeding with the construction of its very own high-tech digital dystopia. The retreat of American leadership over recent decades and the impacts of the pandemic have allowed it to proceed with its strangely arid and inhuman desire for complete state manipulation and control of its population. A valuable study of digital authoritarianism by Khalil (2020) examines how China viewed the pandemic as a ‘proof of concept’ opportunity to show that ‘its technology with “Chinese characteristics” works and that surveillance on this scale and in an emergency is feasible and effective.’ She continues:
With the CCP’s digital authoritarianism flourishing at home, Chinese-engineered surveillance and tracking systems are now being exported around the globe in line with China’s Superpower Strategy. China is attempting to set new norms in digital rights, privacy, and data collection, simultaneously suppressing dissent at home and promoting the CCP’s geostrategic goals (Khalil, 2020).

Khalil considers this dangerous for other countries since it may well ‘result in a growing acceptance of mass surveillance, habituation to restrictions on liberties, and fewer checks on the collective use of personal data by the state, even after the public health crisis subsides’ (Khalil, 2020).
An obvious lesson to be drawn from this particularly dangerous precedent is the greatly increased need for democratic nations to work together and be ‘vigilant in setting standards and preserving citizens’ rights and liberties’ (Khalil, 2020). If anything, it adds urgency to the need for the free nations of the world to get their own houses in order and, in so doing, present a common front. What will this take?
As discussed earlier, it is useful to consider responses at several levels of aggregation, each of which may be appropriate to different tasks and actors. Effective coordination between different levels and types of response would certainly increase the chances that more effective options for de-coding and re-constituting the matrix will emerge. At the individual level, for example, we have already seen how, over the past two decades, powerful insights have constantly emerged from the efforts, sense of agency and commitment of particular individuals. Of the many that could be included, Tim Berners-Lee’s Contract for the Web, Pasquale’s New Laws of Robotics and author Dave Eggers’ bid to re-imagine the Universal Declaration of Human Rights are worthy of mention (Sample, 2019; Funnell, 2020; Eggers, 2018). At the next level, progressive community organisations play a strongly facilitative role. While some, such as the Oxford Internet Institute and the University of Toronto’s Citizen Lab, are located overseas, Australia also happens to be well-resourced in this area. For example, the Australia Institute hosts the Centre for Responsible Technology, which published The Public Square Project (Guiao & Lewis, 2021). The report usefully identifies a number of vital themes and strategies for creating and extending public digital infrastructure. Similarly, a related organisation known as Digital Rights Watch also speaks for civil society by, for example, seeking a ban on facial recognition systems and the ‘microtargeting’ of individuals for political or commercial gain. Both organisations have active campaigns underway in relation to such matters and are easily located online. Finally, we have noted that government agencies have not been idle. There is recent, highly relevant proof that Australian citizens and organisations have the active support of powerful digital defence capabilities at the national level to moderate digital crime and cyber-aggression. The Australian Human Rights Commission has likewise been active, as its final and substantial report to the government, Human Rights and Technology, clearly demonstrates (Santow, 2021). In summary, while such contributions may be far from the public mind at any particular time, they are each vital players in the fight to ‘delete dystopia.’
Other, perhaps less obvious, factors may also serve to focus and undergird these efforts. For example, one of the most serious charges to be laid against the Internet Oligarchs, their supporters, investors and other interested parties is that in pursuit of unlimited self-interest they have worked to sustain an environment characterised by stress, conflict and confusion when what the times call for are clarity, integrity and far-sighted care. Yet at present few seem to be explicitly aware that none of these over-confident, over-powerful entities possesses anything remotely like a social licence for the intensive extractive and merchandising procedures they have undertaken, or for the many unauthorised uses to which this stolen ‘behavioural surplus’ has been put, to say nothing of those who divert high-tech equipment and expertise to support openly criminal enterprises. A case in point is the way that Mexican drug cartels are reported to have purchased high-tech spyware from their country’s own police force (Schillis-Gallego & Lakhani, 2020). In principle, therefore, democratic agencies have every right to strip them of their illegitimately acquired dominance and power. There is certainly a huge task of institutional innovation and ‘back-filling’ to accomplish first. Ironically enough, some parts of the necessary institutional infrastructure do not need to be re-created from scratch. It may be recalled that back in 1972 an Office of Technology Assessment (OTA) was established to advise the US Congress on the ‘complex scientific and technical issues of the late 20th Century.’ By 1995 it had produced studies on a wide range of topics including ‘acid rain, health care, climate change and polygraphs.’ It was highly successful and widely emulated, yet it was abolished in 1995 by a newly elected Congress that claimed it was ‘unnecessary’ (Wikipedia, 2015). The point is that, prior to the emergence of the IT revolution and the development of surveillance capitalism, prevailing political elites in the US chose to eliminate this core institutional capability, leaving the nation (and world) ever more vulnerable to the unanticipated costs of high-tech innovation (and, as we now know to our cost, to entirely foreseeable events such as global pandemics). Almost three decades on, Institutions of Foresight (IoFs) remain uncommon. Very few nations have a high-quality foresight capability installed at the national level to advise governments on issues such as those discussed here. But this could change fast if what has been learned from previous iterations were taken up and consistently applied.
In the absence of high-quality scanning, foresight and technology assessment, societies remain profoundly vulnerable to a wide variety of future hazards. These obviously include further high-impact technological innovations and their accompanying disruptions. This is particularly the case with poorer and less developed nations such as the Pacific Island states which, at the time of writing, were about to be connected to the internet by high-speed undersea cable. Needless to say, scant preparation for the ensuing social and cultural impacts had been carried out (Higginbothom, 2020). This particular example is a reminder that there are still few or no effective, non-commercial ‘filters,’ ‘barriers’ or ‘testing/proving grounds’ through which new technologies and applications are required to pass prior to implementation.
The steady rise of Artificial Intelligence (AI) is among the most serious issues of concern, especially when united with new generations of high-tech weapons (Chan, 2019). Google’s DeepMind project generates headlines each time it makes new discoveries, but as the property of a vast private company it raises far more questions than it answers. For example, a 2020 Guardian editorial noted that ‘Only 25% of AI papers publish their code. DeepMind, say experts, regularly does not.’ Lanier goes so far as to suggest that AI should be seen less as a technology than as an ideology: ‘The core of the ideology is that a suite of technologies, designed by a small technical elite, can and should become autonomous from and eventually replace, rather than complement, not just individual humans but much of humanity’ (Lanier, 2020). Similar issues also proliferate in the open market as consumer electronics become more complex and powerful. Apple, for example, has been working to develop its ‘consumer smart glasses’ without reference to any substantive external foresight evaluation. These devices are intended to be worn like regular glasses but add a visible layer of digital information known as augmented reality (AR). While this may sound useful, it raises profound questions not merely about data access, privacy, regulation and so on, but about the kind of ‘cyborg’ society that would result. If, as suggested here, current IT frameworks and installations are frequently pernicious and defective, we need ways of enquiring at the social level whether such devices have any legitimate place at all in our lives, let alone those of our children.
AR glasses would not be free-standing. They would become one of countless devices engaged in what is being called ‘world scraping’: the constant recording and uploading of information on more or less everything. It was described by one IT developer as ‘a big tech dream – and a privacy activist’s nightmare.’ He added that:
Smart glasses turn people into walking CCTV cameras, and the data a company could gather from that is mindboggling. Every time someone went to a supermarket, their smart glasses would be recording pricing data, stock levels and browsing habits; every time they opened a newspaper, their glasses would know which stories they read, which adverts they looked at and which pictures they lingered on (Hern, 2020).
In this context the need for more appropriate values, enhanced worldviews and a new sense of reality and purpose is paramount. New institutions and institutional settings are required to provide the means by which societies can refresh their view of the past, present and possible futures. The hard questions are indeed right there in plain sight. How, for example, can a society ‘find its bearings’ without putting in place learning contexts in which the broad issues of history, the constitution of the present and the span of possible future options can be freely examined and discussed? How can any social entity make considered choices about its present commitments and aspirations for the future without access to high-quality, dedicated foresight capabilities and services? How can anyone gain a critical purchase on existing and new technologies without the embodied social capacity to do so? It takes years of effort and application to produce highly trained people who can qualify as pathfinders and guides to the chaos ahead. None of these things can happen until societies wake up to the existential predicament that humanity has created for itself. But there are distinct signs of hope. The ‘pushback’ against the Internet as a medium of extraction, exploitation and abuse has already progressed from a few lonely voices to a growing chorus of dissent. If the means can rapidly be put in place to invest in state-backed, cooperatively owned and operated social media, the Oligarchs can be retired from history. They will become redundant as the character and functions of IT shift from one cultural universe (invasion, dispossession and exploitation) to another (the respectful fulfilment of authentic needs).