Vital Foresight: open preview, final seven days
(A copy of the latest newsletter from London Futurists)
Over the last two weeks, I’ve had the pleasure of receiving a number of very helpful pieces of feedback on the draft content of my forthcoming book, Vital Foresight: The Case for Active Transhumanism.
That feedback has helped me to make improvements throughout the seventeen chapters of the book.
I’ve decided to keep the prepublication preview files open for another seven days, so that members and friends of London Futurists can dip into any parts of the material that catch their interest – and (if inspired) add comments or suggestions into the files.
The draft content exists in a series of Google Doc files. The starting point is here. To read the full content of any of the subsequent chapters, all you need to do is click on a link to request access, and wait (hopefully not too long) until I’ve noticed your request and added your name to the reading list.
As well as feedback, I’m also interested at this time in short endorsements, which I can include, along with the endorser’s name, inside the book, on its back cover, and on websites promoting the book. If you particularly like something you read in the preview material, and would like to encourage a wider readership for it, please don’t hesitate to let me know.
If you’re wondering whether you’ll find the content useful, here’s my summary of twenty key topics covered in Vital Foresight that I don’t think you’ll find anywhere else:
(1) “A little foresight is a dangerous thing” – why many exercises in predicting the future end up making the future worse, rather than better.
(2) Insights from examples of seemingly bad foresight – what we can learn from looking more closely at past mis-forecasts of famines, plagues, climate change, fast progress with AI, war and peace, and terrorism.
(3) The eleven “landmines” (and “meta-landmines”) that pose the most threat of extensive damage to human civilisation. And how to avoid detonating any of them.
(4) “Shortsight” – The eight ways in which evolution has prepared us poorly to anticipate, evaluate, and steer the existential risks and existential opportunities that now confront us.
(5) “A little learning about disruption is a dangerous thing” – what most sets of recommendations get badly wrong when advocating disruption, exponentials, moonshots, and “accelerating returns”.
(6) “Surprise anticipation” – seven principles for managing the inevitable contingencies of any large transformation project.
(7) The design and use of canary signals, illustrated via the eleven landmines.
(8) “Hedgehogs, good, bad, and vital” – the importance, but also the danger, of having a single-minded vision for what the future can bring.
(9) What past sceptics of the potential for the Internet and distributed computing can teach us about the potential of the technologies of the fourth industrial revolution.
(10) “Technology overhang” – the special significance of inventions or breakthroughs that turn out to be surprisingly fruitful. And why they complicate foresight.
(11) The multiple interconnections between the ‘N’, ‘B’, ‘I’, and ‘C’ quadrants of the NBIC convergence that is driving the fourth industrial revolution.
(12) Fifteen ways in which AI could change substantially over the next 5-10 years – even before AI reaches the level of AGI.
(13) Why the “superlongevity”, “superintelligence”, and “superhappiness” aspirations of transhumanism need to be supplemented with “superdemocracy” and “supernarrative”.
(14) Eight areas of the “transhumanist shadow” – attitudes and practices of people associated with the transhumanist movement that (rightly) attract criticism.
(15) “Thirteen core transhumanist values” that underpin what I describe as “active transhumanism”, as a counter to the tendencies in the transhumanist shadow, and as the means to steer humanity toward the truly better future that lies within our grasp.
(16) Sixteen criticisms of transhumanism that are unfair or confused, but which are worth exploring, since they enable a richer understanding of the issues and opportunities for transhumanism.
(17) The applications of active transhumanism in both politics and geopolitics.
(18) Six ways in which today’s educational systems await profound upgrades – and a proposed “vital syllabus” with twenty-one areas covering the skills everyone will need in the 2020s and beyond.
(19) Examples of different kinds of potential forthcoming technological singularity, beyond simply the advent of AGI.
(20) “The Singularity Principles” – twenty-one principles intended to provide the basis for practical policy recommendations, to guide society away from the risks of a radically negative encounter with emergent technology and toward the likelihood of a radically positive encounter.
That might make my book sound like a collection of checklists. But you’ll find that there are plenty of discursive narratives in the book too. I hope you’ll enjoy reading them.
For more information about the preview, click here.
// David W. Wood - Chair, London Futurists
PS Until Vital Foresight has been published, I’m taking a break from hosting gatherings of London Futurists. But watch out for more news soon.