There has been a fair bit of buzz lately regarding a number of high-profile players in the technology field calling for a pause in the development of artificial intelligence. Countless influencers and pundits are predicting doom and gloom, whether in the classroom, in dozens of professions facing redundancy, or in some Terminator-style dystopia where machines replace humanity, reducing people to surplus drains on the economy.
It was a viewpoint undoubtedly shared by most of those employed in the manufacture of buggy whips and spinning wheels a scant two centuries ago.
Technology has reshaped the human landscape many times over the millennia, and not always for the better, however much more comfortable it has made life (at least for those living in the first world). That comfort has come with an existential threat to the entire human race and a third ‘great extinction’ of species driven by our interactions with the environment, including accelerated climate change.
The refrigerator made most of the world’s ice block suppliers redundant, the cell phone threatens the livelihood of the landline installer, taxi drivers should Uber-beware, and what the heck is a slide rule? Ballpoint pens, once the bane of Grade 3 cursive teachers, have themselves gone the way of inkwells and pigtails. Times change.
The argument can easily be made that technology is moving too fast for humanity (or any other benighted species with whom we share this planet). But a look at history makes it clear there is no succor on the path of the Luddite (Google it). Like it or not, this is another instance where humanity must adapt or perish.
There is no shortage of those whose preferred course of action is to put their heads in the sand and pretend that change can be ignored, or stalled, or time turned back to some imagined Golden Age. Barring the current conflicts around the globe erupting into nuclear Armageddon and blasting what might remain of humanity back to the stone age, artificial intelligence, and the information (disinformation?) age it heralds, is here to stay. We need to do more than just “get used to it”; our very survival (and much of what remains of the Earth’s other species) depends on humanity “growing up.”
Pretending the predator doesn’t exist does not work for the ostrich in the end. In this case, the predator is us. Unprepared for the nuclear age, humanity has barely dodged atomic annihilation in several well-documented instances. But for the grace of one man pausing to think before pushing his finger down on the button, our time would have long been over. The First World War made global conflict unthinkable, the slaughter of the trenches made it the war to end all wars—then came the Second.
Finally, nuclear weapons and the threat of global annihilation put an end to… oh wait. It is when something becomes utterly unthinkable that it becomes so very possible.
The time is coming, provided we manage not to unleash the fires of nuclear hell upon ourselves, when much of humanity will become redundant, at least as far as physical or mental labour is concerned, and humanity is woefully unprepared for it. Viewed through the lens of history, any call for a few months of hiatus from the development of artificial intelligence becomes little more than a publicity stunt.
Artificial intelligence may be a threat to humanity, but AI is nowhere near as big a threat to us as is the lack of “natural” intelligence. Wisdom is in too short supply these days—but greed has rarely been more ascendant.
Humanity, and our systems of resource distribution, must adapt to the new realities brought about by technology. Humanity’s very existence depends upon it. Or we can stick fingers in our ears, sing ‘lalala,’ and bury our heads in the sand as we have been doing for far too long. The time when we could afford denial is past. It is time to grow up.