Society

Technology that “empowers the individual” can threaten all of us

December 5, 2021

Whenever I hear about a new technology that “empowers the individual,” I know that one thing is likely to be true about it: It will soon (if not already) be turned to negative and harmful ends. And yet, we as a society keep falling for the line that every new technology will give us more control over our lives and make us somehow happier, more connected, safer and more powerful (but only in a good way).

It’s true that practically any technology can be turned toward harmful ends; we haven’t banned knives because they are used both to cut food and kill people. But it is the scale of damage that can be done by an individual that is changing.

Newspaper columnist Molly Ivins used to joke that she was not anti-gun, but pro-knife. In a 1993 column she wrote:

In the first place, you have to catch up with someone in order to stab him. A general substitution of knives for guns would promote physical fitness. We’d turn into a whole nation of great runners. Plus, knives don’t ricochet. And people are seldom killed while cleaning their knives.

Ivins was getting at the increased scale of damage that can be done by, say, automatic weapons versus a knife.

Guns have been around for centuries and have been made more lethal over time. But their lethality may someday soon seem quaint given the future of “empowerment” that awaits us.

I start with unmanned aerial vehicles, more familiar to us as drones. Their initial use case was actually as toys: remote-controlled model airplanes for which there remains robust demand among hobbyists. How innocent all that seems compared to the killer drones now deployed by militaries around the world! Hardly a week goes by without a report about what is called a “drone strike.” Last week was no exception.

As terrible as the power to kill individuals or groups remotely from thousands of miles away seems, more terrible is the evolution of military drones toward autonomous attacks without any contemporaneous human supervision. The ultimate expression of this evolution comes from a chilling short video seemingly depicting a sales presentation for so-called “slaughterbots,” cheap, small-scale, artificial intelligence (AI)-enabled drones that can be programmed to seek out and kill specific individuals and large groups. The video is, of course, fictional, but not science fiction, according to Stuart Russell, a computer science professor at the University of California, Berkeley, who helped create it. The technology to make this type of drone a reality is already available.

To take things one step further, imagine that a private individual not affiliated with any military or police organization wants to kill a rival, a spouse, or people of a race or ethnicity he doesn’t like. He may soon be able to outfit a drone to carry out the mayhem for him while never getting near the site of the murder (or mass murder, as the case may be).

The combined lethality of drones and AI, along with the ubiquity of cheap drones, may turn what is also a convenient way to deliver goods, rescue people, take aerial photos or send aid to remote areas into a cause for constant surveillance of everyone, just to make sure they don’t launch their own private drone strikes.

Another example of “empowerment,” now portrayed as a “hobby,” could lead to catastrophic consequences, intentional or unintentional. For some time, genetic engineering kits have been available online for do-it-yourselfers. But what exactly will people do with them? Making yeast glow, as one kit allows, seems harmless.

The technology, however, could certainly be adapted to other forms of life; viruses come to mind. Playing around with viruses for fun could get tricky, and no one is selling hobby kits for that (none that I can find, anyway). But if a virus hobbyist kit ever arrives, it will be hard to distinguish ahead of time those simply trying to be entertaining from those hoping to be dangerous. The ability to make dangerous designer viruses has been around for a while now, and the consequences could be civilization-destroying. What advances in the life sciences could be worth risking that? The question is almost never asked.

“The technology and economics of large-scale DNA synthesis have driven the cost of gene synthesis down approximately 250-fold in just 10 years,” according to this 2018 research article. As it becomes even cheaper to engage in what is called synthetic biology, more people will have access to it—and not necessarily well-intentioned ones.

Our lust for the new and the advanced is multiplying the systemic and catastrophic risks we face as a global society. In my view, if we as a species want to survive the century, we must do the unthinkable: Abandon technologies that pose the risk of systemic ruin, or at the very least severely restrict and monitor their use. One group is calling for a global ban on autonomous weapons. Regulating synthetic biology will be tricky, as laid out in this piece.

Finally, it is important to realize that threats from various novel technologies do not stand in isolation. These threats can be combined to increase the danger. I’m imagining drones that simultaneously deliver a lethal designer virus to a group of cities targeted by an adversary, whether a country or a non-state group. You can bet that someone else is imagining that, too!

Photo: Dax delivery robot in Corvallis, Oregon (2018). By Lizzythetech. Via Wikimedia Commons https://commons.wikimedia.org/wiki/File:Dax_robot_with_dog.jpg

Kurt Cobb

Kurt Cobb is a freelance writer and communications consultant who writes frequently about energy and environment. His work has appeared in The Christian Science Monitor, Common Dreams, Le Monde Diplomatique, Oilprice.com, OilVoice, TalkMarkets, Investing.com, Business Insider and many other places. He is the author of an oil-themed novel entitled Prelude and has a widely followed blog called Resource Insights. He is currently a fellow of the Arthur Morgan Institute for Community Solutions.