I've been carefully reading Richard Danzig's latest paper, Technology Roulette: Managing Loss of Control as Many Militaries Pursue Technological Superiority. I want to put this piece in context - first of all, Richard Danzig is one of the best policy writers and one of the deepest American policy thinkers currently active. Second, this paper is a product of a deeply conservative government reaction to the ascendant Cypherpunk movement and is, in that sense, leading in the wrong direction.
OK, that sounds melodramatic. Let me sum up the paper as follows:
- New branches of science introduce upheaval, and each comes, as a party gift, with a new weapon of mass destruction and a general revolution in how war works.
- We used to get one a century or so, which was possible to adapt to, like a volcano that erupted every so often.
- We built treaties and political theory and tried not to kill everyone on the planet using the magic of advanced diplomacy.
- Now we are getting many new apocalyptic threats at a time:
  - AI
  - 3D printing
  - Drones
  - Cyber war
  - Gene editing techniques
  - Nanotechnology
- The rate of new world-changing tech is INCREASING OVER TIME.
- Our ability to create new international political structures to adapt to new threats appears moribund.
Most legal policy experts look askance at the "libertarian" views of the computer science community, with which they have been thrust into contact like a Japanese commuter on a rush-hour train. But the computer science world is less big-L Libertarian than philosophically Cypherpunkian, tied to the simple belief that the advance of technology is, in sum, always a net positive for human liberty. Where society conflicts with the new technologies available to humanity, society should change instead of trying to restrict the march of technology.
Hence, where government experts are scared of disintermediation, as evidenced by a paranoia over Facebook's electoral reach, the computer world sees instead that newspapers were themselves a centralized control over the human mind, worthy of being consigned to the dustbin of history.
Where the FBI sees a coming crisis in the "Going Dark" saga, it finds exactly no fertile ground in the technology sector, as if the field in which it would plant its ideas had first been salted, and then sent into space on one of Elon's rockets.
The US Government and various NGOs were both surprised and shocked at the unanimity and lack of deference of the technology community with regard to the Wassenaar cyber controls or the additional cryptographic controls the FBI wants. This resistance comes not from a "Libertarian" political stance, but from the deep current of cypherpunkism in the community.
These days, not only do Cypherpunks "write code", to quote Tim May's old maxim, but they also "have data". The pushback around Project Maven can be described in traditional political terms, but it also maps onto a tribal "US vs. THEM" divide.
Examine the conversation around autonomous weapons. Of course an autonomous, armed flying drone swarm can be set to kill anyone in a particular building. This is at least as geographically discriminating as a bomb. Talks to restrict this technology, even at the highest level of principle, have so far restricted only an empty set of current and future solutions.
Part of this is the smaller market power of governments in general when it comes to advanced technology. A selfie drone is essentially 99.999% the same as a militarized drone, and this is now true for everything from the silicon on up; some parts of the US Government have started to realize their sudden weakness.
As Danzig's paper points out, the platitude that a "human in the loop" will keep automated systems under control is clearly false. Likewise, he argues that our addiction to classification hamstrings us when it comes to understanding systemic risk. As he writes:
> The natural tendency within the national security establishment is to minimize the visibility of these issues and to avoid engagement with potentially disruptive outside actors. But this leaves technology initiatives with such a narrow base of support that they are vulnerable to overreaction when accidents or revelations occur. The intelligence agencies should have learned this lesson when they had only weak public support in the face of backlash when their cyber documents and tools were hacked.
But his proposed solution is anything but a solution. We're in a race, and there is no way out of it built around the idea of slowing down technological development.