Moral choices nag like a sore tooth. Where do you stand? Dr. Jack Poulson, Google scientist, took his stand. And we all benefit by reflecting on his choice.
With the rise of AI we are facing a new kind of arms race. The New York Times reported on Tuesday (10/8/2018) that Dr. Jack Poulson had quit his job as a Google scientist on a point of moral principle: he refused to continue his work on the Chinese search engine code-named “Dragonfly” because it would enhance Chinese government censorship of the Internet.
The Times article quotes an excerpt from a Google company email: “We won’t and shouldn’t provide 100 percent transparency to every Googler, to respect our commitments to customer confidentiality and giving our product teams the freedom to innovate.” In other words, Chinese Googlers don’t need to know about Project Dragonfly, nor do US Googlers need to know about Project Maven. Project Maven is a Google project to build artificial intelligence for the Defense Department to target drone strikes.
Dr. Poulson’s sacrifice is indisputably a Gandhi-like act of courageous principle. His question confronts each of us: Should I use my talents to empower a government to restrict access to information, to watch my every thought and action?
But isn’t the cat already out of the bag? Doesn’t Big Tech already know everything about us? Haven’t we already given tacit approval to Amazon, Microsoft, Google, Facebook, ad nauseam, for the total invasion of our privacy? The federal government is a laggard in this process. Is Project Maven pure evil, or can better-targeted drone strikes save innocent lives?
The cartoon I selected refers to the arms race of the Cold War. Many of the scientists working on the Manhattan Project suffered terrible pangs of conscience over having participated in the creation of so destructive a weapon.
No individual could have prevented the creation of atomic weapons. Not even if Oppenheimer backed out. Similarly, no one will stop Dragonfly. No one will stop total surveillance. Every technology unfolds its potential and must play itself out.
I believe our moral choices are more subtle, and more difficult. Our consciences must decide on how we, as a society, and a world, will use the technology. So far, we haven’t blown up the planet. And, for the most part, we are still free.