Google seems aware of
this, at least from a public relations perspective. Likely concerned
about the optics of developing military
hardware, the company recently
donated $5 million to the World Wildlife Fund for drones to track down rhinoceros poachers in Africa. It's as if Google is out to prove that drones are not necessarily evil.
Still, we can't help but
do a bit of evil when we build technology upon technology without
pausing to ask what it's all for.
New technologies give us
the opportunity to reevaluate the systems we have been using up until
now, and consider doing things differently.
But the
stock-market-fueled culture of Silicon Valley too often focuses on
efficiency of execution rather than clarity of purpose.
The result is that our
best Stanford computer science graduates end up writing algorithms that
better extract money from the stock market, rather than exploring
whether capital is even serving its original purpose of getting funds to
new businesses.
Or the engineers behind
Bitcoin develop a brilliant new digital currency without evaluating the
purpose of money in our society. The problem to be addressed is that too
much cash has ended up stuck in the coffers of speculators. Instead
of thinking about how to encourage peer-to-peer transactions, Bitcoin's
developers simply built another speculative currency, only this time on
digital steroids.
Likewise, war is not a
great approach to conflict resolution. Adding robot soldiers to the mix
merely improves the efficiency of killing. How might robots be used to
reduce conflict instead of enacting it?
When we develop
technology in a vacuum, disconnected from the reality in which people
really live, we are too likely to spend our energy designing some
abstract vision of a future life rather than addressing the pains and
injustices around us right now. Technology becomes a way of escaping the
world's problems, whether through virtual reality or massive Silicon
Valley stock options packages, rather than engaging with them.
But the don't-be-evil
mandate doesn't even ask Google's programmers to evaluate the purpose of
a technology -- only to perform a basic "checksum," or error
check, for evil itself. How pathetically binary. This is not enough
for a company that appears dedicated to uploading human consciousness
to the cloud, no matter how many robot warriors we have protecting our
virtual reality servers from the people we leave behind.