BMW engineers were so successful in creating a silent automotive interior
that customers complained: they missed the engine roar and road noise. So
BMW spent hundreds of thousands of dollars to develop an audio algorithm
that generates engine noises to be played through the car's stereo system.
BMW claimed its system accurately replicated engine sounds over the full range of RPMs, operating conditions and speeds.
David Pizarro, an associate professor of psychology at Cornell, cites BMW's expensive
reversal of its initial engineering achievement as an example of what
happens when our intuition and our technology are out of sync. In fact,
Pizarro argues that our social and moral intuitions increasingly fail us
as we are confronted with fast-paced changes in science and
technological innovation. In a lecture at Edge.org,
Pizarro describes how subjects in an experiment on trustworthiness
quickly engaged with a robot called Nexi that had very limited facial
features and movements, and visible wires. The robot, with its
unmistakable mechanical appearance, had been programmed with nonverbal
cues experimentally associated with trustworthiness.
"Within 30 seconds people were actually talking to Nexi as though she were a
human being, in fact saying things that were quite private," Pizarro
said. He added that some participants even thought Nexi was a
technologically advanced talking robot, "when in reality there was a
graduate student behind the curtain, so to speak." Pizarro quoted early
psychological research indicating our social intuitions build in
intentionality and agency, even when they're not there. During a discussion after the lecture, economist Sendhil Mullainathan, recalled stories in Everett Rogers' book Diffusion of Innovation,
describing how people adopt new technologies in ways that are congruent
with older intuitions. When Indian farmers started using tractors, for
example, they went to the tractor every night and put a blanket over it.
We want to kick the vending machine that doesn't deliver the candy bar and
bellow at the computer when Windows delivers the blue screen of death.
We feel bad if a computer game stops playing with us. When we get those
pop-up ads based on an earlier purchase or search, we get a creepy
feeling that someone has been watching us and reading our email. And
that's even when we know about the algorithms that generate personalized
ads. "We don't have intuitions for algorithms," Pizarro said. "As technology
advances, there is no way in which we can rapidly generate new
intuitions. So...when we hear about self-driving cars, we get nervous,
even though we're certain that percentage-wise this would reduce the
number of traffic accidents. It just doesn't feel right." Pizarro fears
some new technologies may be stifled by old intuitions that have
evolved from earlier eras. We could end up making erroneous moral
judgments about technological advances with the potential to cure
diseases and improve lives. By the way, a Car and Driver story by K.W. Colwell explains that BMW is not the only auto manufacturer piping fake engine sounds to drivers.
Pizarro believes we have yet to define what constitutes an error in judgment in
many areas of emerging technology. For instance, he asks, does the
impersonal nature of drones and robots in war constitute an immoral
action? Is the problem the lack of human agency? How does one figure out
acts of omission vs. acts of commission when technical tools are
involved? What about genetically modified humans? The New York Times
reports that with mitochondrial manipulation technology, the nuclear
material can be removed from an egg or an embryo of a woman who has an
inheritable mitochondrial disease and inserted into the healthy egg or
embryo of a donor whose own nuclear material has been discarded. The
resulting child would have the genetic material of three people. The federal Food and Drug Administration is considering the issue.