Tuesday, June 9, 2015

ROBOTIC KILLING: What could possibly go wrong? (Ask a kid)

Massachusetts Peace Action contest winners

Earlier this year, Massachusetts Peace Action sponsored a contest for posters and videos from local students to be used in promoting the April Peace and Planet events in New York City.

The results were impressive. I was particularly struck by this video:

The whole problem with the NPT, summed up in two minutes and thirty-nine seconds!

The lesson for me: if we want someone to help us explain an antiwar message, maybe we should ask a kid.

The movement against drone killing and drone surveillance could benefit from imaginative, no-holds-barred, tech-savvy explorations of the problem. The more people -- especially young people -- we get talking about this, the better!

The problem we're up against is that drones are portrayed as an out-of-sight, out-of-mind alternative to conventional military action, so the general public is happy simply not to worry about them.

Just imagine what would happen if we could ask kids to think about the question . . .

Robotic killing: 
What could possibly go wrong?

How can we interest educators in schools across the country in asking the question: "Robotic killing: What could possibly go wrong?"

What do Stephen Hawking and Elon Musk say? "Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity," they write in an open letter from concerned scientists and researchers. (See Ban Killer Robots Before They Take Over, Stephen Hawking & Elon Musk Say)

Related posts

We can now entrust all the dirty work -- including war -- to robots. (Or can we?)

(See A Modest Proposal: Debate the Drones)

With drones, people become just dots. "Bugs." People who no longer count as people . . . .

(See Drone Victims: Just Dots? Just Dirt?)

The panopticon was a prison design that reversed the old paradigm, in which prisoners were stored away, "out of sight, out of mind," and instead arrayed them so that they could be observed as efficiently as possible by the smallest number of overseers.

(See Drones, 1984, and Foucault's Panopticon)