Briefly Noted

Things Read, Seen or Heard Elsewhere

On second thought, San Francisco decides killer robots aren't a good idea. Crazy that it got so far.

In the "just because you could doesn't mean you should" department, the San Francisco Board of Supervisors reversed course on a proposal that would have given city police the authority to arm remote-controlled robots with lethal explosives to confront dangerous subjects. In other words, to kill them.

In a second round of voting on the matter – occurring just over a week after the first – the Board relented in the face of opposition from civil rights groups and activists. Fortunately, they came to their senses and realized their authorization was a bit dystopian. It's incredible, though, that this got so far.

Via Ars Technica:

Shortly after the initial news broke, a “No Killer Robots” campaign started with the involvement of the Electronic Frontier Foundation, the ACLU, and other civil rights groups. Forty-four community groups signed a letter in opposition to the policy, saying, “There is no basis to believe that robots toting explosives might be an exception to police overuse of deadly force. Using robots that are designed to disarm bombs to instead deliver them is a perfect example of this pattern of escalation, and of the militarization of the police force that concerns so many across the city.”

As the Associated Press reports, Dallas police used the first lethal robot back in 2016, when they armed it with explosives and killed a holed-up sniper.

Over the years we've seen a gross militarization of our police, fueled in part by a Pentagon program that distributes surplus military equipment to local law enforcement.

As Dean Preston, one of the San Francisco supervisors, put it, "There have been more killings at the hands of police than any other year on record nationwide. We should be working on ways to decrease the use of force by local law enforcement, not giving them new tools to kill people."

Thoughts? Ideas? Comments?

Send me a note or reach out on Mastodon.

Date Noted: