
OPINION | PHILIP MARTIN: To serve man


We have seen the terrifying robots.

In the Boston Dynamics videos they run and leap and do parkour. They will not be eluded. They will find you. They are quick and indefatigable and merciless. They will one day hunt us.

But not today. Maybe.

Boston Dynamics and five other companies that make advanced mobile robots signed an open letter last year pledging not to "weaponize" their "advanced-mobility general-purpose robots" and not to support others who would do so. "When possible," they write, "we will carefully review our customers' intended applications to avoid potential weaponization. We also pledge to explore the development of technological features that could reduce or mitigate these risks."

Does that sound reassuring?

If it does, consider the carefully worded phrase "advanced-mobility general-purpose robots." So they're only pledging not to arm these specific kinds of robots--ones intended for general purposes (which would include housekeeping chores like folding laundry).

They even make a specific exception for military robots: "To be clear, we are not taking issue with existing technologies that nations and their government agencies use to defend themselves and uphold their laws."

In this light, the pledge feels impotent. It leaves the door open for all kinds of specific-purpose robots, like the one Boston Dynamics sold to the Los Angeles Police Department last year: a souped-up version of its Spot model, a four-legged robot that in some respects suggests an organic dog as stylized by Giacometti.

Spot is about 28 inches high, weighs about 70 pounds, and is equipped with 360-degree cameras, speakers and microphones that feed information in real time to an officer controlling it through a tablet device. The LAPD model has a mechanical arm it can use to open doors and sensors capable of mapping its surroundings in three dimensions.

You can see how this robot--which carries a price tag of about $280,000 (a basic consumer-model Spot runs about $75,000, and there's a waiting list)--could be useful to police. It's a real upgrade over the remote-control devices currently used in hostage-negotiation and bomb-disposal scenarios. You could send Spot in to talk to a barricaded suspect without risking human life. An agile, fast-moving robot would be an obvious asset in an active-shooter situation.

What seems crucial is that the robot is simply acting as the eyes and ears of the officer swiping at the screen. A human conscience makes the moral choices.

But we can also understand that some communities have reason to be suspicious of the police. The New York Police Department acquired its own Spot several years ago and renamed it Digidog. A video emerged of it trotting alongside officers investigating a home invasion, and it was seen at the site of a hostage situation in a public housing project (the robot played no active role in the operation). Public outcry caused the city to cancel its lease with Boston Dynamics and return the device.

Critics like Rep. Alexandria Ocasio-Cortez (D-N.Y.) called it a "robotic surveillance ground drone" that was being "deployed for testing on low-income communities of color with under-resourced schools." Others rebelled at the idea of giving the police another expensive toy. Then-Mayor Bill de Blasio didn't criticize the program directly, but allowed that if the robot was "unsettling to people," the city should "rethink the equation."

So New York got rid of its robot because some people didn't trust the police not to abuse the technology. Which doesn't seem completely unfair, given the track record of the NYPD, but it's also not hard to argue that if the robots were deployed in an ethical fashion, they'd be a great tool for making everyone a little safer. These robots are only as cruel as the cop operating them.

But there is a danger in insulating the actor from the acted-upon; psychologically, it's a lot easier to hurt or kill someone at a remove. In a world where a lot of us have been conditioned by video games, the usual inhibitions against violence might be softened if that violence is delivered via touchscreen. And there's always the fear that these robots will become autonomous and start acting on their own, their decision trees perhaps polluted by the prejudices of their programmers.

I can't guess how close we may be to fulfilling the "RoboCop" prophecy, but I know there are smart people who think it not only possible but inevitable that robots will take over for us in the most dangerous jobs.

When they do, we might need to install some variation of Isaac Asimov's famous Three Laws of Robotics: "A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."

That's a start, so long as we can keep the robots from thinking for themselves, deciding our puny human rules are silly and can be ignored, and seeing clearly that we are the prime source of misery and conflict in the world. That's where the dystopia begins.

Or does it?

An android would not necessarily be prideful; it might not feel the need to demand trivial displays of respect. It would not overreact to epithets or allow its judgment to be colored by superstition or emotion. It would not feel obliged to intimidate, or to react to cracks about its parentage. A Spock-like robot could maintain its calm in the face of chaos.

I'm not sure that in a desperate situation I shouldn't trust its algorithms over the snap decisions and gut responses of your average well-meaning but overworked and under-trained young patrol officer.

Except my heart does prefer that human being, that fierce and trembling spirit inclined to error and overstatement and guilt and doubt. Not because our brains calculate faster and cleaner than the mechanical kind, but because it is possible for them to err on the side of compassion, to occasionally make the right call for the wrong reasons.

So I wouldn't defund the police, but give them more non-lethal tools and release them from jobs for which they are ill-prepared. I would give them robots, because the robots are coming, and ask that the officers remain the robots' masters for as long as they can.


Philip Martin is a columnist and critic for the Arkansas Democrat-Gazette. Email him at pmartin@adgnewsroom.com.

