Boston Dynamics is a robotics company based in Waltham, Massachusetts, that has been developing amazing technologies that use artificial intelligence to perform difficult or dangerous tasks.
As I covered a mere two years ago, the visionary engineers at Boston Dynamics invented a rolling heavy-lifting machine called Handle and a Cheetah robot with a top speed of 28.3 mph developed for DARPA’s Maximum Mobility and Manipulation project. (DARPA, the Defense Advanced Research Projects Agency, is the federal organization responsible for the development of emerging technologies for use by the military.)
The military asked for help with a dog-like pack animal that would never tire from carrying insanely heavy loads and never need feeding or grooming. Dog-like robots began to evolve, some without “heads” as we think of them, but with four powerful backward-jointed legs that can adapt to uneven terrain.
Then there was Atlas, the humanoid robot that was charmingly goofy and clumsy at first, as portrayed in this 2016 “Swearing Robot” test video.
Fast forward to 2019. Now, the 330-pound Atlas is a flippin’ gymnast, too. This hulking mass of metal, plastic, electronics, and cabling can roll a perfect somersault, spring upright, jump and twist 360 degrees (a full circle), and land smartly.
Boston Dynamics has posted a YouTube video that highlights the team’s robotic evolution from 2012 to 2019. The progress is amazing – and sobering.
These AI machines are mighty tools for progress – in the right hands. But, as many of us observers noted years ago, what if Evil Doers reprogrammed these purportedly helpful animalistic robots to attack Us the People?
That’s exactly what may be happening right now. “Spot,” the semi-autonomous robot dog able to twist a knob and open a door, has been deployed by Massachusetts State Police (MSP) officers in at least two incidents. Civil rights advocates are waving big red flags, calling these first Boston Dynamics dog-like robotic K-9 units “terrifying.”
The American Civil Liberties Union (ACLU) became aware of a Facebook post by MSP about an event held on July 30, 2019, promoting the use of robotics in law enforcement operations, and asked for more information about this legally and ethically questionable practice:
“In response, the ACLU of Massachusetts filed a public records request in August 2019 seeking information about plans for, acquisition of, and/or use of robotics by Massachusetts State Police. The request examined the MSP relationship with the following companies: Ghost Robotics, iRobot, Endeavor, and Boston Dynamics.”
Boston Dynamics leased Spot to the MSP bomb squad to test for three months, from August until November 2019, as shown in this contract obtained by the Massachusetts ACLU.
Spot features a 360-degree camera and crash protection, and can withstand tough environments. Its top speed is 3 mph, with a carrying capacity of 31 pounds.
A state police officer said that the bomb squad used Spot as a “mobile remote observation device” to observe suspicious devices and dangerous locations. David Procopio wrote for the MSP:
“Robot technology is a valuable tool for law enforcement because of its ability to provide situational awareness of potentially dangerous environments.”
Michael Perry, Vice President of Business Development at Boston Dynamics, stated that corporate policy opposes weaponizing Spot or any other robot the company builds:
“Part of our early evaluation process with customers is making sure that we’re on the same page for the usage of the robot. So, upfront, we’re very clear with our customers that we don’t want the robot being used in a way that can physically harm somebody.”
But will police departments let the robotics manufacturers dictate how these potent computerized robotic machines may be used? Despite a clause in the Boston Dynamics lease agreement prohibiting Spot from being used to “physically harm or intimidate people,” history has already answered that question with a resounding NO.
It’s been three years since police in Dallas, Texas, made what is thought to be the first-ever robotic kill. In 2016, after five police officers were murdered and seven others wounded, officers deployed a bomb-disposal robot wired with an explosive device on its manipulator arm to kill the suspect.
Dallas police chief David Brown told members of the press how he and his superiors rationalized using a robotic assassin:
“We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was.”
Boston Dynamics started selling Spot in September 2019. The company gave testing partners access to the Spot software development kit (SDK), which lets customers create custom programs that “command poses and velocities, configure payloads, and access robot perception and payload data.”
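To give a feel for what “commanding poses and velocities” through an SDK means in practice, here is a purely hypothetical Python sketch. This is NOT the real Spot SDK: the `VelocityCommand` and `RobotClient` names are invented for illustration, and the speed-limit check simply reuses Spot’s published 3 mph (roughly 1.34 m/s) top speed mentioned above.

```python
from dataclasses import dataclass

# Hypothetical sketch only -- not the real Boston Dynamics Spot SDK.
# It illustrates the general shape of the control surface the article
# describes: a client object that accepts pose/velocity commands.

@dataclass
class VelocityCommand:
    v_x: float    # forward speed, m/s
    v_y: float    # lateral speed, m/s
    v_rot: float  # rotational speed, rad/s

class RobotClient:
    """Toy stand-in for an SDK robot client (invented for this sketch)."""

    # Spot's stated top speed is 3 mph, roughly 1.34 m/s.
    MAX_SPEED_MPS = 1.34

    def __init__(self):
        self.last_command = None

    def send_velocity(self, cmd: VelocityCommand) -> bool:
        # A real SDK would validate commands against the robot's limits;
        # here we just reject anything faster than the published top speed.
        if abs(cmd.v_x) > self.MAX_SPEED_MPS:
            return False
        self.last_command = cmd
        return True

client = RobotClient()
accepted = client.send_velocity(VelocityCommand(v_x=0.5, v_y=0.0, v_rot=0.1))
print(accepted)  # True: 0.5 m/s is within the speed limit
```

The point of the sketch is simply that the SDK puts command-level control in the customer’s hands, which is exactly why the usage restrictions discussed below matter.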
One writer for Vice neatly summed up the conflicting situation regarding AI police animal units as technology marches inexorably onward:
“Soon, the dogs will be able to undo jam jars and tie intricate knots. They will be able to sift flour and change a kitchen garbage bag. They will be able to choke you unconscious with their rigid plastic pincers, as tiny lasers scan your eyes for the final flickers of life extinguishing inside you. They’ll be able to blast an unerring forearm into the softest meat of you and pulse around inside until they find the organ they want, which they will snip out neatly as you bleed beneath them on the floor. They will be able to crush your skull like a Coke can. They’ll be able to flick through a telephone book. They’ll be able to break your nose with a single peck. They’ll be able to strike a match. They’ll be able to slosh your warm body with gas and light the building on fire.”
The clear and present danger here is that police departments are using highly advanced technologies with little or no oversight. Lawmakers need to regulate how robots like Spot can be used to prevent the abuse of our civil liberties.