The city of San Francisco is debating a “dystopian” ordinance that would give police robots the authority to kill anybody who poses an immediate threat to civilians or police personnel.
The San Francisco Police Department has drafted a policy governing its fleet of 17 remote-controlled robots, which are mainly used for bomb disposal and for surveillance in locations inaccessible to human officers.
Robots may not use lethal force unless “the risk of loss of life to members of the public or police is imminent and outweighs any other force option available to SFPD,” as the policy puts it.
The policy would also authorize police use of robots for a broad range of purposes, including training and simulations, apprehending suspects, responding to emergencies, executing warrants, and inspecting equipment.
In an interview with The Verge, SFPD Officer Eve Laokwansathitaya said that deadly force is permitted only when necessary to protect law enforcement personnel or members of the public from a suspect.
In Laokwansathitaya’s words, “SFPD does not have a specific plan in place” for using deadly force with robots, because such instances would be so rare and unusual.
According to Officer Robert Rueca, who spoke with Mission Local, the SFPD has never used a robot to attack anyone.
The San Francisco Board of Supervisors has been reviewing the proposal for weeks.
Supervisor Aaron Peskin reportedly tried to add the line “Robots must not be used as a Use of Force against any humans” to the draft policy, as reported by Mission Local. A week later, though, the line was struck from the draft.
On November 29, the topic will be on the agenda of a Board of Supervisors meeting in San Francisco.
Under California’s Assembly Bill 481, passed in 2021, law enforcement agencies in the state must keep inventories of military-grade equipment, including drones, mobile command centers, and sound cannons, and city governments now approve or reject the use of that equipment annually.
The Oakland, California police department considered arming its Remotec F5A robots with shotguns, The Intercept revealed last month, but the agency ultimately announced in a Facebook post that it would not add “remote armed vehicles” to the department.
Proponents of the policy insist it adds oversight and transparency to an already militarized police force; critics argue it goes much too far.
Tifanei Moyer, senior staff attorney at the Lawyers’ Committee for Civil Rights of the San Francisco Bay Area, reportedly told Mission Local that the policy is unprecedented and that lawyers and the public should oppose it.
Moyer warned of a “dystopian future” in which police use robots to carry out executions without a trial, jury, or judge.
Jennifer Tu of the American Friends Service Committee has been tracking how such rules are enforced in her community. Tu told Mission Local that policies like this one pave the way for armed robots to become legal, though, as she put it, “a pretty large gulf” separates the threat of harm from actual injury.
Last month, Boston Dynamics and other robotics companies signed an open letter pledging not to weaponize their robots, warning that adding weapons to remotely operated or autonomous robots raises new risks of harm and serious ethical issues.