Can a robot be my boss?


Support request:

I am troubled to hear that law enforcement agencies are increasingly using robots to neutralize threats, conduct surveillance, and handle hostage situations. I have seen RoboCop many times, but I am uneasy about machines making weighty, life-and-death decisions, especially given how often actual human officers have abused their authority. Do I have any kind of moral obligation to obey a police robot?

SUSPECT

Dear Suspect,

Hollywood has not been particularly optimistic about robots in positions of authority. RoboCop is just one example of a wide-ranging sci-fi canon that dwells on the tragic consequences of entrusting nuanced, life-and-death tasks to machines: robots that follow their prime directives with a literalism that can turn fatal, that can kill a person but be stumped by a set of stairs. The message of these films is clear: rigid automatons are incapable of the improvised solutions and moral nuance that moments of crisis demand.

It may have been this stereotype that led Boston Dynamics, whose robots have been adopted by some police departments, to release a video last December of its models dancing to the 1962 hit “Do You Love Me?” Maybe you saw it? The robots include Atlas, a humanoid that resembles a deconstructed storm trooper, and Spot, which served as inspiration for the killer dogbots of the Black Mirror episode “Metalhead.” The video seems designed to allay fears about such machines, and what could be more endearing to the public than a display of their agility? What better test of nimbleness than dancing, a skill considered so distinctively human that we invented a dance move, “the robot,” to mock the automaton’s inability to pull it off? It is difficult to watch the machines shimmy, shake, and twist without seeing them as animate, embodied creatures capable of the same flexibility and sensitivity as ourselves.

Never mind that Spot’s joints can crush a finger, or that police robots have already been used to apply deadly force. One way to answer your question, Suspect, without any appeal to moral philosophy, is to consider the practical consequences. If you plan, like most of us, to stay alive and healthy, then yes, you must obey the police robot.

But I take it that your question is not merely practical, and I agree that it is important to consider the trade-offs of offloading policing duties onto machines. As it happens, the Boston Dynamics video was posted at the end of 2020 as a way of celebrating the hope of a happier year ahead. A week later, insurrectionists stormed the Capitol, and images of them meeting little resistance from a thin line of police officers spread widely, images that were pointedly contrasted on social media with the harsh crackdowns on the Black Lives Matter protests of the previous summer.

At a time when many police departments are facing a crisis of authority over racial violence, the strongest argument in favor of robotic policing is that machines have no innate capacity for prejudice. To a robot, a person is a person, regardless of skin color, gender, or anything else. As the White House noted in its 2016 report on algorithms and civil rights, new technologies “have the potential to help law enforcement make decisions based on risk factors and variables rather than flawed human instincts and prejudices.”

Of course, if current policing technology is any indication, things are not so simple. Predictive policing algorithms, which are used to identify high-risk individuals and neighborhoods, are notoriously prone to bias, which the roboticist Ayanna Howard has called “the original sin of AI.” Because these tools rely on historical data (past convictions, previous arrests), they end up targeting the very communities that were unjustly targeted in the first place, reinforcing structural racism. Automated predictions can become self-fulfilling, locking certain neighborhoods into patterns of heightened scrutiny. (Officers dispatched to a location flagged as ripe for crime are primed to find it.) These tools, in other words, do not so much neutralize bias as formalize it, allowing existing social inequities to be perpetuated seamlessly and mechanically. As Kevin Macnish, a professor of digital ethics, notes, the values of an algorithm’s creators “are frozen into the code, effectively institutionalizing those values.”
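To see how that feedback loop works in the abstract, here is a minimal, hypothetical Python sketch. The neighborhoods, starting counts, and allocation rule are invented for illustration and are not drawn from any real predictive-policing system.

```python
# Toy model of the self-fulfilling prediction described above (illustrative only).
# Two neighborhoods, "A" and "B", have the same underlying rate of offending,
# but "A" starts with more recorded arrests because it was policed more heavily
# in the past. Each year the "predictive" step flags the neighborhood with the
# most arrests on record, patrols follow the flag, and only patrolled areas
# generate new arrest data, so the model's output keeps confirming its input.

TRUE_OFFENSES_PER_YEAR = 50   # identical in both neighborhoods
DETECTION_RATE = 0.2          # fraction of offenses recorded where patrols go

recorded_arrests = {"A": 30, "B": 20}  # skewed historical data

for year in range(1, 6):
    # "Prediction": flag the neighborhood with the most arrests on record.
    hot_spot = max(recorded_arrests, key=recorded_arrests.get)
    # Enforcement follows the prediction; only the flagged area produces new data.
    recorded_arrests[hot_spot] += int(TRUE_OFFENSES_PER_YEAR * DETECTION_RATE)
    print(f"year {year}: patrols sent to {hot_spot}; data: {recorded_arrests}")
```

Running the sketch, "A" is flagged every year and its recorded arrests keep climbing, so the growing gap looks like evidence that the original targeting was justified, even though true offending in the two neighborhoods never differed.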


