Adherence to the declaration would prohibit researchers from working on robots that conduct search-and-rescue operations, or in the new field of “social robotics”. One of Dr. Bethel’s research projects involves developing technology that would use small, human-like robots to interview children who have been abused, sexually assaulted, trafficked or otherwise traumatized. In one of her recent studies, 250 children and teenagers who were interviewed about bullying were often willing to confide information to a robot that they would not reveal to an adult.
Having an investigator “guide” a robot from another room could thus produce less painful and more informative interviews with child survivors, said Dr. Bethel, who is a trained forensic interviewer.
“You have to understand the problem space before you can talk about robotics and police work,” she said. “They are making a lot of generalizations without a lot of information.”
Dr. Crawford is among the signatories of both “No Justice, No Robots” and the open letter from Black in Computing. “And you know, whenever something like this happens, or awareness is raised, especially in the community where I work, I try to make sure I support it,” she said.
Dr. Jenkins declined to sign the “No Justice” statement. “I thought it was worth considering,” he said. “But in the end, I thought the bigger problem was, really, the representation in the room – in the research lab, in the classroom, on the development team and the executive board.” Ethical discussions should be rooted in that first, fundamental question of civil rights, he said.
Dr. Howard did not sign either statement. She reiterated that biased algorithms are, in part, the result of the skewed demographic – white, male, able-bodied – that designs and tests the software.
“If outside people who have ethical values aren’t working with these law enforcement agencies, then who is?” she said. “When you say ‘no,’ others will say ‘yes.’ It’s not okay if there’s no one in the room saying, ‘Um, I don’t believe the robot should kill.’”