    NBC News: Robot ‘Code of Ethics’ Could Move From Sci-Fi to Real Life

    It’s too bad Emily Post didn’t leave instructions for the robot age.

    Today, as soldiers are naming their robots, the elderly are befriending mechanized assistants, and domestic drones are preparing for liftoff, experts say it’s time for a robot “Code of Ethics” that can protect the complex relationships people are developing with the machines.

    Last week, President Obama kicked around a soccer ball with a four-foot humanoid during a visit to Japan, and president and robot avoided any diplomatic snafus.

    But as for regular humans, experts are already on the case. Roboticist Laurel Riek and her colleague at the University of Notre Dame, ethicist Don Howard, proposed establishing a designer’s code of ethics at the We Robot conference on robotics and law this month.

    Their new checklist is a set of “real, down-to-earth practical guidelines” for engineers who build robots, Riek told NBC News, and a first stab at coding etiquette in a human-robot world.

    People who use robots for physical assistance should be able to shut off any video surveillance when they bathe, for example. The strong emotional attachment kids with autism develop towards therapy robots needs to be considered once that therapy ends. And when it comes to self-driving cars and flying drones, laws should clearly state who is at fault when those autonomous machines do damage.

    A general rule like “a robot may not harm a human,” the first of science fiction writer Isaac Asimov’s famous Three Laws of Robotics, is “too vague,” Riek said. Asimov first introduced the rules in his 1942 short story “Runaround,” and they continue to be adopted and updated in all areas of sci-fi entertainment.

    In the real world, however, Riek and Howard point to specific protocols still needed for people who work with robots. The privacy, dignity and emotional needs of those humans should be considered, as well as their safety, the scientists write in “A Code of Ethics for the Human-Robot Interaction Profession.”

    Without a reminder, “I might not secure the hardware and people might hack into my eldercare robot and harm someone,” said Riek.

    “Of course, everyone wants to think about the specter of the Cylon,” says Howard. Thanks to those scheming robots from “Battlestar Galactica,” and scores of others in science fiction, Howard expects that most humans distrust robots in real life.

    On the plus side: If we’re already thinking about robots — sci-fi or otherwise — we may be more willing to embrace a code that governs their design for real life.

    In their draft set of principles, Riek and Howard suggest that builders of robots respect “emotional needs” and “human frailty” when designing systems.

    The vast majority of humanoid robot designs today are modeled after men — Caucasian or Asian men most frequently, they point out. And “gynoids,” or robots modeled after women, tend to look overly feminine. Perhaps it’s also time to talk about designing robot bodies that don’t lean toward one gender or ethnicity, they say.

    Riek designs health care robots. For children with autism and adults with severe mental conditions like dementia, robots have a real therapeutic effect. But overly realistic robots pose a problem for patients whose faculties are impaired.

    “People aren’t sure if they’re interacting with a robot or a person,” Riek said. “If they’re in a state that can’t figure it out, then it’s ethically not OK.”

    A camera-carrying drone outside a bedroom window is everyone’s favorite privacy nightmare. The residents of Deer Trail, Colo., have gone so far as to propose drone “hunting licenses” for gun-wielding citizens worried about an aerial peeping Tom.

    What exactly are the rights of a homeowner, or a drone operator, when it comes to enthusiasts flying their aircraft around the neighborhood?

    “That’s a hard and interesting scenario,” says Michael Froomkin, professor of law at the University of Miami.

    Froomkin and other law experts discussed the prospects for self-defense against robots at the We Robot conference in Miami earlier this month.

    Today, a person on the ground can’t be sure what a drone is up to. But should it be all right to shoot one out of the sky if a resident thinks it’s filming a family barbecue? Can a homeowner take a swing at a neighbor’s pesky drone with a baseball bat if it passes over the property line?

    Simple design requirements could help, Froomkin suggests. For example, drones or robots that aren’t recording with a camera could be required to flash a white light.

    “If you outfit your robot with the right markings or lights — and you’re telling the truth, [the robot] shouldn’t be in any danger,” Froomkin said.
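
    As a rough illustration (not something from Froomkin or the article), the design requirement he describes could boil down to a single rule in a drone’s firmware: tie the flashing white light to the camera’s recording state. The short Python sketch below uses entirely hypothetical names to show that logic.

        # Illustrative sketch only; no real drone API is implied.
        class CameraDrone:
            def __init__(self):
                self.recording = False            # is the camera capturing video?
                self.white_light_flashing = False

            def update_status_light(self):
                # Flash the white light only while the camera is NOT recording,
                # so people on the ground can tell they are not being filmed.
                self.white_light_flashing = not self.recording

        drone = CameraDrone()
        drone.update_status_light()
        print(drone.white_light_flashing)   # True: not recording, light flashes

        drone.recording = True
        drone.update_status_light()
        print(drone.white_light_flashing)   # False: recording, no "not filming" signal

    In this reading, the light is an honest “not filming” signal, which is exactly why Froomkin adds the caveat that the markings only protect the drone if the operator is telling the truth.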

    Kate Darling, a robot ethics researcher at MIT’s Media Lab, agrees that guidelines will help. “Usually we develop a technology and then figure out what to do with it,” she told NBC News. “That is not necessarily the best way to go about it. So I’m happy that people are discussing these issues now.”

    NBC News: www.nbcnews.com/tech/innovation/robot-code-ethics-could-move-sci-fi-real-life-n83201
