New Delhi: Like humans, robots can lie and deceive. A study released on Thursday shows how emerging technologies like generative AI can be used to manipulate users.
The research by the team from George Mason University in the US aimed to explore “an understudied aspect of robot ethics” to understand mistrust towards emerging technologies and their developers.
To determine whether people could tolerate robot lying, the team asked nearly 500 participants to rank and explain different forms of robot deception.
“I think we should be concerned about any technology that is able to hide the true nature of its capabilities, because this could lead to users being manipulated by that technology in ways that neither the users (nor perhaps the developers) intended,” said lead author Andrés Rosero, a doctoral candidate at the university.
“We have already seen examples of companies using web design principles and AI chatbots in ways designed to manipulate users towards a certain action. We need regulation to protect ourselves from these harmful deceptions,” he added.
The findings, published in the journal 'Frontiers in Robotics and AI', showed that robots can deceive humans in three ways – 'external state deception', 'hidden state deception' and 'superficial state deception'.
The scenarios featured robots working in medical care, cleaning and retail settings, each illustrating one type of deception: a care robot that lied about the world beyond itself (external state deception), a housecleaning robot with an undisclosed camera (hidden state deception), and a robot working in a shop (superficial state deception).
Participants were asked to respond to three questions about each scenario: whether they approved of the robot's behavior, how deceptive it was, and whether it could be justified. Most participants rejected hidden state deception, which they rated as the most deceptive.
They also disapproved of superficial state deceptions, in which the robot pretended to feel pain. Participants tended to blame these deceptions, especially hidden state deceptions, on the robots' developers or owners.
Rosero cautioned that the study needs to be replicated with experiments that better capture real-life responses, as the survey on its own does not provide solid evidence of how people would actually react.