Human operator could hear and see the participants and interact with them
Measured electrodermal activity (skin conductance) via sensors
Deception was impossible to detect if subjects thought the avatar was computer controlled
Automated deception detection is said to be one of the biggest security challenges of the century.
To overcome this obstacle, researchers have developed an 'automated interview system' - a virtual interrogator.
It can conduct interviews and determine deviations in people's physiology and behaviour via sensors.
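To illustrate the general idea, the sketch below flags sensor readings that deviate sharply from a resting baseline using a simple z-score threshold. This is a minimal, hypothetical example of deviation detection, not the study's actual method; the function name, threshold, and sample values are assumptions for illustration only.

```python
# Illustrative sketch (not the study's actual algorithm): flag
# skin-conductance readings taken during questioning that deviate
# sharply from a resting baseline, via a z-score threshold.

def flag_deviations(baseline, readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations above the baseline mean."""
    mean = sum(baseline) / len(baseline)
    var = sum((x - mean) ** 2 for x in baseline) / len(baseline)
    std = var ** 0.5 or 1e-9  # guard against a perfectly flat baseline
    return [i for i, x in enumerate(readings)
            if (x - mean) / std > threshold]

baseline = [2.1, 2.0, 2.2, 2.1, 2.0]   # resting readings (microsiemens)
readings = [2.1, 2.3, 3.5, 2.2, 3.8]   # readings during questioning
print(flag_deviations(baseline, readings))
```

A real system would of course combine many such signals over time rather than thresholding a single channel.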
However, researchers found the machine was only effective if subjects believed the system was being controlled by a human, not a computer.
Psychologists at the University of Twente set out to investigate 'whether perceiving an automated interview system as operated by a human or computer, influences cues to deception in the form of increased sympathetic nervous system activity (SNS)', reads the study published in Frontiers in Psychology.
The team recruited 79 subjects for the study, who were asked to take over the work of a transport sector worker who had called in sick.
While performing tasks, the participants were coerced into committing fraud by signing contracts they were not qualified to sign.
They were then interviewed about their transgression by a human-like on-screen avatar called 'Brad'.
'When confronted with human-like avatars, people generally are uncertain to which degree the avatar really directly represents the actions and thoughts of the person controlling the avatar,' said researchers.
Researchers found it was impossible to tell whether subjects were telling the truth or lying if they believed Brad was computer controlled.
But those who believed it was human controlled showed a clear difference.