Can Social Robots Turn Into a Dispute Negotiator?

We may listen to facts from Siri or Alexa, or directions from Google Maps, but would we let a virtual agent enabled by artificial intelligence help mediate conflict among team members? A new study suggests it might.

The study was presented at the 28th IEEE International Conference on Robot & Human Interactive Communication in New Delhi on Tuesday.

"Our results show that virtual agents and potentially social robots might be a good conflict mediator in all kinds of teams. It will be very interesting to find out the interventions and social responses to ultimately seamlessly integrate virtual agents in human teams to make them perform better," said study lead author Kerstin Haring, Assistant Professor at the University of Denver.

Researchers from the University of Southern California (USC) and the University of Denver created a simulation in which a three-person team was supported by an on-screen virtual agent avatar during a mission that was designed to ensure failure and elicit conflict.

The study was designed to examine whether virtual agents could act as mediators that improve team collaboration when conflict arises.

While some of the researchers had previously found that one-on-one human interactions with a virtual agent therapist led people to disclose more, in this study team members were less likely to engage with the male virtual agent, named 'Chris', when conflict arose.

Participating team members did not physically lash out at the device; rather, they became less engaged and less likely to heed the virtual agent's input once the team began to fail.

The study was conducted in a military academy environment in which 27 scenarios were engineered to test how the team that included a virtual agent would react to failure and the ensuing conflict.

The virtual agent was not ignored by any means.

The study also found that the teams did respond socially to the virtual agent during mission planning, nodding, smiling and acknowledging its input by thanking it, but as the exercise progressed their engagement with the virtual agent decreased.


Source: IANS