Maddy Woodman, the Learning & Development lead for Henley Careers, asks: where is the line between human emotion and artificial intelligence?
The main thing we can all agree on is that the difference between humans and machines is emotion. Robots can’t feel hurt, anger, happiness or joy and don’t care about how they are treated (yet!).
If you haven’t already, get on YouTube and watch the video of Boston Dynamics and their backflipping robot (Atlas) that is programmed to move smoothly, pick itself up if it gets knocked over and continue on with the task at hand.
See how this video makes you feel – does it fascinate you, or is it uncomfortable watching?
When Boston Dynamics released their video of Atlas in action – complete with a man prodding the robot with a hockey stick to push “him” over – there was an outcry from the public. It made people feel deep-rooted empathy, a sense that they were watching a grown man “bully” a “defenceless” robot.
So where is the line between human emotion and artificial intelligence? It’s a lot more blurred than you might think.
Dolby – the film and television sound experts – has developed its own biophysical lab measuring human emotions in order to “hack” Hollywood. Neuroscientists there are studying how audio and video affect the human body. Volunteers are hooked up to wires upon wires measuring their heartbeat, what makes their faces flush, and their specific reactions to movies. On top of this, lie detectors are used to ensure that the feedback given is as accurate as possible. Up and Inside Out are classic examples of films that have been tweaked and improved to evoke as much emotion as possible, based on direct feedback from these studies (and who hasn’t cried at Up?!). Inside Out‘s scene where baby Joy is born was specifically created to evoke a physiological feeling of pain, a reflection of real-life birth. This kind of technological response is already happening, and you won’t even know it.
So how does this translate into the future world of work and recruitment?
Well – on a larger scale, it seems human emotions detected by machines will be up for sale in order to boost user experience. Organisations are now thinking about multi-sensory spaces, and how they can create not only emotional but physiological reactions to entertainment and beyond. Netflix and Hulu are using eye-tracking devices to measure user reactions to their app interfaces. So how long until Careers Services, HR departments and pretty much any other service start to measure reactions in order to create the ultimate experience? Imagine being able to tweak what we do to effectively engage and evoke the best reactions in our students, staff and customers…
It’s already happening in recruitment: Unilever is using HireVue’s AI to scan video interviewees’ emotions in order to detect inherent traits, body language and tone. It seems those days of being able to exaggerate certain parts of your CV are over! Will allowing AI to aid us in making recruitment decisions based on facial recognition decrease unconscious bias? What kind of ethical dilemmas does selling human emotions for user experience, and ultimately profit, bring?
The future is upon us whether we like it or not – through entertainment, service design and technology. You could already be experiencing this blurred line between machine and human – and you wouldn’t even know.
Delve deeper into the future of work at Henley Careers’ World of Work conference on 13 September 2018 – find out more and book your place.
This article was written by Maddy Woodman, Careers Learning Manager for Henley Business School.