
To Reward and Beyond: The Neuroscience of UX [II]

To date, user experience (UX) researchers and product designers have relied on user feedback to measure how individuals feel about products. For example, one study had users journal every day, for four days, about their experience with Amazon’s Alexa (Lopatovska et al., 2018), and another analyzed online reviews of Alexa to see how individuals personify the agent (Purington et al., 2017). A more recent study claims that Amazon’s Alexa provides companionship to individuals with disabilities; its methods included filtering Amazon’s reviews for keywords such as ‘disabilities’ and ‘companionship’ as well as interviewing individuals who have disabilities and use Alexa (Ramadan et al., 2021). While these studies are interesting, they offer little insight into how individuals respond to a product in real time, or how it influences their behavior. In the previous part of this article, I suggested that user research can be improved by directly measuring the neural reactions of product users. I also discussed how activation of the neural pathway involved in reward and pleasure, the same pathway recruited by drug addiction, sex, and various other desirable activities, mediates user engagement with a product.

Reward appears to be the mechanism responsible for retention on social media and some gaming platforms. However, other products require more than the activation of reward mechanisms to encourage usage. For example, what encourages us to ask Google Home for the weather or have Alexa play Spotify? These agents are designed to be perceived as friendly assistants, and companies are beginning to expand into robotic assistants for the home. Amazon recently released a video previewing the Amazon Astro, designed to be a ‘personal robot helper.’ The video depicts a cute robot that becomes part of a family by helping with chores and even delivering a beer while they watch television. Beyond that, companies are beginning to explore the possibility of replacing caregivers with robots. The neural mechanisms most likely driving engagement with these products are those of social connection and affinity.




When we hug a loved one, laugh at a colleague's joke, or even gaze into the eyes of a pet dog, a hormone known as oxytocin is released. Oxytocin is commonly known as the ‘love hormone’ and is a classic biomarker for human affinity. Oxytocin is also released in response to stress, helping to regulate it, and is positively correlated with cortisol, a hormone related to stress. A recent study conducted by Nirit Geva and her team at Ben-Gurion University examined oxytocin levels in the saliva of volunteers, as well as their reported levels of pain and emotion, while they were administered moderately painful heat stimuli. The volunteers were randomly assigned to one of three groups: one group was instructed to pet PARO, a robotic seal, while the painful stimuli were administered; a second group had PARO present in the room but did not interact with it; and a control group received the painful stimuli without PARO present at all.


PARO is a furry robot that looks like a seal; it also ‘cries’ and encourages the user to care for it. PARO was originally designed as a therapeutic tool for individuals with dementia. Saliva samples were taken at several points during the study to compare oxytocin changes against baseline. Both PARO groups (those who petted PARO and those who simply had it in the room) reported a significant decrease in pain perception, with the strongest effect in the group that touched PARO. Interestingly, oxytocin levels decreased significantly in the group that physically touched PARO, but not in the control group. The researchers suggest that this reflects an overall reduction in stress, with less oxytocin released as a result. These novel findings show the potential value of robots as therapy tools, and this research is the first of its kind to measure a hormonal response in human-robot interaction.
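To make this kind of comparison concrete, here is a minimal sketch of how change-from-baseline oxytocin might be compared across the three conditions. This is not the authors' actual analysis pipeline, and every number in it is hypothetical, made up purely for illustration:

```python
# Minimal sketch, not the study's actual analysis; all values are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical change-from-baseline salivary oxytocin per participant (pg/mL).
touch_paro   = np.array([-4.1, -3.8, -5.0, -2.9, -4.4])  # petted PARO
paro_present = np.array([-1.2, -0.8, -2.0, -1.5, -0.9])  # PARO in the room only
control      = np.array([ 0.3, -0.5,  0.1,  0.6, -0.2])  # no PARO

# One-way ANOVA: does the change from baseline differ across conditions?
f_stat, p_value = stats.f_oneway(touch_paro, paro_present, control)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Follow-up pairwise comparison, e.g. touch group vs. control.
t_stat, p_pair = stats.ttest_ind(touch_paro, control)
print(f"touch vs. control: t = {t_stat:.2f}, p = {p_pair:.4f}")
```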


Studies have also shown that watching a robot display emotions activates neural mechanisms similar to those activated when watching another human, and that brain areas correlated with empathy show similar activation when viewing a robot being treated violently as when viewing a human (Rosenthal-von der Pütten et al., 2013). These findings suggest that humans have social reactions to robots similar to those they have to fellow humans, although the neural responses to robots are weaker. Additionally, a recent study examined human physiological responses to making eye contact with a robot and found heart rate and galvanic skin conductance responses similar to those elicited by eye contact with a human (Kiilavuori et al., 2021).


To better understand how individuals respond to products meant to encourage sociability and companionship, UX researchers and product designers should examine biomarkers of affinity. Based on existing research, they can draw on neural correlates, hormones such as oxytocin, and physiological markers like heart rate and galvanic skin response. These data can provide faster and richer insight into product usage than self-reported feedback. Companies can also benefit from the implications of such studies: for example, a robotic seal that cries and needs attention is far less appealing than one that reduces pain and stress when petted.
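As an illustration of what this could look like in practice, below is a minimal sketch of how a researcher might compare physiological markers recorded during a product interaction against a resting baseline. The data, column names, and the composite index at the end are all assumptions made up for illustration; the index is not a validated measure of affinity:

```python
# Minimal sketch with hypothetical data; the composite index below is
# illustrative only, not a validated measure.
import pandas as pd

# Hypothetical per-session averages: heart rate (bpm) and galvanic skin
# response (microsiemens), at rest and while interacting with the product.
sessions = pd.DataFrame({
    "participant":     ["p1", "p2", "p3", "p4"],
    "hr_baseline":     [72.0, 68.0, 75.0, 70.0],
    "hr_interaction":  [76.0, 70.0, 80.0, 73.0],
    "gsr_baseline":    [2.1, 1.8, 2.4, 2.0],
    "gsr_interaction": [2.9, 2.2, 3.1, 2.6],
})

# Change from baseline for each marker.
sessions["hr_delta"] = sessions["hr_interaction"] - sessions["hr_baseline"]
sessions["gsr_delta"] = sessions["gsr_interaction"] - sessions["gsr_baseline"]

# Toy composite: z-score each delta across participants and average them,
# so both markers contribute on a comparable scale.
for col in ["hr_delta", "gsr_delta"]:
    sessions[col + "_z"] = (sessions[col] - sessions[col].mean()) / sessions[col].std()
sessions["response_index"] = sessions[["hr_delta_z", "gsr_delta_z"]].mean(axis=1)

print(sessions[["participant", "hr_delta", "gsr_delta", "response_index"]])
```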

Providing neural reward and evoking social affinity keep us coming back to products, but in many cases even that is not enough. With the proliferation of interactive technology, companies are competing for individuals’ attention, and attention is key to a successful product when it comes to applications, web platforms, and gaming products. In the next part, we will discuss attention and how it can be measured to optimize product design.


 

References

  • Geva, N., Uzefovsky, F., & Levy-Tzedek, S. (2020). Touching the social robot PARO reduces pain perception and salivary oxytocin levels. Scientific Reports, 10, 9814. https://doi.org/10.1038/s41598-020-66982-y

  • Kiilavuori, H., Sariola, V., Peltola, M. J., & Hietanen, J. K. (2021). Making eye contact with a robot: Psychophysiological responses to eye contact with a human and with a humanoid robot. Biological Psychology, 158, 107989. https://doi.org/10.1016/j.biopsycho.2020.107989

  • Lopatovska, I., & Williams, H. (2018, March). Personification of the Amazon Alexa: BFF or a mindless companion. In Proceedings of the 2018 Conference on Human Information Interaction & Retrieval (pp. 265-268).

  • Purington, A., Taft, J. G., Sannon, S., Bazarova, N. N., & Taylor, S. H. (2017, May). "Alexa is my new BFF": Social roles, user satisfaction, and personification of the Amazon Echo. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 2853-2859).

  • Ramadan, Z., Farah, M. F., & El Essrawi, L. (2021). From Amazon.com to Amazon.love: How Alexa is redefining companionship and interdependence for people with special needs. Psychology & Marketing, 38(4), 596-609.

  • Rosenthal-von der Pütten, A. M., Schulte, F. P., Eimler, S. C., Hoffmann, L., Sobieraj, S., Maderwald, S., ... & Brand, M. (2013, March). Neural correlates of empathy towards robots. In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 215-216). IEEE.

 

Natania is an M.A. student in Cognitive Affective Neuroscience at Bar-Ilan University.

Her thesis focuses on Human-Robot Interaction, and she is interested in finding better methods to measure human responses to social robots. She doesn't really have free time because she is a mom, but in those rare quiet moments you can find her reading. Her favorite author is John Steinbeck.
