Nonverbal Behavior of Service Robots in Social Interactions – A Survey on Recent Studies

Janika Leoste, Kristel Marmor, Mati Heidmets
pp. 164–192, DOI: https://doi.org/10.55612/s-5002-061-006

Submitted on 31 Jan 2024 - Accepted on 30 Jun 2024

Interaction Design and Architecture(s) IxD&A Journal
Issue N. 61, Summer 2024

Abstract

This study presents a literature review focused on nonverbal communication in human-robot interaction (HRI) involving service robots with social capabilities. We aim to identify the types of robots and the nonverbal communication cues examined in the reviewed studies, as well as their main research objectives, participant characteristics, data collection methods, and primary findings. To achieve this, we searched the Web of Science (WoS), Scopus, and EBSCO databases for studies on the use of nonverbal cues by both humans and robots during HRI. The results obtained from 39 relevant open access academic papers published between 2006 and 2023 suggest that the quality of communication between humans and service robots still needs improvement, and that several aspects require more thorough exploration, particularly those needed to strengthen robot self-efficacy, trust, and trustworthiness in HRI and to overcome cultural differences. The results emphasize the importance of nonverbal communication in shaping the dynamics of interactions between humans and service robots.

Keywords:

CRediT author statement: Janika Leoste: Conceptualization, Methodology, Validation, Formal analysis, Investigation, Resources, Data Curation, Writing - Original Draft, Writing - Review & Editing, Visualization, Supervision, Project administration. Kristel Marmor: Conceptualization, Formal analysis, Investigation, Data Curation, Writing - Original Draft, Writing - Review & Editing. Mati Heidmets: Conceptualization, Validation, Writing - Original Draft, Writing - Review & Editing, Supervision.

Cite this article as:
Leoste J., Marmor K., Heidmets M.: Nonverbal Behavior of Service Robots in Social Interactions – A Survey on Recent Studies, Interaction Design & Architecture(s) – IxD&A Journal, N.61, 2024, pp. 164–192, DOI: https://doi.org/10.55612/s-5002-061-006
