How to Compliment a Human – Designing Affective and Well-being Promoting Conversational Things

Ilhan Aslan, Dominik Neu, Daniela Neupert, Stefan Grafberger, Nico Weise, Pascal Pfeil, Maximilian Kuschewski
pp. 157–184 (https://doi.org/10.55612/s-5002-058-007)

Abstract

With today’s technologies it seems easier than ever to augment everyday things with the ability to perceive their environment and to talk to users. Regarding conversational user interfaces, tremendous progress has already been made in designing and evaluating task-oriented conversational interfaces, such as voice assistants for ordering food, booking a flight, etc. However, it is still very challenging to design smart things that can hold an informal conversation and emotional exchange with their users, which requires the smart thing to master social everyday utterances, use irony and sarcasm, deliver good compliments, etc. In this paper, we focus on the experience design of compliments and on the Complimenting Mirror design. The paper reports in detail on three phases of a human-centered design process, including a Wizard of Oz study in the lab with 24 participants to explore and identify the effect of different compliment types on user experience, and a subsequent field study with 105 users in an architecture museum with a fully functional installation of the Complimenting Mirror. In our analyses we argue why and how a “smart” mirror should compliment users, and we provide a theorization applicable to affective interaction design with things more generally. We focus on subjective user feedback, including user concerns and preconceptions about receiving compliments from a thing, and on observations of real user behavior in the field, i.e., transitions of bodily affective expressions comparing affective user states before, during, and after compliment delivery. Our research shows that compliment design matters significantly: by using the right type of compliments in our final design in the field test, we succeeded in eliciting reactive expressions of positive emotions, “sincere” smiles and laughter, even from the seemingly sternest users. We conclude by providing an outlook on our contribution for the new age of large language models and prompt engineering.

Keywords: UX Design, Affective Computing, IoT, Machine Learning Application, Conversational User Interfaces, Prompt Engineering.


References

1. Shusterman, R. (2008). Body consciousness: A philosophy of mindfulness and somaesthetics. Cambridge University Press.
2. Rosa, H. (2016). Resonanz: Eine Soziologie der Weltbeziehung. Suhrkamp Verlag.
3. Dang, C. T., Aslan, I., Lingenfelser, F., Baur, T., & André, E. (2019, October). Towards somaesthetic smarthome designs: exploring potentials and limitations of an affective mirror. In Proceedings of the 9th International Conference on the Internet of Things (pp. 1-8). DOI: https://doi.org/10.1145/3365871.3365893
4. Petrak, B., Weitz, K., Aslan, I., & Andre, E. (2019, October). Let me show you your new home: studying the effect of proxemic-awareness of robots on users’ first impressions. In 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN) (pp. 1-7). IEEE. DOI: 10.1109/RO-MAN46459.2019.8956463
5. Lee, C. P., Cagiltay, B., & Mutlu, B. (2022, April). The Unboxing Experience: Exploration and Design of Initial Interactions Between Children and Social Robots. In CHI Conference on Human Factors in Computing Systems (pp. 1-14). DOI: https://doi.org/10.1145/3491102.3501955
6. Picard, R. W. (2000). Affective computing. MIT press.
7. Turk, M., & Robertson, G. (2000). Perceptual user interfaces (introduction). Communications of the ACM, 43(3), 32-34.
8. Höök, K. (2008, June). Affective loop experiences – what are they? In International Conference on Persuasive Technology (pp. 1-12). Springer, Berlin, Heidelberg.
9. Aslan, I., Seiderer, A., Dang, C. T., Rädler, S., & André, E. (2020, October). PiHearts: Resonating Experiences of Self and Others Enabled by a Tangible Somaesthetic Design. In Proceedings of the 2020 International Conference on Multimodal Interaction (pp. 433-441). DOI: https://doi.org/10.1145/3382507.3418848
10. Tsaknaki, V., Cotton, K., Karpashevich, P., & Sanches, P. (2021, May). “Feeling the Sensor Feeling you”: A Soma Design Exploration on Sensing Non-habitual Breathing. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-16). DOI: https://doi.org/10.1145/3411764.3445628
11. Tsaknaki, V., Balaam, M., Ståhl, A., Sanches, P., Windlin, C., Karpashevich, P., & Höök, K. (2019, June). Teaching soma design. In Proceedings of the 2019 on Designing Interactive Systems Conference (pp. 1237-1249). DOI: https://doi.org/10.1145/3322276.3322327
12. Calvo, R. A., & Peters, D. (2014). Positive computing: technology for wellbeing and human potential. MIT press.
13. McCabe, M. P., & Ricciardelli, L. A. (2004). Body image dissatisfaction among males across the lifespan: A review of past literature. Journal of Psychosomatic Research, 56(6), 675-685. DOI: https://doi.org/10.1016/S0022-3999(03)00129-6
14. Groesz, L. M., Levine, M. P., & Murnen, S. K. (2002). The effect of experimental presentation of thin media images on body satisfaction: A meta-analytic review. International Journal of Eating Disorders, 31(1), 1-16. DOI: https://doi.org/10.1002/eat.10005
15. Lowe-Calverley, E., & Grieve, R. (2018). Self-ie love: Predictors of image editing intentions on Facebook. Telematics and Informatics, 35(1), 186-194. DOI: https://doi.org/10.1016/j.tele.2017.10.011
16. Brown, Z., & Tiggemann, M. (2016). Attractive celebrity and peer images on Instagram: Effect on women’s mood and body image. Body Image, 19, 37-43. DOI: https://doi.org/10.1016/j.bodyim.2016.08.007
17. Sajadieh, S., & Wolfe, H. (2019). Come Hither to Me: Performance of a Seductive Robot. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA ’19), Article INT030. ACM. DOI: https://doi.org/10.1145/3290607.3313287
18. Löwgren, J., & Stolterman, E. (2004). Thoughtful interaction design: A design perspective on information technology. Mit Press.
19. Ju, W. (2015). The design of implicit interactions. Synthesis Lectures on Human-Centered Informatics, 8(2), 1-93. DOI: https://doi.org/10.1007/978-3-031-02210-4
20. Holmes, J. (1986). Compliments and compliment responses in New Zealand English. Anthropological linguistics, 485-508.
21. Pomerantz, A. (1978). Compliment responses: Notes on the co-operation of multiple constraints. In Studies in the organization of conversational interaction (pp. 79-112). Academic Press. DOI: https://doi.org/10.1016/B978-0-12-623550-0.50010-0
22. Kleinke, C. L., Peterson, T. R., & Rutledge, T. R. (1998). Effects of self-generated facial expressions on mood. Journal of personality and social psychology, 74(1), 272. DOI: https://doi.org/10.1037/0022-3514.74.1.272
23. Tsujita, H., & Rekimoto, J. (2011). HappinessCounter: smile-encouraging appliance to increase positive mood. In CHI’11 Extended Abstracts on Human Factors in Computing Systems (pp. 117-126). https://doi.org/10.1145/1979742.1979608
24. Cha, N., Kim, I., Park, M., Kim, A., & Lee, U. (2018, October). HelloBot: Facilitating Social Inclusion with an Interactive Greeting Robot. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers (pp. 21-24). DOI: https://doi.org/10.1145/3267305.3267616
25. Weber, K., Ritschel, H., Aslan, I., Lingenfelser, F., & André, E. (2018, October). How to shape the humor of a robot-social behavior adaptation based on reinforcement learning. In Proceedings of the 20th ACM international conference on multimodal interaction (pp. 154-162). DOI: https://doi.org/10.1145/3242969.3242976
26. Ritschel, H., Aslan, I., Sedlbauer, D., & André, E. Irony Man: Augmenting a Social Robot with the Ability to Use Irony in Multimodal Communication with Humans.
27. Farve, N., & Maes, P. (2016, April). Smile Catcher: Can Game Design Lead to Positive Social Interactions?. In International Conference on Persuasive Technology (pp. 211-218). Springer, Cham.
28. Besserer, D., Bäurle, J., Nikic, A., Honold, F., Schüssel, F., & Weber, M. (2016, November). Fitmirror: a smart mirror for positive affect in everyday user morning routines. In Proceedings of the workshop on multimodal analyses enabling artificial agents in human-machine interaction (pp. 48-55). DOI: https://doi.org/10.1145/3011263.3011265
29. Andreu, Y., Chiarugi, F., Colantonio, S., Giannakakis, G., Giorgi, D., Henriquez, P., … & Tsiknakis, M. (2016). Wize Mirror-a smart, multisensory cardio-metabolic risk monitoring system. Computer Vision and Image Understanding, 148, 3-22. DOI: https://doi.org/10.1016/j.cviu.2016.03.018
30. Blum, T., Kleeberger, V., Bichlmeier, C., & Navab, N. (2012, March). mirracle: An augmented reality magic mirror system for anatomy education. In 2012 IEEE Virtual Reality Workshops (VRW) (pp. 115-116). IEEE. DOI: 10.1109/VR.2012.6180909
31. Bimbo Brasil. (2014, August 6). Friendly Mirror – Nutrella [Video]. Retrieved August 11, 2019, from https://www.youtube.com/watch?v=kodQamtcmus
32. Ikea UK. (2014, October 2). IKEA Motivational Mirror [Video]. Retrieved August 11, 2019, from https://www.youtube.com/watch?v=W30-HQXhB-E
33. Pérez-Marín, D., & Pascual-Nieto, I. (2013). An exploratory study on how children interact with pedagogic conversational agents. Behaviour & Information Technology, 32(9), 955-964. DOI: https://doi.org/10.1080/0144929X.2012.687774
34. Avula, S., Chadwick, G., Arguello, J., & Capra, R. (2018, March). Searchbots: User engagement with chatbots during collaborative search. In Proceedings of the 2018 conference on human information interaction & retrieval (pp. 52-61). DOI: https://doi.org/10.1145/3176349.3176380
35. Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human-Computer Studies, 151, 102630. DOI: https://doi.org/10.1016/j.ijhcs.2021.102630
36. Portela, M., & Granell-Canut, C. (2017, September). A new friend in our Smartphone? Observing Interactions with Chatbots in the search of emotional engagement. In Proceedings of the XVIII International Conference on Human Computer Interaction (pp. 1-7). DOI: https://doi.org/10.1145/3123818.3123826
37. Aslan, I., Xu, F., Uszkoreit, H., Krüger, A., & Steffen, J. (2005, January). COMPASS2008: Multimodal, multilingual and crosslingual interaction for mobile tourist guide applications. In INTETAIN (pp. 3-12).
38. Amiriparian, S., Sokolov, A., Aslan, I., Christ, L., Gerczuk, M., Hübner, T., … & Schuller, B. W. (2021). On the impact of word error rate on acoustic-linguistic speech emotion recognition: An update for the deep learning era. arXiv preprint arXiv:2104.10121. DOI: https://doi.org/10.48550/arXiv.2104.10121
39. Yang, Z., Jing, X., Triantafyllopoulos, A., Song, M., Aslan, I., & Schuller, B. W. (2022). An overview & analysis of sequence-to-sequence emotional voice conversion. arXiv:2203.15873. DOI: https://doi.org/10.48550/arXiv.2203.15873
40. Deibel, D., & Evanhoe, R. (2021). Conversations with Things: UX design for Chat and Voice. Rosenfeld Media.
41. Bommasani, R., Hudson, D. A., Adeli, E., Altman, R., Arora, S., von Arx, S., … & Liang, P. (2021). On the opportunities and risks of foundation models. arXiv preprint arXiv:2108.07258. DOI: https://doi.org/10.48550/arXiv.2108.07258
