
Oihane Unciti, Antoni Martínez Ballesté, Ramon Palau
pp. 85–102, DOI: https://doi.org/10.55612/s-5002-060-003

Submitted on 31 Oct 2023 - Accepted on 15 Feb 2024

Interaction Design and Architecture(s) IxD&A Journal
Issue N. 60, Spring 2024

Abstract

This article examines the current state of emotion recognition technology and its practical implementation in educational settings. To that end, it reviews publications from the last 10 years on the advancement of emotion recognition technology in education. Of the 1,347 studies retrieved, 43 were included in the review for analysis and discussion. The review shows that the number of studies has increased in recent years, with a higher frequency in online learning contexts. Furthermore, judged against Technology Readiness Levels, and despite the growing interest in emotion recognition in educational environments, implementation is still far from becoming a reality: most of the research has been conducted from a theoretical perspective, and none of it has been fully developed and deployed in the classroom. In addition, many of the studies analysed have not tested the validity of their findings.

Keywords: Emotion Recognition, Smart Classroom, Educational Environments

CRediT author statement: Oihane Unciti: Investigation, Formal analysis, Writing – original draft preparation. Antoni Martínez Ballesté: Investigation, Formal analysis, Writing – review and editing. Ramon Palau: Formal analysis, Writing – review and editing.

Cite this article as:
Unciti O., Martínez Ballesté A., Palau R.: Real-Time Emotion Recognition and its Effects in a Learning Environment, Interaction Design & Architecture(s) – IxD&A Journal, N. 60, 2024, pp. 85–102, DOI: https://doi.org/10.55612/s-5002-060-003

