Lucius Vinicius-Filho, Nelson Zagalo, Oksana Tymoshchuk, Rita Oliveira, Sâmia Mouzinho-Machado
pp. 257–268
(https://doi.org/10.55612/s-5002-067-012)
Abstract
Many eye-tracking studies exclude individuals with special needs to avoid distorting results, and few address the design of eye-tracker applications. The aim of this study was to identify key challenges in developing eye-tracker applications and to explore techniques for addressing them. An integrative literature review was conducted across Scopus and Web of Science, analysing 26 records and evaluating 10 Tobii Dynavox games as grey literature. The results reveal device limitations, such as inaccuracy, the Midas Touch problem, and eye fatigue, that must be considered in interface design. Key recommendations for implementation include: (1) the Snap Clutch and MAGIC frameworks; (2) dwell times; (3) designated rest areas; (4) arrow-flanked and Messenger text visualization; and (5) three game design paradigms. In conclusion, these methods contribute positively to eye-tracker game design and demonstrate how applications can adapt to the obstacles encountered. Future work should explore the effectiveness of combining these solutions and expand the evaluation of eye-tracker games as grey literature.
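To make recommendation (2) concrete, the sketch below shows dwell-time selection, a standard mitigation for the Midas Touch problem mentioned in the abstract: a gaze target is activated only after the gaze rests on it beyond a threshold. This is a minimal, hypothetical Python illustration; the 0.8 s threshold, the target identifiers, and the class and method names are assumptions for demonstration, not code from the reviewed works.

```python
import time
from typing import Optional


class DwellSelector:
    """Fire a selection only after gaze rests on one target long enough.

    Minimal sketch: the threshold value and target ids are illustrative
    assumptions, not taken from the reviewed literature.
    """

    def __init__(self, threshold_s: float = 0.8) -> None:
        self.threshold_s = threshold_s        # dwell time; tuned per user in practice
        self._target: Optional[str] = None    # target currently under gaze
        self._dwell_start = 0.0               # when the current fixation began

    def update(self, target: Optional[str], now: Optional[float] = None) -> Optional[str]:
        """Feed one hit-tested gaze sample; return the target id if a selection fired."""
        now = time.monotonic() if now is None else now
        if target != self._target:
            # Gaze moved to a different target (or off all targets): restart the timer.
            self._target = target
            self._dwell_start = now
            return None
        if target is not None and now - self._dwell_start >= self.threshold_s:
            # Rearm so the same fixation does not trigger repeated selections.
            self._dwell_start = now
            return target
        return None
```

In use, each incoming gaze sample would first be hit-tested against on-screen targets, and `update` would then debounce selection. The debouncing is what separates looking from clicking (the Midas Touch problem), and making the threshold adjustable per user is one way to trade selection speed against the eye fatigue the review identifies.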
Keywords: Eye-tracker, design, methods, application, digital games, multimedia, human-computer interaction
References
1. Punde, P. A., Jadhav, M. E., & Manza, R. R. (2017, October). A study of eye-tracking technology and its applications. In 2017 1st International Conference on Intelligent Systems and Information Management (ICISIM) (pp. 86-90). IEEE. https://doi.org/10.1109/ICISIM.2017.8122153.
2. Isokoski, P., Joos, M., Spakov, O., & Martin, B. (2009). Gaze controlled games. Universal Access in the Information Society, 8, 323-337. https://doi.org/10.1007/s10209-009-0146-3.
3. Compañ-Rosique, P., Molina-Carmona, R., Gallego-Durán, F., Satorre-Cuerda, R., Villagrá-Arnedo, C., & Llorens-Largo, F. (2019). A guide for making video games accessible to users with cerebral palsy. Universal Access in the Information Society, 18, 565-581. https://doi.org/10.1007/s10209-019-00679-6.
4. Novák, J. Š., Masner, J., Benda, P., Šimek, P., & Merunka, V. (2024). Eye-tracking, Usability, and User Experience: A Systematic Review. International Journal of Human–Computer Interaction. https://doi.org/10.1080/10447318.2023.2221600.
5. Vinicius-Filho, L., Tymoshchuk, O., Oliveira, R., Moreno, D., & Oliveira, L. (2025, in press). Play with Music: Developing an Inclusive Digital Music Game for Children with Cerebral Palsy. In World Conference on Qualitative Research. Cham: Springer Nature Switzerland.
6. Adelphi University Libraries. (2025, January 2). LibGuides: Systematic Reviews/Evidence Synthesis: Integrative Reviews. Retrieved from https://libguides.adelphi.edu/Systematic_Reviews/integrative-review.
7. Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., … Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71.
8. Tobii Dynavox Global. (2025a). Tobii Dynavox Global. Retrieved from https://www.tobiidynavox.com.
9. Tobii Eye-tracker 5 | Next Generation of Head and Eye-tracking. (2025). Retrieved from https://gaming.tobii.com/product/eye-tracker-5.
10. Tobii Dynavox US. (2025). TD I-Series | eye-tracking-enabled SGD | world’s #1 eye-tracker. Retrieved from https://us.tobiidynavox.com/pages/td-i-series.
11. Tobii Dynavox Global. (2025b). TD Control. Retrieved from https://www.tobiidynavox.com/products/td-control.
12. Molina, A. I., Arroyo, Y., Lacave, C., Redondo, M. A., Bravo, C., & Ortega, M. (2024). Eye-tracking-based evaluation of accessible and usable interactive systems: tool set of guidelines and methodological issues. Universal Access in the Information Society, 1-24. https://doi.org/10.1007/s10209-023-01083-x.
13. Tobii Dynavox Global. (2025c). Accessible games. Retrieved from https://www.tobiidynavox.com/collections/games/eye-control+eye-control?sort_by=manual.
14. Hyrskykari, A. (2006). Utilizing eye movements: Overcoming inaccuracy while tracking the focus of attention during reading. Computers in Human Behavior, 22(4), 657-671. https://doi.org/10.1016/j.chb.2005.12.013.
15. Jian, Y. C., Chen, M. L., & Ko, H. W. (2013). Context Effects in Processing of Chinese Academic Words: An Eye‐Tracking Investigation. Reading Research Quarterly, 48(4), 403-413. https://doi.org/10.1002/rrq.56.
16. Gowases, T. (2007). Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master’s thesis. University of Joensuu, Finland.
17. Sundstedt, V. (2012). Gazing at games: An introduction to eye-tracking control (Vol. 14). Morgan & Claypool Publishers. https://doi.org/10.1007/978-3-031-79552-7.
18. Jacob, R. J. (1995). Eye-tracking in advanced interface design. In Virtual environments and advanced interface design (pp. 258-288). Oxford University Press. https://doi.org/10.1093/oso/9780195075557.003.0015.
19. Špakov, O., & Miniotas, D. (2005, August). EyeChess: A tutorial for endgames with gaze-controlled pieces. In European Conference on Eye Movements (pp. 14-18).
20. Donmez, M., & Cagiltay, K. (2024). Eye training games for children with low vision. Educational Technology & Society, 27(4), 406-416.
21. Fatehi, B., Harteveld, C., & Holmgård, C. (2022, June). Guiding Game Design Decisions via Eye-Tracking: An Indie Game Case Study. In 2022 Symposium on Eye-tracking Research and Applications (pp. 1-7). https://doi.org/10.1145/3517031.3529613.
22. Bissoli, A., Lavino-Junior, D., Sime, M., Encarnação, L., & Bastos-Filho, T. (2019). A human–machine interface based on eye-tracking for controlling and monitoring a smart home using the internet of things. Sensors, 19(4), 859. https://doi.org/10.3390/s19040859.
23. Chen, Z., & Shi, B. E. (2019). Using variable dwell time to accelerate gaze-based web browsing with two-step selection. International Journal of Human–Computer Interaction, 35(3), 240-255. https://doi.org/10.1080/10447318.2018.1452351.
24. Suryakusuma, D. A., Wicaksono, A., Hartanto, R., & Wibirama, S. (2022, September). Implementing design thinking in the development of digital signage user interface for public health education. In 2022 8th International Conference on Science and Technology (ICST) (Vol. 1, pp. 1-6). IEEE. https://doi.org/10.1109/ICST56971.2022.10136306.
25. Tantisatirapong, S., & Phothisonothai, M. (2018, September). Design of user-friendly virtual Thai keyboard based on eye-tracking controlled system. In 2018 18th International Symposium on Communications and Information Technologies (ISCIT) (pp. 359-362). IEEE. https://doi.org/10.1109/ISCIT.2018.8587965.
26. Nguyen, M. H., Ngo, T. D., Hung, N. B., Mao, C. V., Kieu, H. D., & Le, T. H. (2023). On-screen keyboard controlled by gaze for Vietnamese people with amyotrophic lateral sclerosis. Technology and Disability, 35(1), 53-65. https://doi.org/10.3233/TAD-220391.
27. Ramirez Gomez, A., & Lankes, M. (2019). Towards designing diegetic gaze in games: The use of gaze roles and metaphors. Multimodal Technologies and Interaction, 3(4), 65. https://doi.org/10.3390/mti3040065.
28. Smith, J. D., & Graham, T. N. (2006, June). Use of eye movements for video game control. In Proceedings of the 2006 ACM SIGCHI international conference on Advances in computer entertainment technology (pp. 20-es). https://doi.org/10.1145/1178823.1178847.
29. Istance, H., Bates, R., Hyrskykari, A., & Vickers, S. (2008, March). Snap clutch, a moded approach to solving the Midas touch problem. In Proceedings of the 2008 symposium on Eye-tracking research & applications (pp. 221-228). https://doi.org/10.1145/1344471.1344523.
30. Vickers, S., Istance, H., & Smalley, M. (2010). EyeGuitar: Making rhythm based music video games accessible using only eye movements. Association for Computing Machinery. https://doi.org/10.1145/1971630.1971641.
31. Zhai, S., Morimoto, C., & Ihde, S. (1999, May). Manual and gaze input cascaded (MAGIC) pointing. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 246-253). https://doi.org/10.1145/302979.303053.
32. Nacke, L. E., Stellmach, S., Sasse, D., Niesenhaus, J., & Dachselt, R. (2011). LAIF: A logging and interaction framework for gaze-based interfaces in virtual entertainment environments. Entertainment Computing, 2(4), 265-273. https://doi.org/10.1016/j.entcom.2010.09.004.