Evaluating the Impact of FoLA2 on Learning Analytics Knowledge Creation and Acceptance during Multidisciplinary, Co-Design of Learning Activities

Marcel Schmitz, Maren Scheffel, Roger Bemelmans, Hendrik Drachsler
pp. 9–33
(https://doi.org/10.55612/s-5002-055-001)

Abstract
Learning analytics offers opportunities to enhance the design of learning activities by providing information on the impact of different learning designs. Despite the availability of design methods that aim to facilitate the integration of learning analytics in learning design, there is a lack of research evaluating their effectiveness. This study aims to assess the effectiveness of the FoLA2 method. Sixty participants utilized the FoLA2 method to create fourteen learning activities in higher education settings. To measure the impact, participants completed a technology acceptance test both before and after each session. Additionally, the researchers analyzed audio recordings of the sessions using epistemic network analysis to gain insights into the discussions surrounding learning analytics and the design of enriched learning activities. The results of both the technology acceptance test and the epistemic network analysis indicated that the FoLA2 method effectively supports the integration of learning analytics during the design of learning activities.

Keywords: Learning Analytics, Learning Design, Technology Acceptance Model, Epistemic Network Analysis, co-creation, learning activities.

References

[1] N. Law and L. Liang, “A multilevel framework and method for learning analytics integrated learning design,” Journal of Learning Analytics, vol. 7, no. 3, pp. 98-117, 2020.
https://doi.org/10.18608/jla.2020.73.8

[2] G. Pishtari, M. J. Rodríguez-Triana, E. M. Sarmiento-Márquez, M. Pérez-Sanagustín, A. Ruiz-Calleja, P. Santos, L. P. Prieto, S. Serrano-Iglesias and T. Väljataga, “Learning design and learning analytics in mobile and ubiquitous learning: A systematic review,” British Journal of Educational Technology, vol. 51, no. 4, pp. 1078-1100, 2020.
https://doi.org/10.1111/bjet.12944
[3] S. K. Banihashem, O. Noroozi, S. van Ginkel, L. P. Macfadyen and H. J. A. Biemans, “A systematic review of the role of learning analytics in enhancing feedback practices in higher education,” Educational Research Review, vol. 37, p. 100489, 2022.
https://doi.org/10.1016/j.edurev.2022.100489
[4] Q. Nguyen, B. Rienties and D. Whitelock, “Informing learning design in online education using learning analytics of student engagement,” Open World Learning: Research, Innovation and the Challenges of High-Quality Education, pp. 189-207, 2022.
https://doi.org/10.4324/9781003177098-17
[5] V. Kelt, R. Briers, T. Britton, M. B. Brown and K. Brook, “Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics,” Computers and Education, vol. 5, p. 2456246, 2022.
[6] L. P. Macfadyen, L. Lockyer and B. Rienties, “Learning Design and Learning Analytics: Snapshot 2020,” Journal of Learning Analytics, vol. 7, pp. 6-12, December 2020.
https://doi.org/10.18608/jla.2020.73.2
[7] Q. Nguyen, B. Rienties and D. Whitelock, “A Mixed-Method Study of How Instructors Design for Learning in Online and Distance Education,” Journal of Learning Analytics, vol. 7, p. 64-78, 2020. https://doi.org/10.18608/jla.2020.73.6
[8] M. R. Gruber, Designing for Great Teaching with Learning Design Cards.
[9] C. P. Alvarez, R. Martinez-Maldonado and S. B. Shum, “LA-DECK: A card-based learning analytics co-design tool,” in Proceedings of the tenth international conference on learning analytics & knowledge, 2020. https://doi.org/10.1145/3375462.3375476
[10] Y. Vezzoli, M. Mavrikis and A. Vasalou, “Inspiration cards workshops with primary teachers in the early co-design stages of learning analytics,” in Proceedings of the Tenth international conference on learning analytics & knowledge, 2020.
https://doi.org/10.1145/3375462.3375537
[11] H. Plattner, “An Introduction to Design Thinking: Process Guide,” Institute of Design at Stanford, Stanford, CA, 2010.
[12] R. Koper, “An introduction to learning design,” in Learning Design, R. Koper and C. Tattersall, Eds., Springer, 2005, pp. 3-20. https://doi.org/10.1007/3-540-27360-3_1
[13] R. Koper, “Current research in learning design,” Educational Technology & Society, vol. 9, no. 1, pp. 13-22, 2006. https://www.jstor.org/stable/10.2307/jeductechsoci.9.1.13
[14] W. Greller and H. Drachsler, “Translating learning into numbers: A generic framework for learning analytics,” Journal of Educational Technology & Society, vol. 15, no. 3, pp. 42-57, 2012.
[15] H. Drachsler, Towards Highly Informative Learning Analytics, Open Universiteit, 2023. ISBN 978-94-6469-372-0. Available online: https://bit.ly/HILA_Drachsler
[16] L. P. Macfadyen, L. Lockyer and B. Rienties, “Learning design and learning analytics: Snapshot 2020,” Journal of Learning Analytics, vol. 7, no. 3, pp. 6-12, 2020. https://doi.org/10.18608/jla.2020.73.2
[17] Y.-S. Tsai, P. M. Moreno-Marcos, I. Jivet, M. Scheffel, K. Tammets, K. Kollom and D. Gašević, “The SHEILA framework: Informing institutional strategies and policy processes of learning analytics,” Journal of Learning Analytics, vol. 5, no. 3, pp. 5-20, 2018. https://doi.org/10.18608/jla.2018.53.2
[18] M. Schmitz, M. Scheffel, R. Bemelmans and H. Drachsler, “FoLA2 – A Method for Co-Creating Learning Analytics-Supported Learning Design,” Journal of Learning Analytics, vol. 9, no. 2, pp. 265-281, 2022. https://doi.org/10.18608/jla.2022.7643
[19] M. Scheffel, M. Schmitz, J. van Hooijdonk, E. van Limbeek, C. Kockelkoren, D. Joppe and H. Drachsler, “The design cycle for education (DC4E),” in DELFI 2021, 2021.
[20] B. Rienties, C. Herodotou, T. Olney, M. Schencks and A. Boroowa, “Making sense of learning analytics dashboards: A technology acceptance perspective of 95 teachers,” International Review of Research in Open and Distributed Learning, vol. 19, 2018.
https://doi.org/10.19173/irrodl.v19i5.3493
[21] L. Ali, M. Asadi, D. Gašević, J. Jovanović and M. Hatala, “Factors influencing beliefs for adoption of a learning analytics tool: An empirical study,” Computers & Education, vol. 62, p. 130-148, 2013. https://doi.org/10.1016/j.compedu.2012.10.023
[22] M. Scheffel, K. Niemann and I. Jivet, The Evaluation Framework for Learning Analytics, Open Universiteit, Heerlen, The Netherlands, 2017.
[23] A. Mavroudi, S. Papadakis and I. Ioannou, “Teachers’ Views Regarding Learning Analytics Usage Based on the Technology Acceptance Model,” TechTrends, p. 1-10, 2021. https://doi.org/10.1007/s11528-020-00580-7
[24] V. Venkatesh, J. Y. L. Thong and X. Xu, “Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology,” MIS quarterly, p. 157-178, 2012. https://doi.org/10.2307/41410412
[25] V. Venkatesh, M. G. Morris, G. B. Davis and F. D. Davis, “User acceptance of information technology: Toward a unified view,” MIS quarterly, p. 425-478, 2003.
https://doi.org/10.2307/30036540
[26] K. Tamilmani, N. P. Rana, S. F. Wamba and R. Dwivedi, “The extended Unified Theory of Acceptance and Use of Technology (UTAUT2): A systematic literature review and theory evaluation,” International Journal of Information Management, vol. 57, p. 102269, 2021.
https://doi.org/10.1016/j.ijinfomgt.2020.102269
[27] T. Zhou, Y. Lu and B. Wang, “Integrating TTF and UTAUT to explain mobile banking user adoption,” Computers in human behavior, vol. 26, p. 760-767, 2010.
https://doi.org/10.1016/j.chb.2010.01.013
[28] P. R. Warshaw and F. D. Davis, “Disentangling behavioral intention and behavioral expectation,” Journal of experimental social psychology, vol. 21, p. 213-228, 1985.
https://doi.org/10.1016/0022-1031(85)90017-4
[29] A. F. Agudo-Peregrina, A. Hernández-Garcı́a and F. J. Pascual-Miguel, “Behavioral intention, use behavior and the acceptance of electronic learning systems: Differences between higher education and lifelong learning,” Computers in Human Behavior, vol. 34, p. 301-314, 2014. https://doi.org/10.1016/j.chb.2013.10.035
[30] S. Elo and H. Kyngäs, “The qualitative content analysis process,” Journal of Advanced Nursing, vol. 62, no. 1, pp. 107-115, 2008. https://doi.org/10.1111/j.1365-2648.2007.04569.x
[31] H. F. Hsieh and S. E. Shannon, “Three approaches to qualitative content analysis,” Qualitative Health Research, vol. 15, no. 9, pp. 1277-1288, 2005.
https://doi.org/10.1177/1049732305276687
[32] V. Braun and V. Clarke, “Using thematic analysis in psychology,” Qualitative Research in Psychology, vol. 3, no. 2, pp. 77-101, 2006. https://doi.org/10.1191/1478088706qp063oa
[33] G. Guest, K. M. MacQueen and E. E. Namey, Applied Thematic Analysis, Sage Publications, 2011. https://doi.org/10.4135/9781483384436
[34] J. P. Gee, An Introduction to Discourse Analysis: Theory and Method, Routledge, 2014. https://doi.org/10.4324/9781315819679
[35] J. Potter and M. Wetherell, Discourse and Social Psychology: Beyond Attitudes and Behaviour, Sage Publications, 1987.
[36] R. Thornberg, L. Perhamus and K. Charmaz, “Grounded theory,” in Handbook of Research Methods in Early Childhood Education: Research Methodologies, vol. 1, 2014, pp. 405-439.
[37] B. G. Glaser and A. L. Strauss, Discovery of Grounded Theory: Strategies for Qualitative Research, Routledge, 2017. https://doi.org/10.4324/9780203793206
[38] P. ten Have, Doing Conversation Analysis, Sage, 2007.
https://doi.org/10.4135/9781849208895
[39] J. Heritage, J. Sidnell and T. Stivers, The Handbook of Conversation Analysis, 2013.
https://doi.org/10.1002/9781118325001
[40] D. W. Shaffer, Quantitative Ethnography, Lulu.com, 2017.
[41] D. W. Shaffer, W. Collier and A. R. Ruis, “A tutorial on epistemic network analysis: Analyzing the structure of connections in cognitive, social, and interaction data,” Journal of Learning Analytics, vol. 3, p. 9-45, 2016. https://doi.org/10.18608/jla.2016.33.3
[42] D. Shaffer and A. Ruis, “Epistemic network analysis: A worked example of theory-based learning analytics,” Handbook of learning analytics, 2017.
https://doi.org/10.18608/hla17.015
[43] S. Zhang, Q. Liu and Z. Cai, “Exploring primary school teachers’ technological pedagogical content knowledge (TPACK) in online collaborative discourse: An epistemic network analysis,” British Journal of Educational Technology, vol. 50, p. 3437-3455, 2019.
https://doi.org/10.1111/bjet.12751
[44] M. Koehler and P. Mishra, “What is technological pedagogical content knowledge (TPACK)?,” Contemporary issues in technology and teacher education, vol. 9, p. 60-70, 2009.
[45] S. Cox and C. R. Graham, “Using an elaborated model of the TPACK framework to analyze and depict teacher knowledge,” TechTrends, vol. 53, p. 60-69, 2009.
https://doi.org/10.1007/s11528-009-0327-1
[46] D. M. Bressler, A. M. Bodzin, B. Eagan and S. Tabatabai, “Using epistemic network analysis to examine discourse and scientific practice during a collaborative game,” Journal of Science Education and Technology, vol. 28, p. 553-566, 2019.
https://doi.org/10.1007/s10956-019-09786-8
[47] D. W. Shaffer, “Epistemic frames for epistemic games,” Computers & education, vol. 46, p. 223-234, 2006. https://doi.org/10.1016/j.compedu.2005.11.003
[48] J. Hulland, “Use of partial least squares (PLS) in strategic management research: A review of four recent studies,” Strategic management journal, vol. 20, p. 195-204, 1999.
https://doi.org/10.1002/(SICI)1097-0266(199902)20:2<195::AID-SMJ13>3.0.CO;2-7
[49] J. Henseler, C. M. Ringle and M. Sarstedt, “A new criterion for assessing discriminant validity in variance-based structural equation modeling,” Journal of the academy of marketing science, vol. 43, p. 115-135, 2015.
https://doi.org/10.1007/s11747-014-0403-8
[50] R. P. Bagozzi and Y. Yi, “On the evaluation of structural equation models,” Journal of the academy of marketing science, vol. 16, p. 74-94, 1988. https://doi.org/10.1007/BF02723327
[51] C. Fornell and D. F. Larcker, “Evaluating structural equation models with unobservable variables and measurement error,” Journal of marketing research, vol. 18, p. 39-50, 1981.
https://doi.org/10.1177/002224378101800104
[52] D. Gefen, D. Straub and M.-C. Boudreau, “Structural equation modeling and regression: Guidelines for research practice,” Communications of the association for information systems, vol. 4, p. 7, 2000. https://doi.org/10.17705/1CAIS.00407
[53] C. L. Marquart, C. Hinojosa, Z. Swiecki, B. Eagan and D. W. Shaffer, Epistemic Network Analysis (Version 1.7.0) [Software].
[54] A. L. Siebert-Evenstone, G. A. Irgens, W. Collier, Z. Swiecki, A. R. Ruis and D. W. Shaffer, “In search of conversational grain size: Modelling semantic structure using moving stanza windows,” Journal of Learning Analytics, vol. 4, pp. 123-139, 2017.
https://doi.org/10.18608/jla.2017.43.7
[55] G. Arastoopour, N. C. Chesler, D. W. Shaffer and Z. Swiecki, “Epistemic network analysis as a tool for engineering design assessment,” in 2015 ASEE Annual Conference & Exposition, 2015.
[56] S. Sullivan, C. Warner-Hillard, B. Eagan, R. J. Thompson, A. R. Ruis, K. Haines, C. M. Pugh, D. W. Shaffer and H. S. Jung, “Using epistemic network analysis to identify targets for educational interventions in trauma team communication,” Surgery, vol. 163, p. 938- 943, 2018. https://doi.org/10.1016/j.surg.2017.11.009
[57] S. McKenney and Y. Mor, “Supporting teachers in data-informed educational design,” British journal of educational technology, vol. 46, p. 265-279, 2015.
https://doi.org/10.1111/bjet.12262
[58] A. F. Wise, Y. Zhao and S. N. Hausknecht, “Learning analytics for online discussions: Embedded and extracted approaches,” Journal of Learning Analytics, vol. 1, pp. 48-71, 2014. https://doi.org/10.18608/jla.2014.12.4