
C2C Digital Magazine (Spring / Summer 2021)


Machine data as the source of learning engagement in hands-on learning online

By Yu-Ping Hsu, Western Illinois University
 




Introduction


Instructional technology provides the capacity to address the needs of students with diverse cognitive skills and socialization needs. The learning experience is viewed as an important factor in learner engagement and motivation, and as a contributor to learning in online instruction (Sims, 2003; Swartzwelder & Murphy, 2019; Chan, Wan & Ko, 2019).  Moore (1989) identified three types of learning interaction, student-content, student-student, and student-faculty interaction, and this framework has been used widely in the research literature.  Several studies have demonstrated that well-designed online interactivities can improve students' learning experience (Svihla, 2015; Cain & Lee, 2016; Watkins, 2005; Herrington, Oliver & Reeves, 2003). However, the field has no clear agreement on how these interactivities should be measured to improve the learning experience in online instruction (Ekwunife-Orakwue & Teng, 2014; Walmsley-Smith, Machin & Walton, 2019).  One analytics approach treats tracking data from behavioral and physiological responses (e.g., facial expressions, eye tracking, click-stream data) as evidence of involvement and attentiveness, and therefore as a measure of motivation and engagement.  Physiological response data collected during online instruction can be a reliable source for understanding which online activities enhance the learning experience (Lee & Shapiro, 2019; Lee & DuMont, 2010).  The purpose of this project is to explore how to design learning activities in online hands-on lessons that are effective and engaging, based on facial expressions and physiological responses.

This project designed four popular types of learning activities in the engineering and technology field: a video, a simulation, drill and practice, and a concept map.  The example result demonstrates one female student's learning engagement as she experienced the different online activities, based on physiological responses and facial expressions captured in iMotions.  The project found that the student showed different emotional responses during the different types of learning activities.

Objectives


The main goal of this project is to identify, based on students' emotional responses, the specific online interactivities that help students engage more deeply in online hands-on learning.

Methodology


The literature suggests that engagement increases when online instruction includes interactivities. However, there is no agreement on how learning interactivities should be measured in relation to the learning experience (Ekwunife-Orakwue & Teng, 2014; Walmsley-Smith, Machin & Walton, 2019). To the best of our knowledge, apart from work by Hsu, Meyen, and Lee (2018) and Hemphill (2001), few studies have measured physiological responses to explore how learning interactivities can improve the learning experience.  This project combines students' physiological responses and facial expressions to identify students' learning engagement as they experience different types of online activities in online hands-on lessons. The project attempts to enhance the online learning experience in the area of online interactivity design for hands-on lessons.

The volunteer participant had taken at least one online course in the past and was very familiar with the learning management system.  The student took around 30 minutes to complete all of the online hands-on activities.
       
To analyze the physiological data and facial expressions, we collectively viewed each time-stamped segment and compared the physiological changes during the online interactivities lesson, and the events in the text-image lesson, to the concurrent facial expressions.  In the iMotions biometric system, a total of 19 different Action Units (AUs) are scored for nine basic emotions: anger, sadness, frustration, confusion, joy, surprise, fear, disgust, and contempt (Stöckli, Schulte-Mecklenbeck, Borer, & Samson, 2018).  See Figure 1.  This project evaluated positive and negative emotions to identify the students' learning engagement.
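For readers who want a concrete sense of how this positive/negative coding might be carried out on an exported data file, the following Python sketch collapses per-frame emotion evidence into positive and negative valence totals. The column names and CSV layout are assumptions made for illustration, not the actual iMotions export format.

```python
import pandas as pd

# Emotion channels grouped by valence; surprise is treated as positive
# affect here purely for illustration.
POSITIVE = ["Joy", "Surprise"]
NEGATIVE = ["Anger", "Sadness", "Frustration", "Confusion",
            "Fear", "Disgust", "Contempt"]

def add_valence(frames: pd.DataFrame) -> pd.DataFrame:
    """Add summed positive/negative evidence columns to a time-stamped table."""
    frames = frames.copy()
    frames["positive_evidence"] = frames[POSITIVE].sum(axis=1)
    frames["negative_evidence"] = frames[NEGATIVE].sum(axis=1)
    return frames

# Hypothetical usage (file name and columns are assumptions):
# frames = pd.read_csv("participant_export.csv")  # Timestamp plus one column per emotion
# frames = add_valence(frames)
```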

For this example, we did not analyze the raw data.  Instead, we looked at the level of the peaks in the participant's galvanic skin response (GSR) along with the facial expressions.  The example is shown in Figures 2, 3, 4, and 5.
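As a rough illustration of what inspecting the level of GSR peaks could involve if it were automated, the sketch below locates peaks in a GSR trace with SciPy's find_peaks. The sampling rate and prominence threshold are assumed values, not settings taken from this study.

```python
import numpy as np
from scipy.signal import find_peaks

def gsr_peaks(gsr: np.ndarray, sample_rate_hz: float, min_prominence: float = 0.05):
    """Return peak times (seconds) and prominences for a GSR trace."""
    # The prominence threshold filters out small fluctuations; 0.05 is illustrative.
    peak_idx, props = find_peaks(gsr, prominence=min_prominence)
    return peak_idx / sample_rate_hz, props["prominences"]

# Hypothetical usage with a Shimmer GSR trace sampled at an assumed 128 Hz:
# times_s, heights = gsr_peaks(gsr_trace, sample_rate_hz=128.0)
```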

   
Figure 1.  The Interface of iMotions Platform
 



Data from iMotions


The data from iMotions showed that this participant engaged more with the video and the simulation, and less with the drill and practice and the concept map.  Here we present four selected peaks representing the participant's learning engagement.  The female student wore the Shimmer sensor and interacted with the four learning activities while the built-in webcam detected her facial expressions.  When the iMotions platform received the data accurately, the waveforms were recorded, as shown in Figure 1.

To summarize the levels of the peaks: the student had the highest peak when she watched the video (Figure 2) and the second-highest peak when she interacted with the simulation (Figure 3).  However, when she interacted with the drill and practice (Figure 4) and the concept map (Figure 5), the peaks were lower.

These results show that real-time data can help us understand how a student engages with different learning activities.  Further analysis, involving more than 20 students, is ongoing.
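A simple way to express this comparison across the four activities in code is sketched below: detected peaks are grouped by activity time windows, and the tallest peak in each window is reported. The segment boundaries are hypothetical placeholders; in practice they would come from the session's event markers.

```python
import numpy as np

# Assumed activity windows in seconds from the start of the session;
# real boundaries would come from the lesson's event markers.
SEGMENTS = {
    "video":          (0, 420),
    "simulation":     (420, 900),
    "drill_practice": (900, 1320),
    "concept_map":    (1320, 1800),
}

def highest_peak_per_activity(peak_times_s: np.ndarray, peak_heights: np.ndarray) -> dict:
    """Report the tallest GSR peak observed inside each activity window."""
    summary = {}
    for name, (start, end) in SEGMENTS.items():
        in_window = (peak_times_s >= start) & (peak_times_s < end)
        summary[name] = float(peak_heights[in_window].max()) if in_window.any() else 0.0
    return summary

# Hypothetical usage with the output of the peak-detection sketch above:
# print(highest_peak_per_activity(times_s, heights))
```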


Figure 2.  The Peak when the Participant Watched the Video

 



Figure 3.  The 2nd Peak when the Participant Interacted with the Simulation

 



Figure 4.  The Lower Peak when the Participant Interacted with the Drill & Practice





Figure 5.  The Flat Wave when the Participant Interacted with the Concept Map



 


Significance


Reliable measurement of learning engagement is needed to improve online interactivity design, but it is difficult to obtain without valid data for assessing students' learning engagement.  This project used four types of online interactivities (videos, simulations, drill and practice, and concept mapping) to examine students' learning experience.   The example participant showed more emotional changes when she interacted with specific learning activities throughout the online hands-on learning.

The rich, real-time physiological responses and facial expressions suggest and clarify how online interactivities can enhance students' learning engagement in hands-on lessons. Learning engagement was analyzed on a minute-by-minute scale, which was valuable for examining how students engage with different online interactivities.  Later, we will analyze data from a larger group of students across the different learning interactivities.
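One possible way to produce the minute-by-minute view described above, assuming a time-stamped frame table with the positive and negative evidence columns from the earlier valence sketch, is to resample the data with pandas:

```python
import pandas as pd

def per_minute_engagement(frames: pd.DataFrame) -> pd.DataFrame:
    """Average positive/negative facial-expression evidence per minute."""
    frames = frames.copy()
    frames["Timestamp"] = pd.to_datetime(frames["Timestamp"])  # assumed column name
    return (frames.set_index("Timestamp")[["positive_evidence", "negative_evidence"]]
                  .resample("1min")
                  .mean())

# Hypothetical usage, following the valence sketch earlier:
# print(per_minute_engagement(frames).head())
```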

These findings may serve as guidelines for those designing for online learning engagement, such as educators, instructional designers, technologists, and educational developers who are developing guidance on the selection and implementation of learning interactivities.  Since the results of this project are evidence-based, educators and instructional designers can apply them to their course designs for better learning engagement.



Acknowledgements


I thank Rafael Obregon, Chair and Associate Professor of Engineering Technology at Western Illinois University, for supporting this project. This project was funded by the University Research Council (URC) at Western Illinois University.
    

References

    
Cain, R., & Lee, V. R. (2016). Measuring Electrodermal Activity to Capture Engagement in an Afterschool Maker Program. Paper presented at the 6th Annual Conference on Creativity and Fabrication in Education, Stanford, CA USA. Retrieved from https://dl.acm.org/doi/10.1145/3003397.3003409

Chan, S. C., Wan, C. J., & Ko, S. (2019). Interactivity, active collaborative learning, and learning performance: The moderating role of perceived fun by using personal response systems. The International Journal of Management Education, 17(1), 94-102.

Ekwunife-Orakwue, K. C. V. & Teng, T. L. (2014) The impact of transactional distance dialogic interactions on student learning outcomes in online and blended environments. Computers and Education, 78, 414–427. http://doi.org/10.1016/j.compedu.2014.06.011

Hemphill, H. (2001). Instructional syntax analysis: Beyond CBT. TechTrends: Linking Research and Practice to Improve Learning, 45(1).

Hsu, Y. P., Meyen, E. L. & Lee, Y. J. (2018). Understanding Emotional Analytics for Student Engagement: An Instructional Design Perspective. In M. Boboc & S. Koc (Eds.), Student-Centered Virtual Learning Environments in Higher Education (pp.70-102). Hershey, PA: Information Science Reference. https://doi.org/10.4018/978-1-5225-5769-2.

Lee, V.R., & Shapiro, R.B. (2019). A Broad View of Wearables as Learning Technologies: Current and Emerging Applications. In Díaz P., Ioannou A., Bhagat K., Spector J. (Eds). Learning in a Digital World. Smart Computing and Intelligence(pp.113-133). Springer, Singapore.

Lee, V. R., & DuMont, M. (2010). An exploration into how physical activity data-recording devices could be used in computer-supported data investigations. International Journal of Computers for Mathematical Learning, 15(3), 167-189.

Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2).

Sims, R. (2003). Promises of interactivity: Aligning learner perceptions and expectations with strategies for flexible and online learning. Distance Education, 24(1), 87-103.

Stöckli, S., Schulte-Mecklenbeck, M., Borer, S., & Samson, A. C. (2018). Facial expression analysis with AFFDEX and FACET: A validation study. Behavior Research Methods, 50(4), 1446-1460.

Svihla, V. (2015). Making for engagement, development and learning. Paper presented at the 5th Annual Conference on Creativity and Fabrication in Education, Stanford, CA USA. Retrieved from https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Making+for+Engagement%2C+Development+and+Learning&btnG=.

Swartzwelder, K., Murphy, J., & Murphy, G. (2019). The Impact of Text-Based and Video Discussions on Student Engagement and Interactivity in an Online Course. Journal of Educators Online, 16(1), n1.

Walmsley-Smith, H., Machin, L., & Walton, G. (2019). The E-Design Assessment Tool: an evidence-informed approach towards a consistent terminology for quantifying online distance learning activities. Research in Learning Technology, 27.






About the Author



Yu-Ping Hsu is an assistant professor in the IDT program in the Department of Engineering Technology at Western Illinois University.  Dr. Hsu's research interests lie in user interaction design approaches to learning that emphasize multimedia, collaboration, emotional responses, information visualization, universal design, and accessibility design.