Literature Analysis

Technology-Mediated Student Engagement


INTRODUCTION


The purpose of this paper is to present an analysis of the theme of engagement in technology-mediated environments. Technology-mediated student engagement refers to learners being cognitively and affectively connected with the learning experience (Lowe, Lee, Schibeci, Cummings, Phillips & Lake, 2010). It is an important theme because, as Lowe et al. (2010) asserted, technology is becoming more widely available and online learning environments are now integrated into mainstream education. However, although there are learning benefits associated with using technology, there are also disadvantages (Chen, Lambert & Guidry, 2010), and the successful engagement of learners with technology depends on several factors (Bradshaw, Powell & Terrell, 2005).

 

Organization of the paper

This paper begins with a methods section describing the procedure of the analysis. The findings section presents the results of the analysis of the 10 studies, including the common themes that emerged regarding student engagement in technology-mediated environments. The discussion section examines the results through assessment, interpretation, and evaluation. The conclusion highlights the main findings about student engagement and identifies limitations and implications for practice.


METHODS

The analysis included 10 selected sources from nine peer-reviewed educational technology journals. Each of the 10 sources involved an electronic medium. The analysis only considered sources with the word engagement in the title. For consideration, the journal sources had to include research participants involved in technology-mediated environments. One study observed student engagement in an e-forum setting (Mason, 2011). Another study tracked student engagement via the use of a social network for educational purposes (Badge et al., 2012). Five studies reported on student engagement in online learning environments designed for specific courses (Bulger, Mayer, Almeroth & Blau, 2008; Cornelius, Gordon & Harris, 2011; Bradshaw et al., 2005; Chen et al., 2010; Holley & Dobson, 2008). Two studies involved video games or game design and student engagement (Reynolds & Caperton, 2011; Annetta, Minogue, Holmes & Cheng, 2009). One study observed student engagement with the use of online digital learning objects (Lowe et al., 2010). The selected studies ranged from 2005 to 2012 and excluded meta-analyses and book reviews. Because technology-mediated environments are constantly changing, the selection process only included recent studies; nine of the selected studies are from 2008 to 2012.

            The research teams conducted the 10 studies in the following locations: the United States of America, the United Kingdom, New Zealand, and Australia. Three of the studies used qualitative data (Bradshaw et al., 2005; Reynolds & Caperton, 2011; Cornelius et al., 2011), three used quantitative data (Annetta et al., 2009; Bulger et al., 2008; Chen et al., 2010), and four applied a mixed methods approach (Badge, Saunders & Cann, 2012; Lowe et al., 2010; Holley & Dobson, 2008; Mason, 2011). Five of the studies involved university undergraduate students as participants (Badge et al., 2012; Mason, 2011; Holley & Dobson, 2008; Chen et al., 2010; Bulger et al., 2008), one study involved high school students (Annetta et al., 2009), two studies involved professional students, mainly teachers and college lecturers (Cornelius et al., 2011; Bradshaw et al., 2005), one study involved middle school, high school, and college students (Reynolds & Caperton, 2011), and one study involved primary and secondary students (Lowe et al., 2010). The number of participants in each study varied greatly. Three of the studies used established instruments or tools tested in previous research: the Protocol for Classroom Observations (Annetta et al., 2009), the Statistical Package for the Social Sciences (Mason, 2011), and the National Survey of Student Engagement (Chen et al., 2010). One of the studies developed a Classroom Behavioural Analysis System (Bulger et al., 2008) and another used the open-source Gephi tool for network graph visualization and statistical analysis (Badge et al., 2012).

After the retrieval and assessment of 21 studies, the analysis of the 10 selected studies followed, to some extent, a meta-synthesis as discussed by Zimmer (2006). The analysis involved determining similarities, differences, and patterns while searching for common themes on student engagement in technology-mediated learning environments. By “incorporating findings from many studies” (Zimmer, 2006, p. 317), this analysis attempted to offer a “generalizable reality” (p. 317) on common themes shared in the 10 studies related to student engagement and technology-mediated environments.


FINDINGS

            The analysis of the 10 studies resulted in the identification of four themes related to student engagement in technology-mediated environments. These four themes were: 1) the design of learning and student engagement; 2) time and student engagement; 3) online interaction and student engagement; and 4) relevant and meaningful learning.

 

The design of learning and student engagement

            Lowe et al. (2010) observed that student engagement increased when design features included choices and challenges related to students’ prior knowledge and the course material. However, when the design of the environment presented too many unfamiliar concepts and words, the “cognitive challenge was excessive” (p. 236), and students disengaged from the learning activities.

            Similarly, Reynolds and Caperton (2011) discussed excessive cognitive challenges. In this environment students expressed frustration with coding and programming in a game design program, frustration that correlated with the students’ own lack of experience in game design. Furthermore, many of the educators themselves were novices, compounding the students’ frustration and disengagement when they sought support.

            Annetta et al. (2009) also reported on issues with excessive cognitive challenges when a 3D multi-user virtual learning environment became distracting in its complexity, resulting in student disengagement. The research team argued that such environments, including Multiplayer Educational Gaming Applications (MEGAs), engaged students in the learning process. However, designers and educators need to determine how they can design MEGAs to reduce the “novelty effect…without sacrificing high-quality cognitive engagement” (Annetta et al., 2009, p. 80). Additionally, Cornelius et al. (2011) noted a lack of engagement when the course objectives caused confusion or when students faced technical difficulties. However, the research team reported that the inclusion of anonymity in the design of the role-play learning environment was an important feature in engaging students.

            Mason (2011) argued that online socialization and information exchanges are necessary steps to facilitate engagement, and neither was included in the design of the online learning forum used by the students. The absence of these two necessary steps resulted in only 17.9% of the class posting to the forum. The importance of designing a technology-mediated learning environment that includes opportunities for information exchange was also reported by Bradshaw et al. (2005), who stressed the need to give adequate time for induction into the digital environment.

            Bulger et al. (2008) reported that when educators designed learning activities to be immersive, technology became a learning tool and resource that increased student engagement. Additionally, students in a simulation environment showed higher engagement than students in a no-simulation environment. The research team argued that the higher engagement in the simulation environment showed a direct link between student engagement and the instructional method and design.

 

Time and student engagement

            The synchronous nature of a technology-mediated environment can cause issues with student engagement and the pace of an online learning activity. For example, some participants in Cornelius et al. (2011) expressed enjoyment with the online engagement, but experienced the activity pace as either too slow or too fast. The students who complained about the quick pace were native English speakers; however, they possessed poor typing skills and slow reading speeds.

            Bradshaw et al. (2005) reported that the asynchronous nature of the environment allowed students to contribute when it was convenient for them. One student provided positive feedback welcoming the time flexibility. Deadlines were still necessary to encourage engagement, as the environment otherwise imposed no time structure. These time constraints, however, led students to complain that they were the greatest barrier to their engagement: the educator would close an online discussion before the students were ready to contribute.

            Findings from Reynolds and Caperton (2011) revealed that 20% of students disliked the lack of time to meet all their goals. One student commented on the pace being too fast, while another noted that a 40-minute class period was not long enough. Further, 10% of the students said that time management was difficult.

            Holley and Dobson (2008) reported that student engagement with online activities far surpassed the research team’s expectations; the results revealed students were visiting the site nearly every hour. The flexibility of online technology allowed non-traditional students to engage in their learning at the times most convenient to them. Badge et al. (2012) discovered that not only did students engage in an online environment outside course time, but 15% of students also continued to engage academically in Friendfeed, an online social network, after the formal requirement ended.

 

Online interaction and student engagement

            Badge et al. (2012) described online interaction as the “social glue” (para. 23) that encouraged student engagement. This social interaction prevented students from working in isolation, where there is no support to encourage engagement and learning beyond basic assessment tasks.

            Lowe et al. (2010) discussed learning as a social activity where internal or external dialogue is necessary for students to engage in testing and applying new knowledge. They observed that when teachers provided limited support, student engagement declined. Students lost focus and did not explore the learning environment; thus, outcomes were not achieved. In contrast, when students worked collaboratively with each other, engagement increased, and the teacher observed active learning.

            Chen et al. (2010) posited that it is important for instructors in online courses to provide students with appropriate instruction to encourage engagement and promote contribution. Additionally, Mason (2011) argued that prominent online instructor activity encouraged student interaction and engagement; the instructor’s limited posting was inadequate for encouraging students to engage and interact. Students also expressed dissatisfaction with the amount of feedback they received and the lack of support.

 

Relevant and meaningful learning

            Holley and Dobson (2008) discussed the importance of designing a learning experience that engaged students in “a more meaningful dialogue” (p. 141). Students expressed excitement over a learning task that was directly relevant to their study. Consequently, the final average mark was 79%, six percentage points higher than in the years before the new design was implemented.

            Annetta et al. (2009) argued that cognitive engagement was most likely linked to the need for students to apply subject specific knowledge to authentic scenarios. However, they also stressed there were cognitive conflicts in these activities that caused students to question what they had learned through other activities outside the MEGA.

            Chen et al. (2010) reported that technology-mediated environments designed to engage students are more likely to promote higher order thinking, and reflective and integrative learning. Furthermore, the research team discovered students made higher gains in practical competence and personal and social development.

            Bradshaw et al. (2005) reported a disconnection between what professional students thought was ultimately relevant to them and the formal assessment outcomes. Bulger et al. (2008) concluded that student engagement increased in active learning when the academic requirements encouraged students to seek information beyond the walls of the classroom. Moreover, encouraging students to develop their own understandings and make learning connections increased engagement.



DISCUSSION

            This analysis presented evidence from 10 studies that highlighted shared themes related to student engagement and technology-mediated environments. The themes explored are essentially factors that can improve or hinder student engagement. Studies such as Lowe et al. (2010) and Mason (2011) discussed the importance of an effective learning space design in promoting engagement and participation. These studies offered evidence that the learning environment must remain manageable for students in order to avoid frustration and excessive cognitive challenge. Additionally, Mason (2011) offered necessary steps that educators need to include in the learning environment design to facilitate engagement. In both Reynolds and Caperton (2011) and Cornelius et al. (2011) some of these steps were absent, and students became frustrated trying to understand learning objectives. These studies also noted the need for educators to address possible technical difficulties to avoid disengagement. Bradshaw et al. (2005) stressed making time for induction, and, according to Bulger et al. (2008), student engagement, instructional method, and design are all intrinsically linked.

            Both synchronous and asynchronous technology-mediated environments presented issues in regard to time management and engagement. Synchronous activities created a faster pace that left some students feeling overwhelmed (Cornelius et al., 2011). Meanwhile, asynchronous activities forced instructors to create deadlines, leaving students to complain about imposed time constraints (Bradshaw et al., 2005). Time constraints were also a concern in Reynolds and Caperton (2011). However, while time constraints and pacing were a concern in some environments, other studies displayed a positive relationship between time flexibility and engagement. For instance, in Holley and Dobson (2008), students enjoyed the flexibility to engage in their learning outside school, and engagement rates were very high, exceeding the study’s expectations. Moreover, in Badge et al. (2012) student engagement outside school hours was high, and many of the students even decided to continue online academic engagement after the formal learning concluded.

            Some of the studies reported a strong relation between student engagement and the need for human interaction in a technology-mediated environment. For example, two of the studies illustrated that not only is peer-to-peer interaction necessary, but educators or facilitators must also maintain a strong presence to encourage social interaction, provide support, and offer timely feedback (Chen et al., 2010; Mason, 2011). Isolation was a major concern in some of the studies, and many avenues were available for students to encourage each other to interact, collaborate, and engage in their online learning experience (Badge et al., 2012).

            Providing relevant, meaningful, and authentic learning experiences promoted deeper student engagement. Both Holley and Dobson (2008) and Annetta et al. (2009) reported excitement and cognitive engagement when students were immersed in relevant and authentic experiences. Similarly, Bulger et al. (2008) indicated an increase in student engagement when students were provided with opportunities to learn outside the traditional classroom. However, it is necessary to provide hesitant students with reassurances of their knowledge and skills to avoid cognitive conflicts and disengagement (Annetta et al., 2009). Reassurance from educators relates to the earlier themes of timely feedback and support from educators and of appropriate design considerations when developing technology-mediated environments. Benefits associated with student engagement and technology-mediated environments were also discussed (Chen et al., 2010). One study highlighted a disconnection between what students viewed as relevant learning and the course’s formal outcomes and assessment (Bradshaw et al., 2005). Bradshaw et al. (2005) involved professional students as participants, the majority of whom were teachers considering how they could practically apply what they learned to their own courses. This disconnect was not evident in any of the other studies.

            Bulger et al. (2008) used a Classroom Behavioural Analysis System (CBAS) to measure student engagement levels. The system measured engagement by tracking on-task and off-task Internet actions. The research team suggested more studies exploring the relationship between measured engagement levels and academic performance could further test CBAS.

 

CONCLUSIONS

            Encouraging and fostering student engagement in a technology-mediated learning environment required careful attention to instructional design and delivery (Lowe et al., 2010; Annetta et al., 2009; Reynolds & Caperton, 2011). Furthermore, careful attention was paid to anticipating and eliminating technical difficulties and possible design flaws so as not to discourage students in the learning experience and risk disengagement (Cornelius et al., 2011). Even though students were engaged in a digital space, human interaction was paramount to student engagement and participation (Badge et al., 2012). This interaction was important not only between students but equally between student and educator (Chen et al., 2010). Educators needed to reassure students that they were available to provide support and feedback (Mason, 2011). Students had to be adequately challenged in the learning process without the challenges causing an overwhelming amount of frustration (Lowe et al., 2010; Annetta et al., 2009; Reynolds & Caperton, 2011). Time was another important factor to consider when encouraging continued student engagement (Reynolds & Caperton, 2011). While deadlines and timelines were impossible to avoid, educators had to keep in mind that many students reported high levels of engagement in these types of learning environments because of the time flexibility they offered (Holley & Dobson, 2008; Badge et al., 2012).

 

Limitations

            This analysis included studies that reported on a wide range of technology-mediated learning environments. Some of the environments were completely online learning experiences (Mason, 2011; Bradshaw et al., 2005; Bulger et al., 2008), while others were blended environments with additional learning activities, and some were even traditional classroom settings (Holley & Dobson, 2008; Annetta et al., 2009). Including only studies that dealt solely with an online learning experience could have changed the results and the factors that affected student engagement. Similarly, the analysis included studies with participants ranging from the primary grades to professional adults and non-traditional adult students. The inclusion of adults outside undergraduate programs and college courses likely affected the results. For example, in Bradshaw et al. (2005) there was a disconnection between what students believed was relevant and what the course design dictated. However, these students were professionals, mainly teachers, who brought with them existing knowledge and experience that would most likely not be a concern with younger students. Time concerns and constraints were a factor in student engagement in both synchronous and asynchronous environments. However, analyzing both types of environments may have affected the analysis, particularly in regard to the positive meshing of time and engagement in some asynchronous environments (Holley & Dobson, 2008; Badge et al., 2012).

 

Implications

            Annetta et al. (2009) called for “cautious optimism” when it came to investigating the impact of games on student engagement. They cautioned, “Games are not a panacea” (Annetta et al., 2009, p. 80). This message can be applied to all technology-mediated learning environments. The research team suggested that the establishment of criteria for design and evaluation is necessary. They argued that if educators are to become designers, they will need detailed and continual professional development and support. Educators cannot be amateurs in the material they are teaching, as it only contributes to frustration and disengagement when students seek guidance and support (Reynolds & Caperton, 2011). Educators must also consider how their own online presence in a technology-mediated learning environment impacts student engagement (Bradshaw et al., 2005). They must be frequently active in their online moderation, providing feedback and support, and encouraging student engagement (Mason, 2011). Finally, three of the studies suggested that further research is necessary to explore the relationship between measured engagement levels and academic performance (Bulger et al., 2008; Chen et al., 2010; Badge et al., 2012).

            As previously stated, technology-mediated learning environments are not a panacea. They are, nevertheless, environments that students can find relevant, motivating, and fun (Reynolds & Caperton, 2011). However, while entertainment in a technology-mediated learning environment can help with initial engagement, designers and educators must pay careful attention to design and structure to provide and support continuous emotional and cognitive engagement (Lowe et al., 2010).


REFERENCES

Annetta, L., Minogue, J., Holmes, S. Y., & Cheng, M. T. (2009). Investigating the impact of video games on high school students’ engagement and learning about genetics. Computers & Education, 53(1), 74-85.

Badge, J. L., Saunders, N. F. W., & Cann, A. J. (2012). Beyond marks: New tools to visualise student engagement via social networks. Research in Learning Technology, 20(1). doi:10.3402/rlt.v20i0/16283

Bradshaw, P., Powell, S., & Terrell, I. (2005). Developing engagement in Ultralab’s online communities of enquiry. Innovations in Education & Teaching International, 42(3), 205-215.

Bulger, M. E., Mayer, R. E., Almeroth, K. C., & Blau, S. D. (2008). Measuring learner engagement in computer-equipped college classrooms. Journal of Educational Multimedia and Hypermedia, 17(2), 129-143. Retrieved from http://www.editlib.org/p/23524

Chen, D. P., Lambert, A. M., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education, 54(4), 1222-1232.

Cornelius, S., Gordon, C., & Harris, M. (2011). Role engagement and anonymity in synchronous online role play. The International Review of Research in Open and Distance Learning, 12(5), 57-73.

Holley, D., & Dobson, C. (2008). Encouraging student engagement in a blended learning environment: the use of contemporary learning spaces. Learning, Media and Technology, 33(2), 139-150.

Lowe, K., Lee, L., Schibeci, R., Cummings, R., Phillips, R., & Lake, D. (2010). Learning objects and engagement of students in Australian and New Zealand Schools. British Journal of Educational Technology, 41(2), 227-241. 

Mason, R.B. (2011). Student engagement with, and participation in, an e-forum. Journal of Educational Technology and Society, 14(2), 258-268.

Reynolds, R., & Caperton, I. H. (2011). Contrasts in student engagement, meaning-making, dislikes, and challenges in a discovery-based program of game design learning. Educational Technology Research and Development, 59(2), 267-289.

Zimmer, L. (2006). Qualitative meta-synthesis: A question of dialoguing with texts. Journal of Advanced Nursing, 53(3), 311-318.
