  • Research article
  • Open access

Empowering students’ agentive engagement through formative assessment in an online learning environment

Abstract

This study investigated empowering students to engage agentively in the formative assessment of their English writing. To this end, student agentic engagement was explored by drawing on multiple data sources. A Digitalized Engagement Enhancement Tool (DEET) was utilized to encourage students to record, unpack, plan actions for, and reflect on the feedback they received from the teacher and their peers. A series of content analyses was conducted to codify and track students’ engagement dimensions and practices across multiple writing samples, the DEET, and student-revised writings. The analysis of the frequency of student engagement codes and of students’ writing performances indicated a significant increase in student engagement across all dimensions. Further chi-square analysis indicated that student agentic engagement was characterized by reciprocal and proactive practices in the critical assessment of their writing. The analyses also indicated a significant increase in the quality of students’ writing performances. Thematic analysis of students’ evaluations of the self-perceived efficacy of the DEET provided insights for teaching practitioners to build on the formative purpose of student engagement enhancement practices. Implications for teaching practitioners are discussed.

Highlights

  • Students’ agentive engagement in formative assessment of L2 writing was explored.

  • A Digitalized Engagement Enhancement Tool was utilized.

  • Students’ agentive engagement was identified via content analysis.

  • Students’ agentive engagement and writing performances were optimized.

  • Students’ agentic engagement was achieved by reciprocal and proactive practices.

Introduction

Traditionally, assessment predominantly centered on exams to document learning, often leading to grades or certification. However, its lack of formative feedback, and its undesirable consequences for students’ engagement, instigated a transformative shift toward formative assessment, also known as assessment for learning, which emphasizes learners' active role in self-assessment and fosters self-regulation and autonomy (Lee et al., 2019). Agentic engagement is strongly associated with formative assessment in the sense that the ongoing support for scaffolded learning in formative assessment helps learners engage productively (Mallary, 2023). In formative assessment, students are encouraged to reciprocally construct the classroom assessment context through collaboration with other learners and through proactive stimulation of their self-regulated learning (Reeve, 2013; Reeve et al., 2020a). The probable benefits of students’ agentive engagement are high-quality learning and achievement (Wu et al., 2021).

Investigating students’ agentic engagement in learning can provide insights into how to empower students ‘as assessors of their learning’ (Wang & Lee, 2021). Various studies have investigated how student engagement can be enhanced through ongoing monitoring of learning and the provision of adequate formative feedback (Chen & Li, 2021). However, they mainly utilized instruments such as self-report questionnaires, tests, and rubrics (Adams et al., 2020; Winstone et al., 2017; Zhang, 2021). The reliability of their findings can therefore be compromised by several factors: the potential influence of social desirability bias (Choi et al., 2023; Lavidas et al., 2022); test–retest reliability issues, as students' performance can change over time due to various external factors (Akhmedov, 2022); subjectivity and inconsistencies in the grading clarity and specificity of rubrics (Lipnevich et al., 2023); and reliance on student demographic information and learning management system access data rather than performance data (Veerasamy et al., 2022). Responding to these reliability concerns is crucial for enhancing the validity and accuracy of research in the field of education (Wallwey & Kajfez, 2023). Hence, this study addresses this gap by moving beyond self-reports and utilizing students' enacted performances to explore their engagement.

The current study endeavors to address these constraints by examining students' enacted performances, augmented by a digitalized student engagement enhancement tool, across multiple datasets, aiming to explore how students agentively engage in the process of learning to write. This research underscores the importance of continuous assessment in tracking students’ achievements, which serve as valuable indicators of engagement to support ongoing learning. By employing the digitalized formative feedback enhancement tool, we were able to investigate the efficacy of a personalized, iterative approach to formative feedback, empowering students and illuminating their growth as evaluators of their own writing during assessment tasks.

Literature review

Student engagement and learning

Student engagement pertains to students’ commitment to educationally effective practices in pursuit of educational goals and outcomes such as academic achievement (Mao & Lee, 2023). Extensive research has explored the potential contributions of engagement to learning across a wide spectrum of educational settings, from primary to higher education. Student engagement in learning is conventionally conceptualized as a multifaceted construct comprising a variable number of dimensions. A tripartite framework has been advanced to delineate three distinct dimensions of engagement. Behavioral engagement focuses on students' active involvement and effort in purposeful learning tasks, driven by the pursuit of academic achievement and desirable educational outcomes. Emotional engagement concerns students' interest and excitement in their learning activities; it pertains to their emotional connection and positive disposition toward the educational experience. Cognitive engagement pertains to students actively engaging in processes such as information storage, retrieval, and organization (Pilotti et al., 2017). Despite being interrelated, these dimensions are operationalized and conceptualized as distinct (Chiu, 2022). Agentic engagement cuts across these three dimensions. It involves the proactive, well-thought-out, and collaborative approaches that learners employ in learning activities. When agentic engagement contributes to a more supportive learning atmosphere, characterized by factors such as enhanced autonomy support and greater appreciation of learning activities, it fosters an environment conducive to student motivation. This, in turn, promotes enthusiastic, focused, and enduring student engagement. Students create their own learning context by proactively designing, and reciprocally constructing, a shared understanding with an authentic community. This requires students to receive formative feedback by which they can adjust, act on, and assess their learning practices in light of learning objectives (Chen & Li, 2021).

Formative assessment and student engagement

Formative assessment is a process by which teachers and students can respond to and enhance student learning during the learning process and stage as-needed learning interventions (Veerasamy et al., 2021). Several studies have found a significant correlation between formative assessment and student engagement and achievement. For example, Chen and Li (2021) and Fuller (2017) indicate that students who underperformed in formative assessment tasks were disengaged and also failed the final writing exam. Formative assessment, if it leads to engagement in learning and properly entails the provision of timely and informative feedback, can hone student learning (Jacoby et al., 2014). Examining students’ agentive engagement can help capture students’ role as reflective and critical agents in the formative assessment of their writing development (Wang & Lee, 2021). Student agentic engagement means that students are agents who can proactively control, create, and contribute to their learning context. It also means that students can offer constructive input to instruction by expressing their preferences and asking questions, reciprocally creating a more supportive learning environment for themselves by affecting and transforming what the teacher says, does, and provides in learning moments (Reeve, 2013; Reeve et al., 2020b).

While student agentive engagement is increasingly recognized as a key factor in learners’ taking a formative stance in assessing and directing their learning, questions remain over what constitutes high-impact pedagogies in relation to specific dimensions of student engagement (Macfarlane & Tomlinson, 2017). Most pedagogies studied have been external to learners. For example, research has investigated how different teacher interventions, such as curriculum-embedded formative assessment (Xie & Cui, 2021) and computer-mediated formative assessment (Sullivan et al., 2021), promote engaged learning. Thus, as Wang and Lee (2021) put forward, scant research has investigated how formative assessment can achieve its formative function for effective learning by promoting students’ internal drives, such as their agency in critically assessing their learning context for maximum learning opportunities.

Student engagement in technology enhanced learning

Research has provided compelling evidence that technology-enhanced education can effectively bolster student engagement. Comparative studies in Taiwan, investigating the effectiveness of teacher-centered and student-centered technology-enhanced classrooms, revealed that the student-centered approach led to higher emotional engagement, while both approaches fostered significant cognitive engagement (Wu & Huang, 2007). Similarly, research employing pretest/posttest questionnaires in online learning indicated that digital support significantly increased student engagement and satisfaction (Chiu, 2021). Furthermore, the scoping review of Yang et al. (2023) found that web-based online learning was positively correlated with student engagement and learning outcomes, surpassing the results of face-to-face classes. Notably, research has also highlighted that synchronous and asynchronous blended learning approaches yield positive impacts on student engagement, but with varying effects across different engagement levels (Heilporn et al., 2021). However, it is important to acknowledge that technical challenges have been associated with school dropouts and reduced commitment to online learning (Asoodar et al., 2016; Esmaeili et al., 2018), underscoring the importance of addressing these issues to fully harness the potential of technology in education.

Mixed results observed in studies concerning student engagement in technology-enhanced learning environments indicate that the online setting can enhance specific forms of engagement while potentially presenting obstacles for others (Kahn et al., 2017). Furthermore, while these comparative studies offer valuable insights, they create misleading dichotomies that often ignore or fail to account for individual student agentive practices (Kahn et al., 2017).

For a comprehensive understanding, we must examine how students unpack and act upon their own engagement construction. This is particularly pertinent, as Huang and Wang (2023) underscore the paramount importance of initiatives aimed at creating a supportive online learning environment that caters to students' psychological needs.

The present research

Although timely and important, previous studies are limited for several reasons: (a) the utilization of student self-reports rather than reference to students' actual performances; (b) the absence of methodological triangulation (Zhang & Hyland, 2022); (c) a heavy focus on summative assessment as an indicator of student engagement; and (d) a failure to account for students’ agentive establishment of an engaged learning context (Henrie et al., 2015). The present study responds to these limitations by investigating students' enacted performances, reinforced by a digitalized student engagement enhancement tool, across multiple data sets. This study can provide insights into how to empower students as critical assessors of their writing and into the quality of their writing performance. In turn, students' perceived efficacy of the feedback engagement enhancement tool can provide further evidence of engaged student writing. To achieve the research objectives, the following research questions were proposed:

  1.

    In what ways did the feedback enhancement engagement tool influence students' evolving agentive engagement during their formative assessment of writing?

  2.

    What practices do students employ in their agentive engagement during the formative assessment of their writing?

  3.

    How do students' engagement in writing tasks and the quality of their writing correlate?

  4.

    How do students assess the effectiveness of the student engagement enhancement tool in the formative assessment of their writing?

Method

Participants

The participants in this study were higher education students initially drawn from a pool of 101 individuals who expressed interest in the course after receiving a course description flyer within the university network. Ultimately, 48 participants actively engaged in the course, of whom 22 provided comprehensive feedback portfolios. These 22 participants included 4 males and 18 females, ranging in age from 23 to 56 years. They belonged to diverse majors, with 6 specializing in educational psychology, 7 in law, and 11 in business and management, and came from various faculties within the researchers' university, all falling under the broader category of social sciences. Additionally, when self-assessing their academic English writing abilities, 15 participants identified themselves as having intermediate proficiency, while 7 categorized their skills as upper-intermediate.

Research context

This study was conducted within an extracurricular online course focusing on research paper writing in English at the researchers' university. The course aimed to prepare students for academic writing and enhance their skills in crafting academic papers. The course covered various writing strategies for each section of a research paper, spanning the title and abstract, introduction, literature review, method and results, and discussion and conclusion. Each section of the research paper required completing a single writing task, totaling five tasks in all. Emphasis was placed on correct language usage and the effective application of appropriate language for academic texts. The instructional format consisted of five online sessions, each lasting half an hour, with two-week intervals. During these intervals, students engaged in feedback sessions and tracking sessions to implement the received feedback. Pairings for feedback exchange were based on students' self-rated English proficiency levels and their indicated disciplines in the course registration form. The class content and procedure are detailed in Table 1. The course was conducted on two platforms: Microsoft Teams for synchronous/asynchronous learning and Google Docs for collaborative feedback exchanges and revision tracking. Microsoft Teams offered a versatile platform for classes, chats, and document access, while Google Docs facilitated collaborative editing and transparent revision records.

Table 1 Classroom procedure

Data collection procedure

A Digitalized Engagement Enhancement Tool (DEET)

The Digitalized Engagement Enhancement Tool (DEET) is anchored in the substantial body of research on information and communication technology (ICT) in language teaching and assessment. This study conducted a five-session trial of the DEET, designed to amplify student engagement over a two-month period, demonstrating its efficacy through established pedagogical frameworks. As highlighted by Barrot (2021), the evolution from traditional portfolios to online platforms has transformed how students engage with their writing. The DEET embodies this transformation by utilizing Google Docs as a digital space for feedback and collaboration. The tool consists of four integral components, each fulfilling a distinct role to encourage collaborative feedback exchanges and enhance the learning experience.

  1.

    Unpacking Feedback: This component motivates students to collect and organize feedback seamlessly within Google Docs, creating a centralized repository for constructive input.

  2.

    Analysis of Feedback: Students are prompted to critically analyze the feedback they receive, identifying strengths and weaknesses to inform their improvement strategies. This reflective practice is crucial for metacognitive development, as discussed by Hung (2012), enabling learners to engage dynamically with their assessments.

  3.

    Action Planning for Performance Improvement: The tool encourages students to devise strategic plans for enhancing their academic performance. By fostering collaborative features within Google Docs, students share insights and develop action plans together, reinforcing the social constructivist approach to learning supported by ICT integration.

  4.

    Reflection on Impact: In this final component, students assess the impact of their actions on their performance. Google Docs serves as a collaborative platform for reflection, where students record and interpret feedback, engaging in dialogues with peers and instructors. This continuous feedback loop ultimately results in the creation of digitalized feedback booklets, making the assessment process more transparent and actionable.

The validation of the DEET is supported by the literature on electronic portfolio assessment, which highlights the benefits of multimedia integration for learners (Ngui et al., 2022). By allowing students to showcase their work through various formats, the DEET enhances their ability to engage in self-assessment and establish metacognitive goals, echoing the findings of López-Crespo et al. (2022) and Mohammadi Zenouzagh et al. (2023) on the positive impact of technology on learner engagement.

Furthermore, students accessed personalized digital feedback booklets via Google Forms after essay exercises, culminating in a comprehensive final assessment. This process not only reinforces the importance of feedback in academic performance but also validates the design of DEET as a tool rooted in proven educational theories and practices.

Measures

The study delved into students' agentive engagement practices, utilizing diverse data sources, including writing samples, feedback enhancement tools, and revision history in Google Docs. These data were gathered over a two-month paper writing course to meticulously trace students' engagement in the learning process. The researchers evaluated students' final revised documents across five treatment sessions spanning two months. Additionally, students provided self-perceived assessments of the feedback sessions and practices through an online Google Form. This self-evaluation encompassed three criteria: perceived advantages and disadvantages, noticed limitations, and suggested measures to preserve the formative nature of the feedback sessions implemented in the research.

Students’ agentive engagement

Students' agentive engagement during learning activities was conceptualized as a three-component construct encompassing behavioral, emotional, and cognitive aspects. These aspects represent constructive contributions to the instructional flow they receive. Axial deductive coding, informed by the study of Guo et al. (2023), was employed to label different codes for dimensions of student engagement, and selective coding was used to compare these codes across student files. In this process, qualitative analysis of students' feedback booklets was conducted to profile their behavioral, cognitive, and affective engagement with feedback. Additionally, to further track student engagement dimensions and streamline the data, their revision history on Google Docs was analyzed for instances of revisions that could reflect their engagement dimensions.

Students’ behavioral engagement dimension is characterized by (a) observable actions and a willingness to participate actively in writing; (b) staying focused and not falling behind in their study work; (c) the number of edits participants undertook, as shown in the revision history; and (d) the number of feedback acceptances and rejections during revision. Students’ engagement on the cognitive dimension was attributed to (a) mental effort to finish tasks using a profound, self-regulated, and planned learning approach rather than superficial techniques, including efforts to form questions and hypotheses and to monitor the thinking process in order to construct knowledge; (b) strategic regulation and flexibility in dealing with learning problems; (c) exchanging information from different sources; (d) proposing ideas and managing time, tasks, and task procedures; (e) making connections and integrating and synthesizing information from various sources; and (f) suggesting solutions to problems and justifying why a specific solution was suggested. Emotional engagement was related to (a) expressing emotions, likes, dislikes, preferences, and personal values and attitudes; (b) willingness to do the work enthusiastically; and (c) complimenting others or the content.

The data were analyzed manually to explore how the target students engaged with peer feedback, employing a synthesis of deductive and inductive thematic analysis through open coding, axial coding, and selective coding. Open codes were derived from the original data in student writing, the feedback they received, their feedback engagement booklets, and their revised drafts.

Initially, the feedback booklets, Google history logs, and student writing performances were read and re-read to develop a general understanding of the qualitative data. Thematic analysis, broadly defined as “a method for identifying, analysing, and reporting patterns (themes) within data” (Braun & Clarke, 2006, p. 79), was used for this purpose. Specifically, a deductive approach informed by the conceptualization of engagement and the research questions was employed, coding the data from affective, behavioral, and cognitive perspectives. These dimensions included codes related to the value of peer and teacher feedback, strategies for addressing feedback, recognition and understanding of peer and teacher feedback, and the cognitive/metacognitive processes applied in response. During the practical analysis, students’ feedback engagement booklets, their revised writing, and log analyses from Google Docs were scrutinized, with relevant sections highlighted and coded according to the established codes. This process was completed with a cross-case comparison and interpretation of the data.

Text analyses of the feedback booklets and log analyses of Google Docs were segmented into feedback-related episodes (FREs), defined as segments of writing assignments for which feedback is provided and received, serving as units of analysis (Gilbuena et al., 2011). FREs reveal how students engage with feedback, their emotional and cognitive responses, and the strategies they use to apply feedback to improve their work or performance. In this context, each FRE represents a student's demonstration of agentive engagement through their writing samples, interactions with the digitalized feedback engagement tool, and revisions made over five treatment sessions. Each FRE is identified through the student's involvement in a sequence of four key indicators:

  1.

    Trigger (i.e., the writing part which received feedback).

  2.

    Indicator (i.e., the feedback that sets off an action).

  3.

    Response (i.e., students’ consideration of revision).

  4.

    Reaction (i.e., the revised part that indicates the student's uptake of the feedback).
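As a schematic illustration, the four-indicator sequence of an FRE can be represented as a simple record. The field names and the sample episode below are hypothetical sketches for clarity, not a reproduction of the study's actual coding sheet.

```python
from dataclasses import dataclass

@dataclass
class FRE:
    """One feedback-related episode, following the four indicators above."""
    trigger: str    # the writing part which received feedback
    indicator: str  # the feedback that sets off an action
    response: str   # the student's consideration of revision
    reaction: str   # the revised part showing uptake of the feedback

# Hypothetical episode for illustration only
episode = FRE(
    trigger="Results are presented without interpretation.",
    indicator="Peer: 'Link each result back to your research question.'",
    response="Plans to add an interpretive sentence after each finding.",
    reaction="Revised draft connects each finding to a research question.",
)
print(episode.indicator)
```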

Table 2 presents a sample of the data layout and coding sheet for digitalized feedback engagement enhancement, Google Docs revision history, and revised writing assignments.

Table 2 Data Layout and Coding Sheet for Student Engagement episodes

This framework allowed our raters to classify the codes into the three engagement dimensions using axial coding, which systematically categorized each code as behavioral, cognitive, or emotional engagement. The coding process provided a systematic approach to quantifying student engagement: the themes and patterns identified through the qualitative analysis were translated into numerical values reflecting the three student engagement dimensions. This process allowed us to generate descriptive statistics illustrating the levels of engagement students demonstrated during their interactions with peer feedback.

Table 3 provides descriptive statistics concerning students' engagement dimensions. The findings indicated that students exhibited the highest mean on the cognitive dimension (M = 20.31, SE = 1.35). This was followed by the behavioral dimension (M = 19.25, SE = 1.32), and lastly, the emotional dimension (M = 14.11, SE = 0.581).
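For illustration, the means and standard errors reported in Table 3 follow the usual definitions (SE = SD/√n). The sketch below computes them for a hypothetical set of per-student cognitive-engagement code counts; the values are invented and do not reproduce the study's data.

```python
import math
import statistics

# Hypothetical per-student counts of cognitive-engagement codes
# (illustrative values only, not the study's data).
cognitive_codes = [14, 18, 22, 25, 19, 24, 20, 17]

mean = statistics.mean(cognitive_codes)  # M
# SE = sample standard deviation divided by sqrt(n)
se = statistics.stdev(cognitive_codes) / math.sqrt(len(cognitive_codes))
print(f"M = {mean:.2f}, SE = {se:.2f}")  # M = 19.88, SE = 1.30
```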

Table 3 Descriptive Statistics of student engagement dimensions

To ensure reliability, a knowledgeable colleague was invited to code 20% of the data, specifically the first session's data. Inter-coder agreement, measured by Cohen's kappa, demonstrated a robust consensus in coding the interpretative context of the student engagement dimensions (behavioral, cognitive, and emotional), with respective values of 0.83, 0.84, and 0.74. Any disagreements were resolved through negotiation and interpretation sessions. Rater discussions were structured using the framework developed by Bakker et al. (2015), which required raters to identify indicators of student engagement. Raters were tasked with examining (counter)evidence contributing to student engagement, differentiating between indicators of student engagement, assigning scores, and determining whether an overall performance could be attributed to a specific dimension of student engagement. Additionally, raters were required to write a summary that included comments on the assigned dimension, along with citations of key arguments and evidence. They were encouraged to consult with fellow assessors to discuss the comparability of the assigned dimensions, share their rationales, and provide supporting evidence and arguments. Ultimately, raters needed to decide whether to maintain their originally assigned dimension or adjust it. It is important to note that the personal views of raters cannot be entirely eliminated from the coding process; to minimize this effect, several briefing sessions were conducted for the raters prior to their evaluations.
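Cohen's kappa, used above for inter-coder agreement, corrects observed agreement for the agreement expected by chance. A minimal sketch of the computation follows, applied to two hypothetical raters' dimension codes (not the study's actual coding data).

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' nominal codes of the same items."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n       # p_o
    c1, c2 = Counter(r1), Counter(r2)
    labels = set(r1) | set(r2)
    expected = sum(c1[l] * c2[l] for l in labels) / (n * n)  # p_e
    return (observed - expected) / (1 - expected)

# Hypothetical codes: behavioral / cognitive / emotional per episode
rater1 = ["beh", "cog", "cog", "emo", "beh", "cog", "emo", "beh", "cog", "beh"]
rater2 = ["beh", "cog", "emo", "emo", "beh", "cog", "emo", "beh", "cog", "cog"]
print(round(cohens_kappa(rater1, rater2), 2))  # prints 0.7
```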

Student agentive engagement practices

Student agentic engagement is characterized by two overarching practices that span across the mentioned dimensions. Firstly, students possess the ability to proactively control, create, and contribute to their learning context. This involves offering constructive input to instruction by expressing their preferences. Secondly, students engage in reciprocal collaboration with the provider of the learning environment, whether it be a teacher or peer, to influence and transform the content, actions, and provisions within learning moments. Students either collaboratively construct the classroom assessment context with their peers or independently stimulate their self-regulated learning.

To investigate students' agentive engagement practices, various data sources were employed, including writing samples, input in feedback engagement enhancement tools, and revision history in Google Docs. These data were collected to trace students' agentive engagement strategies and practices. The coded data from the previous stage underwent further analysis through axial coding to identify categories of student agentive engagement practices that could be linked together. The codes were then clustered into behavioral, cognitive, and emotional categories, and classified into two emergent themes: (a) proactivity in learning self-regulation and (b) reciprocity in assessment context co-construction (see Table 4).

Table 4 Student Agentive Engagement practices

Student writing performances

To address the third research question, examining the relationship between students' engagement with feedback and the quality of their writing, a weighted rubric for assessing research papers was employed. The rubric's quality criteria comprised purpose/position (20 points), analysis (30 points), evidence (source) (30 points), organization (10 points), and writing quality and adherence to format guidelines, including timeliness and paper length (10 points). Each criterion was rated on a four-point scale (unsatisfactory, developing, accomplished, and exemplar), resulting in a total score of 100.
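A small sketch of how such a weighted rubric could be scored: each criterion carries its stated weight, and a four-point level is mapped proportionally onto that weight. The 0–3 level-to-points mapping and the sample ratings are assumptions for illustration, not the study's actual scoring procedure.

```python
# Weights as stated in the rubric; they sum to 100.
RUBRIC = {
    "purpose/position": 20,
    "analysis": 30,
    "evidence (source)": 30,
    "organization": 10,
    "writing quality & format": 10,
}

def score(ratings):
    """ratings: criterion -> level 0..3 (unsatisfactory=0 ... exemplar=3)."""
    return sum(RUBRIC[c] * level / 3 for c, level in ratings.items())

# Hypothetical ratings for one paper
sample = {"purpose/position": 3, "analysis": 2, "evidence (source)": 2,
          "organization": 3, "writing quality & format": 1}
print(round(score(sample), 1))  # prints 73.3
```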

To ensure reliability, Pearson correlations were computed among raters for five writing tasks. Significant agreement was found across all tasks: Task 1 (r(20) = 0.901, large effect size, p < 0.001), Task 2 (r(20) = 0.842, large effect size, p < 0.001), Task 3 (r(20) = 0.704, large effect size, p < 0.001), Task 4 (r(20) = 0.756, large effect size, p < 0.001), and Task 5 (r(20) = 0.833, large effect size, p < 0.001). These findings establish a strong consistency in rating the quality of students' written work across multiple tasks.
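The inter-rater reliability figures above rely on the standard Pearson correlation coefficient. A minimal sketch from the definition follows, applied to two hypothetical raters' task scores (not the study's ratings).

```python
import math

def pearson_r(x, y):
    """Pearson correlation: covariance over the product of deviations' norms."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores from two raters on the same six papers
rater_a = [72, 85, 90, 64, 78, 88]
rater_b = [70, 83, 92, 60, 80, 85]
print(round(pearson_r(rater_a, rater_b), 3))  # prints 0.98
```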

Students’ self-perceived efficacy of their enhanced engagement

Participants were asked to write a reflective essay assessing the effectiveness of their engaged formative assessment of writing, specifically through the use of the digital feedback booklet designed to enhance engagement. This qualitative reflection sought to capture students' perceptions of their experiences with the booklet, providing a nuanced understanding of its impact on their writing engagement. The reflections were organized around three main themes: merits, limitations, and suggestions for improvement.

To facilitate this reflection, three prompts were used to elicit detailed responses from students about their experiences with the feedback booklet. Participants were encouraged to consider the effectiveness of the feedback received, the usability of the booklet, and their overall engagement in the writing process. Each prompt aimed to guide students in articulating their thoughts and feelings about their interactions with the tool.

The first theme, merits, captured the benefits students attributed to the feedback booklet. The second theme, limitations, highlighted the challenges students encountered while using it. The third theme, suggestions for improvement, focused on students’ recommendations for enhancing the engagement feedback booklet.

A thematic analysis of students' responses in their reflective essays, which assessed the effectiveness of the student engagement enhancement tool in the formative assessment of their writing, was conducted using Atlas.ti. This analysis utilized open coding and axial coding to develop an in-depth understanding of student feedback and establish categories that highlighted the tool's merits, limitations, and areas for improvement.

During the open coding phase, 95 quotations were identified and analyzed to generate initial codes based on the raw data from the students' reflective essays. These codes were then examined to identify recurring patterns and direct statements related to the students' experiences with the tool. This process allowed us to capture nuanced perspectives on how students engaged with the digitalized feedback enhancement tool and perceived its impact on their writing process.

Axial coding was subsequently employed to link the open codes and group them into larger, interconnected categories that provided a cohesive structure to the analysis. This step enabled the consolidation of related codes into three overarching themes: merits (with 9 subthemes), limitations (comprising 3 subthemes), and suggestions for the ongoing development of the feedback engagement enhancement booklet (with 3 subthemes).

The analysis shed light on the limitations and suggestions for improving the digitalized feedback enhancement tool's functionality.

Research design and analysis

In this study, we employed a situated multiple analysis approach, gathering data before, during, and after the implementation of the feedback engagement enhancement tool. This method enabled us to track the evolving nature of engagement, capturing changes in engagement dimensions and practices over time. Each analysis unit comprised students' demonstrations of agentive engagement in their writing, interactions with the feedback tool, and revisions across five treatment sessions. Table 5 provides an in-depth overview of the research phases and corresponding analyses linked to the research questions. This approach allowed us to explore how student engagement evolved throughout the study's stages.

Table 5 Research design and Data analysis Procedure

Results

Student engagement dimensions in formative assessment of their writing

We employed a Repeated Measures ANOVA to examine how students’ agentive engagement dimensions were enhanced during the formative assessment of their writing across five writing tasks. The results of the Repeated Measures ANOVA are presented in Table 6. These results (F (1, 19.05) = 41.40, p < 0.05, pη2 = 0.685, indicating a large effect size) demonstrated significant differences in students' cognitive, behavioral, and emotional engagement dimensions across the five writing tasks.
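
For readers unfamiliar with the procedure, a one-way repeated-measures ANOVA can be computed from first principles as below. The scores are invented for illustration (the study analyzed a larger sample across five tasks), and this is a sketch of the statistic, not the authors' actual analysis pipeline.

```python
# One-way repeated-measures ANOVA from first principles. Hypothetical
# engagement scores for 4 students across 3 tasks; rows = subjects,
# columns = conditions (tasks).
def rm_anova(scores):
    n = len(scores)          # subjects
    k = len(scores[0])       # conditions
    grand = sum(sum(row) for row in scores) / (n * k)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_cond = n * sum((sum(row[j] for row in scores) / n - grand) ** 2
                      for j in range(k))
    ss_subj = k * sum((sum(row) / k - grand) ** 2 for row in scores)
    ss_err = ss_total - ss_cond - ss_subj   # residual after removing subjects
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    partial_eta_sq = ss_cond / (ss_cond + ss_err)
    return f, partial_eta_sq

scores = [[2, 4, 6],   # each row: one student's engagement across tasks
          [1, 3, 5],
          [3, 5, 7],
          [2, 5, 6]]
f, eta = rm_anova(scores)
```

For these toy data the partitioning yields F(2, 6) ≈ 193.0 with partial η² ≈ 0.98; the F and ηp² values reported in Table 6 come from the study's own data.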

Table 6 Tests of Within-Subjects Effects

As illustrated in Fig. 1, all three engagement dimensions exhibited increasing trends from the first to the fifth task, although the rate of increase in means was slightly slower for the emotional attribute.

Fig. 1
figure 1

Means on student engagement dimensions by five tasks

Students’ agentive engagement practices in formative assessment of their writing

A chi-square analysis (crosstabs) was conducted to examine both the nature and variability of students' agentive practices throughout the treatment. The results, presented in Table 7 and Fig. 2, indicate that students' proactive agentive engagement practices increased consistently from the first to the fifth task, with percentages of 4.9%, 13.3%, 16.8%, 24.5%, and 40.5%, respectively. Notably, none of the standardized residuals exceeded the range of ± 1.96.
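
The crosstab logic can be illustrated with a small pure-Python sketch. The counts below are invented (the study's actual frequencies appear in Table 7); the residual formula shown is the standard Pearson standardized residual.

```python
# Chi-square crosstab sketch with hypothetical counts of proactive vs.
# reciprocal engagement practices across five tasks (not the study's data).
def chi_square(table):
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    expected = [[rt * ct / n for ct in col_tot] for rt in row_tot]
    chi2 = sum((table[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(len(table)) for j in range(len(table[0])))
    # Standardized residuals: (observed - expected) / sqrt(expected);
    # values beyond ±1.96 flag cells departing notably from independence.
    resid = [[(table[i][j] - expected[i][j]) / expected[i][j] ** 0.5
              for j in range(len(table[0]))] for i in range(len(table))]
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df, resid

# Rows: proactive / reciprocal; columns: tasks 1-5.
observed = [[8, 22, 28, 40, 66],
            [9, 15, 16, 21, 36]]
chi2, df, resid = chi_square(observed)
```

For these invented counts all residuals fall within ±1.96, mirroring the pattern reported above.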

Table 7 Frequencies, Percentages, and Standardized Residuals of Agentive Engagement Practices
Fig. 2
figure 2

Students' proactive engagement practices

The findings from Table 7 and Fig. 3 revealed that the percentages of reciprocal agentive engagement practices also increased gradually across the five tasks, with values of 9.2%, 15.5%, 16.7%, 21.3%, and 37.4%. All standardized residuals fell within the range of ± 1.96.

Fig. 3
figure 3

Students' reciprocal engagement practices

The most frequently observed proactive student engagement practices included reflection (from the cognitive dimension), self-initiated adjustments of external or personal goal setting (from the behavioral dimension), and affect regulation (from the emotional dimension). While reciprocal practices were less prevalent than proactive ones, the analysis revealed a gradual increase in reciprocal practices as the course progressed. Among the reciprocal behaviors, providing input and offering diagnostic feedback were the most frequently encountered. In contrast, practices such as students requesting assistance from teachers and peers when encountering challenges, receiving support from one another through clarification requests and confirmation checks, and expressing preferences were the least frequently observed reciprocal regulatory practices.

The relation between students’ engaged writing and their writing quality

We conducted a Repeated Measures ANOVA to assess students' performance across the five writing tasks. The descriptive statistics for the students' performance in these tasks are presented in Table 8. The results demonstrate an increase in their mean scores as they progressed through the five tasks, as depicted in Fig. 4.

Table 8 Descriptive Statistics for Five Writing Tasks
Fig. 4
figure 4

Summative grades on writing quality

Table 9 presents the primary outcomes of the repeated measures ANOVA. The results (F (2.31, 48.60) = 596.99, p < 0.05, pη2 = 0.966, indicating a large effect size), indicate significant variations in students' performance across the five writing tasks. For a detailed breakdown of the results, please refer to Fig. 6, which displays the outcomes of the post-hoc comparison test.

Table 9 Tests of Within-Subjects Effects Writing Tasks

Students’ self-perceived efficacy of their enhanced engagement

The inductive coding of students' assessments of the effectiveness of their engaged writing formative assessment via the digitalized engagement enhancement feedback booklet revealed their recognition of both its advantages and disadvantages, as well as their suggestions for further improvement. In total, 95 quotations were categorized into three overarching themes: merits (with 9 subthemes), limitations (comprising 3 subthemes), and suggestions for the ongoing development of the feedback engagement enhancement booklet (with 3 subthemes). Table 10 presents the codes and sample quotations related to students' evaluations.

Table 10 Students’ reflective evaluation of Digitalized Feedback engagement enhancement tool

Discussion

This study aimed to empower students as critical evaluators of their writing by exploring their agentic engagement. The implementation of a digitalized feedback engagement tool facilitated the cultivation of students' agency. Throughout the study, students' engagement dimensions and practices were closely observed as they assessed their writing. Their active participation in writing, interactions with the feedback tool, and revisions over five sessions were meticulously documented.

Results highlighted significant engagement across cognitive, behavioral, and emotional dimensions, with cognitive engagement being notably prominent and emotional engagement ranking lowest. Moreover, students exhibited a proclivity toward proactive rather than reciprocal practices, employing a spectrum of proactive behaviors associated with diverse engagement dimensions. This study yields valuable insights into how students can actively shape and refine their writing through proactive engagement practices. Notably, the most frequently observed proactive practices encompassed reflection (from the cognitive dimension), self-initiated adjustments of external or personal goal setting (from the behavioral dimension), and affect regulation (from the emotional dimension). Although reciprocal practices were comparatively less prevalent, the analysis revealed a gradual increase in their occurrence as the course progressed. Among reciprocal behaviors, providing input and offering diagnostic feedback were the most frequently encountered. In contrast, practices such as students seeking assistance from teachers and peers when facing challenges, receiving support from one another through clarification requests and confirmation checks, and expressing preferences were the least frequently observed reciprocal regulatory practices.

Incorporating both quantitative and qualitative data, this study uncovered that learners employed a variety of cognitive and metacognitive strategies to engage with peer feedback and improve their drafts, indicating significant cognitive and behavioral engagement. This aligns with previous research showing that learners use strategies to manage feedback and regulate their writing (e.g., Fan & Xu, 2020). By employing these strategies, learners demonstrated agency and self-regulation in processing feedback and making revisions, moving from other-regulated to self-regulated learning (Chen et al., 2022). These students showcased a clear understanding of the benefits of these practices, especially when faced with challenges in online learning, actively striving to maintain their engagement. These insights resonate with recent studies highlighting the significance of formative assessment in fostering self-regulated learning (Luckritz Marquis, 2021; Teng, 2022). Learners actively participating in formative assessment tasks develop crucial metacognitive skills, including reflection, feedback utilization, planning, goal setting, and the implementation of various strategies to enhance their learning experience. The findings of the present study align with existing research emphasizing that students' engagement across various dimensions is influenced by instructional design and the chosen learning modalities (Bhardwaj et al., 2021). As this study highlights, and as corroborated by Henrie et al. (2015), cognitive and behavioral engagement primarily involve the actions taken by learners. When students create artifacts stemming from reflection, interpretation, synthesis, or elaboration, significant learning outcomes can follow.

Shernoff (2013) suggests that optimal learning environments, characterized by student-generated artifacts, clear objectives, teacher feedback, and interactive learning, promote cognitive and behavioral engagement in online settings. While this study's findings indicate no significant improvement in emotional engagement, previous research indicates that affective engagement tends to rise during academic activities, especially when students feel empowered to showcase their skills, collaborate, and receive feedback from peers and adults. These insights emphasize the importance of fostering positive emotional responses within academic contexts. The variance in emotional engagement observed here could be attributed to deep learning technologies that enable monitoring of students' real-time emotions, such as anger, disgust, fear, happiness, sadness, and surprise; such monitoring may shift emphasis toward strategies such as critical understanding of new concepts, connecting new ideas, and linking new knowledge to previous knowledge (Bhardwaj et al., 2021). Another plausible explanation, suggested by Reeve et al. (2020a), is the need to reevaluate how we conceptualize student engagement. They propose that emotional engagement can invigorate, sustain, amplify, deplete, diminish, or even terminate other aspects of engagement. Emotion might thus need to be redefined not as a distinct facet of engagement but as a variable that predicts changes in other facets. Furthermore, research has indicated that emotional engagement struggles to independently account for variance in measures of academic progress, including academic achievement (Dierendonck et al., 2020; Gutiérrez & Tomás, 2019).
While emotional engagement correlates positively and significantly with both academic achievement and the other engagement components, it consistently fails to predict independent variance in achievement (Dierendonck et al., 2020; Gutiérrez & Tomás, 2019). Emotion seems to interact with, rather than independently determine, academic progress, suggesting a need to reconsider its role in understanding student engagement and its impact on academic success. A further plausible explanation for the insignificant change in emotional engagement is the novelty effect (Dubovi, 2022). The novelty effect in technology-integrated education refers to the initial excitement or heightened interest that students and educators may experience when introduced to a new technology or educational tool. This effect is typically temporary: as the novelty wears off, users become accustomed to the technology, and its impact on engagement and motivation may decrease.

The distinctions in learning outcomes, however, tend to blur across different learning environments. Many current challenges appear to stem from the need to establish practical connections between research on student academic engagement and effective online course design (Czerkawski & Lyman, 2016). This highlights the importance of designing courses that translate engagement theory into practical, applicable strategies. In online learning environments, successful learning relies heavily on how students engage with feedback: individual feedback engagement not only facilitates the application of cognitive and metacognitive strategies but also supports self-regulation and sustained motivation. The effectiveness of feedback processes can determine how well students adapt to and thrive in these environments, underscoring the need for online course structures that foster meaningful, interactive feedback mechanisms to support learning outcomes.

Conclusion and implications

This study explored students' agentic engagement in the learning process, emphasizing the empowerment of students as critical assessors of their writing through extensive data collection. A key instrument employed for this purpose was the Digitalized Engagement Enhancement Tool (DEET), utilized as a student feedback assessment portfolio in an academic writing course. Through the DEET, students were encouraged to document, analyze, strategize actionable steps, and reflect on the feedback received from both their teacher and peers. The results highlighted that students' proactive and reciprocal engagement practices played a significant role in enhancing writing quality. The study suggests that agentic engagement signifies students' constructive contributions to the instructional flow they receive. These proactive and reciprocal actions are pivotal not only for academic progress but also for shaping a more supportive and interactive learning environment. Additionally, agentic engagement serves as an explanatory framework for understanding students' academic progress and how various dimensions of student engagement synergistically operate for optimal learning outcomes. Moreover, the DEET shows promise in expanding access to quality learning resources.

To cultivate a supportive learning environment, educators, practitioners, and researchers must consider pedagogical, social, and technical elements. Embracing digital tools for formative assessment calls for adaptable, engaging, and student-centered evaluation methods, leveraging technology for self-reflection, peer collaboration, and proactive engagement. This transcends mere course digitization, emphasizing instructors' role in anticipating and addressing online learning challenges. Educators should consciously enhance student motivation and engagement, integrating it into course planning.
Hence, adopting an ecosystem perspective becomes crucial, considering circumstances and factors influencing student engagement. Empowering students' agentive engagement through formative assessment involves fostering a proactive and self-directed learning approach. By integrating formative assessment strategies, educators can provide timely feedback, encourage reflection, and promote student involvement in the learning process. This approach enhances students' ability to take ownership of their learning journey, fostering a sense of agency and self-efficacy. Through formative assessment, educators create an environment where students actively participate in setting learning goals, monitoring their progress, and adapting their strategies, ultimately contributing to a more empowered and engaged student.

Despite these promising findings, one notable limitation of this study pertains to the evaluation of group members, which may not have fully captured the individual contributions of each team member. When evaluating group dynamics, it can be difficult to appropriately assign credit and accountability to each student within a collaborative setting, potentially impeding the effective structuring of collaborative activities. The difficulty lies in delineating individual responsibilities while maintaining a conducive environment for collective engagement, emphasizing the need for strategies that balance collaborative efforts with individual accountability. This limitation underscores the importance of creating interactive learning environments that balance collaboration with individual assessment. To address this issue, there is a growing demand for software platforms that not only facilitate collaborative activities but also enable the assessment of both collaborative work and individual contributions, providing a more comprehensive and accurate representation of each student's involvement in group tasks.

Similarly, while the present findings emphasize the use of students' actual performances over self-reports, further research is needed to improve the measurability of this complex construct (Dubovi, 2022). O'Brien et al. (2022) propose more holistic measures that treat students' engagement as an ongoing cycle of engagement, disengagement, and re-engagement. Recent advancements in data-capturing devices suggest adopting a multimodal approach, combining psycho-physiological signals, students' actual performances, and self-report data streams. Integrating multiple measures of engagement is crucial for identifying the causes of disengagement, with the ultimate goal of aiding learners in re-engaging (Sharma & Giannakos, 2020).

Availability of data and materials

Data are available upon request via anonymous email.

Abbreviations

DEET:

Digitalized Engagement Enhancement Tool

FREs:

Feedback Related Episodes

References

  • Adams, A.-M., Wilson, H., Money, J., Palmer-Conn, S., & Fearn, J. (2020). Student engagement with feedback and attainment: The role of academic self-efficacy. Assessment & Evaluation in Higher Education, 45(2), 317–329.

    Article  Google Scholar 

  • Akhmedov, B. A. (2022). Analysis of the Reliability of the Test form of Knowledge Control in Cluster Education. Psychology and Education, 59(2), 403–418.

    MATH  Google Scholar 

  • Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior, 63, 704–716.

    Article  Google Scholar 

  • Bagheri, M., & Zenouzagh, Z. M. (2021). Comparative study of the effect of face-to-face and computer mediated conversation modalities on student engagement: Speaking skill in focus. Asian-Pacific Journal of Second and Foreign Language Education, 6(1), 1–23.

    Article  Google Scholar 

  • Bakker, A. B., Sanz Vergel, A. I., & Kuntze, J. (2015). Student engagement and performance: A weekly diary study on the role of openness. Motivation and Emotion, 39, 49–62.

    Article  Google Scholar 

  • Barrot, J. S. (2021). Effects of Facebook-based e-portfolio on ESL learners’ writing performance. Language, Culture and Curriculum, 34(1), 95–111.

    Article  Google Scholar 

  • Bhardwaj, P., Gupta, P., Panwar, H., Siddiqui, M. K., Morales-Menendez, R., & Bhaik, A. (2021). Application of Deep Learning on Student Engagement in e-learning environments. Computers & Electrical Engineering, 93, 107277.

    Article  Google Scholar 

  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

    Article  MATH  Google Scholar 

  • Chen, P.-S.D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54(4), 1222–1232.

    Article  MATH  Google Scholar 

  • Chen, Q., & Li, H. (2021). Formative assessment in China and its effects on EFL learners’ learning achievement: A meta-analysis from policy transfer perspective. The Educational Review, USA, 5(9), 355–366.

    Article  MATH  Google Scholar 

  • Chen, J., Zhang, L. J., & Chen, X. (2022). L2 learners’ self-regulated learning strategies and self-efficacy for writing achievement: A latent profile analysis. Language Teaching Research, 13621688221134967.

  • Chiu, T. K. (2021). Applying the self-determination theory (SDT) to explain student engagement in online learning during the COVID-19 pandemic. Journal of Research on Technology in Education, 6, 1–17.

    MATH  Google Scholar 

  • Chiu, T. K. F. (2022). Applying the self-determination theory (SDT) to explain student engagement in online learning during the COVID-19 pandemic. Journal of Research on Technology in Education, 54(sup1), S14–S30. https://doi.org/10.1080/15391523.2021.1891998

    Article  MATH  Google Scholar 

  • Choi, H., Winne, P. H., Brooks, C., Li, W., & Shedden, K. (2023). Logs or Self-Reports? Misalignment Between Behavioral Trace Data and Surveys When Modeling Learner Achievement Goal Orientation. LAK23: 13th International Learning Analytics and Knowledge Conference

  • Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering student engagement in online learning environments. TechTrends, 60(6), 532–539.

    Article  Google Scholar 

  • Dierendonck, C., Milmeister, P., Kerger, S., & Poncelet, D. (2020). Examining the measure of student engagement in the classroom using the bifactor model: Increased validity when predicting misconduct at school. International Journal of Behavioral Development, 44(3), 279–286.

    Article  Google Scholar 

  • Dubovi, I. (2022). Cognitive and emotional engagement while learning with VR: The perspective of multimodal methodology. Computers & Education, 183, 104495. https://doi.org/10.1016/j.compedu.2022.104495

    Article  MATH  Google Scholar 

  • Esmaeili, Z., Farajollahi, M., Saeedipour, B., & Taheri Otaghsara, H. (2018). The Explanation of the Components of the E-Learning System and its Relationship with the Satisfaction of Faculty Members in Payame Noor University. Education Strategies in Medical Sciences, 11(1), 157–171.

    Google Scholar 

  • Fan, Y., & Xu, J. (2020). Exploring student engagement with peer feedback on L2 writing. Journal of Second Language Writing, 50, 100775.

    Article  MATH  Google Scholar 

  • Fuller, K. (2017). Beyond reflection: Using ePortfolios for formative assessment to improve student engagement in non-majors introductory science. The American Biology Teacher, 79(6), 442–449.

    Article  Google Scholar 

  • Gilbuena, D., Sherrett, B. U., & Koretsky, M. (2011). Episodes as a discourse analysis framework to examine feedback in an industrially situated virtual laboratory project. in 2011 ASEE Annual Conference & Exposition.

  • Guo, J.-P., Lv, S., Wang, S.-C., Wei, S.-M., Guo, Y.-R., & Yang, L.-Y. (2023). Reciprocal modeling of university students’ perceptions of the learning environment, engagement, and learning outcome: A longitudinal study. Learning and Instruction, 83, 101692. https://doi.org/10.1016/j.learninstruc.2022.101692

    Article  MATH  Google Scholar 

  • Gutiérrez, M., & Tomás, J. M. (2019). The role of perceived autonomy support in predicting university students’ academic success mediated by academic self-efficacy and school engagement. Educational Psychology, 39(6), 729–748.

    Article  MATH  Google Scholar 

  • Heilporn, G., Lakhal, S., & Bélisle, M. (2021). An examination of teachers’ strategies to foster student engagement in blended learning in higher education. International Journal of Educational Technology in Higher Education, 18(1), 1–25.

    Article  Google Scholar 

  • Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53.

    Article  MATH  Google Scholar 

  • Huang, Y., & Wang, S. (2023). How to motivate student engagement in emergency online learning? Evidence from the COVID-19 situation. Higher Education, 85(5), 1101–1123. https://doi.org/10.1007/s10734-022-00880-2

    Article  MATH  Google Scholar 

  • Jacoby, C., Heugh, J., Bax, S., & Branford-White, C. (2014). Enhancing learning through formative assessment. Innovations in Education and Teaching International, 51(1), 72–83.

    Article  Google Scholar 

  • Kahn, P., Everington, L., Kelm, K., Reid, I., & Watkins, F. (2017). Understanding student engagement in online learning environments: The role of reflexivity. Educational Technology Research and Development, 65(1), 203–218.

    Article  Google Scholar 

  • Lavidas, K., Papadakis, S., Manesis, D., Grigoriadou, A. S., & Gialamas, V. (2022). The Effects of Social Desirability on Students’ Self-Reports in Two Social Contexts: Lectures vs. Lectures and Lab Classes. Information, 13(10), 491.

    Google Scholar 

  • Lee, I., Mak, P., & Yuan, R. E. (2019). Assessment as learning in primary writing classrooms: An exploratory study. Studies in Educational Evaluation, 62, 72–81.

    Article  Google Scholar 

  • Lipnevich, A. A., Panadero, E., & Calistro, T. (2023). Unraveling the effects of rubrics and exemplars on student writing performance. Journal of Experimental Psychology: Applied, 29(1), 136.

    Google Scholar 

  • López-Crespo, G., Blanco-Gandía, M. C., Valdivia-Salas, S., Fidalgo, C., & Sánchez-Pérez, N. (2022). The educational e-portfolio: Preliminary evidence of its relationship with student’s self-efficacy and engagement. Education and Information Technologies, 27, 1–16.

    Article  MATH  Google Scholar 

  • Luckritz Marquis, T. (2021). Formative assessment and scaffolding online learning. New Directions for Adult and Continuing Education, 2021(169), 51–60.

    Article  MATH  Google Scholar 

  • Macfarlane, B., & Tomlinson, M. (2017). Critical and alternative perspectives on student engagement. Higher Education Policy, 30(1), 1–4.

    Article  MATH  Google Scholar 

  • Mallary, J. C. (2023). Promoting Student Engagement, Empowerment, and Agency Using Formative Assessment: A Phenomenological Study. The University of West Florida.

    MATH  Google Scholar 

  • Mao, Z., & Lee, I. (2023). Student Engagement with Written Feedback: Critical Issues and Way Forward. RELC Journal, 00336882221150811.

  • Mohammadi Zenouzagh, Z., Admiraal, W., & Saab, N. (2023). Learner autonomy, learner engagement and learner satisfaction in text-based and multimodal computer mediated writing environments. Education and Information Technologies, 67, 1–41.

    Google Scholar 

  • Ngui, W., Pang, V., & Hiew, W. (2022). E-portfolio as an academic writing assessment tool in higher education: Strengths and challenges. Indonesian Journal of Applied Linguistics, 12(2), 556–568.

    Article  Google Scholar 

  • O’Brien, H. L., Roll, I., Kampen, A., & Davoudi, N. (2022). Rethinking (Dis) engagement in human-computer interaction. Computers in Human Behavior, 128, 107109.

    Article  MATH  Google Scholar 

  • Pilotti, M., Anderson, S., Hardy, P., Murphy, P., & Vincent, P. (2017). Factors related to cognitive, emotional, and behavioral engagement in the online asynchronous classroom. International Journal of Teaching and Learning in Higher Education, 29(1), 145–153.

    Google Scholar 

  • Reeve, J. (2013). How students create motivationally supportive learning environments for themselves: The concept of agentic engagement. Journal of Educational Psychology, 105(3), 579.

    Article  MathSciNet  MATH  Google Scholar 

  • Reeve, J., Cheon, S. H., & Jang, H. (2020a). How and why students make academic progress: Reconceptualizing the student engagement construct to increase its explanatory power. Contemporary Educational Psychology, 62, 101899.

    Article  Google Scholar 

  • Reeve, J., Cheon, S. H., & Jang, H. (2020b). How and why students make academic progress: Reconceptualizing the student engagement construct to increase its explanatory power. Contemporary Educational Psychology, 62, 101899.

    Article  Google Scholar 

  • Sharma, K., & Giannakos, M. (2020). Multimodal data capabilities for learning: What can multimodal data tell us about learning? British Journal of Educational Technology, 51(5), 1450–1484.

    Article  MATH  Google Scholar 

  • Shernoff, D. J., & Shernoff, D. J. (2013). Introduction: Towards optimal learning environments in schools. Optimal learning environments to promote student engagement, pp. 1–24.

  • Sullivan, P., McBrayer, J. S., Miller, S., & Fallon, K. (2021). An Examination of the use of computer-based formative assessments. Computers & Education, 173, 104274.

    Article  MATH  Google Scholar 

  • Teng, L. S. (2022). Explicit strategy-based instruction in L2 writing contexts: A perspective of self-regulated learning and formative assessment. Assessing Writing, 53, 100645.

    Article  MATH  Google Scholar 

  • Veerasamy, A. K., Laakso, M.-J., & D’Souza, D. (2021). Formative assessment tasks as indicators of student engagement for predicting at-risk students in programming courses. Informatics in Education. https://doi.org/10.15388/infedu.2022.15

    Article  MATH  Google Scholar 

  • Veerasamy, A. K., Laakso, M.-J., & D’Souza, D. (2022). Formative assessment tasks as indicators of student engagement for predicting at-risk students in programming courses. Informatics in Education, 21(2), 375–393.


  • Wallwey, C., & Kajfez, R. L. (2023). Quantitative research artifacts as qualitative data collection techniques in a mixed methods research study. Methods in Psychology, 8, 100115.


  • Wang, L., & Lee, I. (2021). L2 learners’ agentic engagement in an assessment as learning-focused writing classroom. Assessing Writing, 50, 100571.


  • Winstone, N. E., Nash, R. A., Rowntree, J., & Parker, M. (2017). ‘It’d be useful, but I wouldn’t use it’: Barriers to university students’ feedback seeking and recipience. Studies in Higher Education, 42(11), 2026–2041.


  • Wu, H. K., & Huang, Y. L. (2007). Ninth-grade student engagement in teacher-centered and student-centered technology-enhanced learning environments. Science Education, 91(5), 727–749.


  • Wu, X. M., Zhang, L. J., & Dixon, H. R. (2021). Implementing assessment for learning (AfL) in Chinese university EFL classes: Teachers’ values and practices. System, 101, 102589.


  • Xie, Q., & Cui, Y. (2021). Preservice teachers’ implementation of formative assessment in English writing class: Mentoring matters. Studies in Educational Evaluation, 70, 101019.


  • Yang, D., Wang, H., Metwally, A. H. S., & Huang, R. (2023). Student engagement during emergency remote teaching: A scoping review. Smart Learning Environments, 10(1), 24. https://doi.org/10.1186/s40561-023-00240-2


  • Zhang, Z. (2021). Promoting student engagement with feedback: Insights from collaborative pedagogy and teacher feedback. Assessment & Evaluation in Higher Education, 56, 1–16.


  • Zhang, Z. V., & Hyland, K. (2022). Fostering student engagement with feedback: An integrated approach. Assessing Writing, 51, 100586.



Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

Authors and Affiliations

Authors

Contributions

The authors of "Empowering Students’ Agentive Engagement through Formative Assessment in Online Learning Environment" contributed substantially to this research. Together, ZMZ, WA, and NS conceived the study's scope and objectives, designed the methodology, and integrated technology into the online learning environment. ZMZ conducted data collection, performed the statistical analyses, and interpreted the findings. All authors contributed to the literature review, manuscript drafting, critical review, and editing, each taking responsibility for distinct sections while collectively ensuring the cohesion and quality of the final manuscript. ZMZ, WA, and NS approved the submitted version and agreed both to be personally accountable for their contributions and to ensure that questions related to the accuracy or integrity of any part of the manuscript are appropriately investigated and resolved.

Corresponding author

Correspondence to Zohre Mohammadi Zenouzagh.

Ethics declarations

Ethics approval and consent to participate

Ethical standards were followed throughout this research.

This research involved human participants, and informed consent was obtained from all participants.

Consent for publication

The authors adhere to the journal's copyright and publication policies.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Mohammadi Zenouzagh, Z., Admiraal, W. & Saab, N. Empowering students’ agentive engagement through formative assessment in online learning environment. Int J Educ Technol High Educ 22, 9 (2025). https://doi.org/10.1186/s41239-024-00498-7


Keywords