Exploring the Effects of an Automated Writing Evaluation Tool on Metacognitive Engagement in Persuasive Writing

Authors

  • Xiaolan Wang
  • Supyan Hussin
  • Azlina Abdul Aziz

Keywords

automated writing evaluation; metacognitive awareness; persuasive writing; EFL learners; self-regulated learning

Abstract

This study explored the impact of automated writing evaluation (AWE) on the development of metacognitive awareness in English as a foreign language (EFL) students’ persuasive writing within Chinese higher education. Grounded in self-regulated learning theory and Flavell’s (1979) metacognitive framework, the research investigated how AWE influences students’ abilities to plan, monitor, and evaluate their writing processes. Using a single-group mixed-methods design over a 16-week intervention period, the study collected data from 100 students through the Metacognitive Awareness Writing Questionnaire (MAWQ), reflective journals, and post-intervention interviews with a randomly selected subset of 10 participants. Quantitative results revealed negligible overall gains in metacognitive awareness but a recalibration of self-perceptions in areas such as planning and conditional knowledge. In contrast, qualitative data offered a more nuanced view: students reported increased attention to text structure and grammar and demonstrated selective adoption of AWE feedback. However, many expressed uncertainty when faced with ambiguous or overly general suggestions, highlighting the ongoing need for teacher support. These findings suggest that while AWE tools such as PIGAI may effectively facilitate surface-level revisions, their capacity to foster deeper metacognitive engagement is limited without instructional scaffolding. To enhance pedagogical outcomes, it is recommended that AWE systems be integrated into a broader instructional framework, supported by explicit strategy training. Incorporating clearer rubrics and more contextualized, explanation-rich feedback may further promote students’ independent and strategic engagement with the writing process.

https://doi.org/10.26803/ijlter.24.10.30

References

Al Mamari, B. K. S. (2020). Bringing innovation to EFL writing through a focus on formative e-assessment: Omani post-basic education students’ experiences of and perspectives on automated writing evaluation (AWE) [Doctoral dissertation, University of Exeter]. ProQuest Dissertations Publishing. https://doi.org/10.13140/RG.2.2.27034.77763

Aull, L. (2023). Language patterns in secondary and postsecondary student writing. In D. West Brown & D. Z. Wetzel (Eds.), Corpora and rhetorically informed text analysis: The diverse applications of DocuScope (pp. 94–118). John Benjamins Publishing Company. https://doi.org/10.1075/scl.109

Bai, L., & Hu, G. (2016). In the face of fallible AWE feedback: How do students respond? Educational Psychology, 37(1), 67–81. https://doi.org/10.1080/01443410.2016.1223275

Barkaoui, K. (2024). Examining performance on an integrated writing task from a Canadian English language proficiency test. The Canadian Modern Language Review, 80(2), 77–115. https://doi.org/10.3138/cmlr-2023-0022

Boud, D., & Falchikov, N. (1989). Quantitative studies of student self-assessment in higher education: A critical analysis of findings. Higher Education, 18(5), 529–549. https://doi.org/10.1007/BF00138746

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Cotos, E. (2023). Automated feedback on writing. In O. Kruse, C. Rapp, C. M. Anson, K. Benetos, E. Cotos, A. Devitt, & A. Shibani (Eds.), Digital writing technologies in higher education (pp. 347–364). Springer. https://doi.org/10.1007/978-3-031-36033-6_22

Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches (5th ed.). SAGE Publications.

Efklides, A. (2011). Interactions of metacognition with motivation and affect in self-regulated learning: The MASRL model. Educational Psychologist, 46(1), 6–25. https://doi.org/10.1080/00461520.2011.538645

Etikan, I., Musa, S. A., & Alkassim, R. S. (2016). Comparison of convenience sampling and purposive sampling. American Journal of Theoretical and Applied Statistics, 5(1), 1–4. https://doi.org/10.11648/j.ajtas.20160501.11

Farahian, M. (2017). Developing and validating a metacognitive writing questionnaire for EFL learners. Issues in Educational Research, 27(4), 736–750. https://search.informit.org/doi/10.3316/ielapa.218613409807923

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906–911. https://doi.org/10.1037/0003-066X.34.10.906

Fu, Q.-K., Zou, D., Xie, H., & Cheng, G. (2024). A review of AWE feedback: Types, learning outcomes, and implications. Computer Assisted Language Learning, 37(1–2), 179–221. https://doi.org/10.1080/09588221.2022.2033787

Gavina, M. A., & Ibay-Pamo, M. (2024). Effect of automated writing evaluation in higher education academic writing performance. EDULANGUE, 6(2), 118–137. https://doi.org/10.20414/edulangue.v6i2.8300

Kellogg, R. T. (2022). [Review of the book Executive function and writing, edited by T. Limpo & T. Olive]. Journal of Writing Research, 13(3), 473–479. https://doi.org/10.17239/jowr-2022.13.03.05

Koltovskaia, S. (2020). Student engagement with automated written corrective feedback (AWCF) provided by Grammarly: A multiple case study. Assessing Writing, 44, Article 100450. https://doi.org/10.1016/j.asw.2020.100450

Link, S., Dursun, A., Karakaya, K., & Hegelheimer, V. (2014). Towards best ESL practices for implementing automated writing evaluation. CALICO Journal, 31(3), 323–344. http://www.jstor.org/stable/calicojournal.31.3.323

Ramadhanti, D., & Yanda, D. P. (2021). Students’ metacognitive awareness and its impact on writing skill. International Journal of Language Education, 5(3), 193–206. https://doi.org/10.26858/ijole.v5i3.18978

Ranalli, J. (2018). Automated written corrective feedback: How well can students make use of it? Computer Assisted Language Learning, 31(7), 653–674. https://doi.org/10.1080/09588221.2018.1428994

Ranalli, J. (2021). L2 student engagement with automated feedback on writing: Potential for learning and issues of trust. Journal of Second Language Writing, 52, Article 100816. https://doi.org/10.1016/j.jslw.2021.100816

Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7(4), 351–371. https://doi.org/10.1007/BF02212307

Shokri, M. A., & Mousavi, K. (2024). Enhancing writing pedagogy: An exploration of metacognitive awareness raising strategies on creativity and critical thinking in writing courses. Bayan College International Journal of Multidisciplinary Research, 4(1), 11–41. http://bayancollegeijmr.com/index.php/ijmr/article/view/148

Stevenson, M., & Phakiti, A. (2019). Automated feedback and second language writing. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 125–142). Cambridge University Press.

Teng, M. F. (2019). The role of metacognitive knowledge and regulation in mediating university EFL learners’ writing performance. Innovation in Language Learning and Teaching, 14(5), 436–450. https://doi.org/10.1080/17501229.2019.1615493

Teng, M. F. (2024). ChatGPT is the companion, not enemies: EFL learners’ perceptions and experiences in using ChatGPT for feedback in writing. Computers and Education: Artificial Intelligence, 7, Article 100270. https://doi.org/10.1016/j.caeai.2024.100270

Teng, M. F., & Yue, M. (2023). Metacognitive writing strategies, critical thinking skills, and academic writing performance: A structural equation modeling approach. Metacognition and Learning, 18(1), 237–260. https://doi.org/10.1007/s11409-022-09328-5

Teng, M. F., Qin, C., & Wang, C. (2022). Validation of metacognitive academic writing strategies and the predictive effects on academic writing performance in a foreign language context. Metacognition and Learning, 17, 167–190. https://doi.org/10.1007/s11409-021-09278-4

Wilson, J., & Czik, A. (2016). Automated essay evaluation software in English language arts classrooms: Effects on teacher feedback, student motivation, and writing quality. Computers & Education, 100, 94–109. https://doi.org/10.1016/j.compedu.2016.05.004

Woodworth, J. (2022). Hybrid feedback: The efficacy of combining automated and teacher feedback for second language academic writing development [Doctoral dissertation, York University]. York University Institutional Repository. http://hdl.handle.net/10315/39657

Zhai, N., & Ma, X. (2022). The effectiveness of automated writing evaluation on writing quality: A meta-analysis. Journal of Educational Computing Research, 61(4), 875–900. https://doi.org/10.1177/07356331221127300

Zhang, J., & Zhang, L. J. (2022). The effect of feedback on metacognitive strategy use in EFL writing. Computer Assisted Language Learning, 37(5–6), 1198–1223. https://doi.org/10.1080/09588221.2022.2069822

Zhang, Z. V., & Hyland, K. (2022). Fostering student engagement with feedback: An integrated approach. Assessing Writing, 51, Article 100586. https://doi.org/10.1016/j.asw.2021.100586

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), 64–70. https://doi.org/10.1207/s15430421tip4102_2

Published

2025-10-30