How Students Behave while Solving Critical Thinking Tasks in an Unconstrained Online Environment: Insights from Process Mining

Keywords: critical online reasoning, process mining, event logs, process patterns

Abstract

To learn successfully using various internet resources, students must acquire critical thinking skills that enable them to search for, evaluate, select, and verify information online. Defined as Critical Online Reasoning, this complex latent construct manifests itself in an unconstrained online environment and is measured at two levels: the student’s work product (an essay) and the process of task completion (patterns of online behaviour). This study employs process mining techniques to investigate whether students’ successful and unsuccessful test attempts can be distinguished. The findings are based on generalised behaviour patterns produced by a process mining algorithm applied to two groups of students (63 low-performing and 45 high-performing), divided by their work product scores. The two groups differed in their online behaviour, with high performers demonstrating more strategic behaviour and more effective searching for and use of information online. However, the study also exposed the limitations of process mining as a tool for generalising process patterns.

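The article itself does not include analysis code; the sketch below only illustrates the kind of group-wise process discovery the abstract describes. It assumes the open-source pm4py library and a hypothetical click-stream file events.csv with columns student_id, action, timestamp, and score_group; the study’s actual toolchain, algorithm settings, and event log format may well differ.

```python
# Illustrative sketch only, not the authors' pipeline.
# Assumes pm4py (pip install pm4py) and a hypothetical events.csv
# with columns: student_id, action, timestamp, score_group.
import pandas as pd
import pm4py

# Load the raw click-stream and map its columns onto pm4py's
# standard case / activity / timestamp keys.
df = pd.read_csv("events.csv", parse_dates=["timestamp"])
df = pm4py.format_dataframe(
    df, case_id="student_id", activity_key="action", timestamp_key="timestamp"
)

# Discover one model per performance group, mirroring the paper's
# split into low- and high-performing students.
for group in ("high", "low"):
    sub = df[df["score_group"] == group]
    # The Heuristics Miner discovers a dependency graph that
    # generalises the group's most frequent behaviour patterns;
    # the threshold value here is an arbitrary illustration.
    heu_net = pm4py.discover_heuristics_net(sub, dependency_threshold=0.7)
    pm4py.save_vis_heuristics_net(heu_net, f"{group}_performers.png")
```

Comparing the two discovered models side by side is one way to surface the behavioural differences between groups, though, as the abstract notes, such generalised patterns should be interpreted with caution.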
Published
2024-10-22
How to Cite
Beliaeva, Anastasia. 2024. “How Students Behave While Solving Critical Thinking Tasks in an Unconstrained Online Environment: Insights from Process Mining”. Voprosy Obrazovaniya / Educational Studies Moscow 1 (3). https://doi.org/10.17323/vo-2024-18051.
Section
Research Articles