Impact of Computer-Based Tests vs. Paper-Based Tests on Australian Students
Increasingly, Australian students are completing tests on computers, including major assessments of national literacy and numeracy progress. The shift is intended to "prepare students for the future" by familiarizing them with ubiquitous technology, according to The Conversation.
However, our two recent studies indicate that students might respond differently to test questions depending on whether they use computers or paper. This issue is especially relevant given recent NAPLAN results, which suggest a significant number of Australian students are struggling with basic English and math skills. NAPLAN has been fully online for two years in Years 3, 5, 7, and 9.
Our Research Findings
We analyzed 43 studies comparing computer-based and paper-based tests across 18 countries, including Australia, the United States, Germany, and the United Kingdom. Fourteen studies focused specifically on school-aged children. Generally, younger school students with lower computer skills performed better on paper. This disparity lessened as students aged.
In particular, scores were lower on computers for complex questions requiring multiple steps. This phenomenon is linked to the demands on working memory—the brain's capacity to juggle multiple pieces of information simultaneously, like a list of names and coffee orders. High cognitive load occurs when working memory is overloaded.
High cognitive load can occur when students are not accustomed to a specific computer, program, testing platform, or browser, or when the questions themselves are complex, requiring students both to figure out the answers and to remember how to use the computer.
Comparison of Paper and Computer Tests
In a 2023 study, we observed this "high cognitive load" even in high school students familiar with the computers used in a science test. Our study involved 263 Year 9 science students at a Perth school, each using their own device. These students completed a test on a computer and a very similar test on paper, with questions classified as "easy" and "hard."
Students scored significantly higher (by about 7%) on easy computer-based questions than on easy paper-based questions. Conversely, they performed significantly better (by about 12%) on hard questions on paper than on a computer. This suggests the computer testing mode can increase cognitive load, akin to a computer slowing down when too many programs are open.
This finding is consistent with a 2018 study of children's verbal skills. We also tested the Year 9 students' working memory capacity using number sequences; once we controlled for working memory capacity, the score differences between paper and computer disappeared. This implies students with lower working memory capacity are particularly disadvantaged in computer-based tests, a concern especially relevant to students with attention-deficit hyperactivity disorder (ADHD), typically one or two pupils per classroom.
Recommendations
Integrating computers into education is valuable, but computer-based tests are not equivalent to paper-based tests. Schools might consider:
- Allowing extra time for complex tasks or tests on a computer.
- Teaching word processing skills from an early age.
- Reducing digital distractions during tests and classwork, such as pop-ups and online games.