Low-tech Tests Shortchange High-tech Students, According to Study

(7-7-99) -- A new study confirms that writing tests administered via paper and pencil may significantly underestimate the capabilities of computer-savvy students, according to assessment specialists at the Center for the Study of Testing, Evaluation and Educational Policy at Boston College.

The study, published by the Education Policy Analysis Archives, compares groups of students taking open-ended (non-multiple choice) tests on paper to those taking the same tests on a computer.

Study results showed that, for students accustomed to writing on computer, responses composed on computer were substantially better than those written by hand. Specifically, for students who keyboard at about 20 words per minute or faster, taking open-ended language arts tests on paper significantly underestimates their level of achievement. For slower keyboarders, however, taking open-ended tests on computer adversely affects their performance.

These results confirmed those of an earlier CSTEEP study, in which the effects were so large that when tech-savvy students wrote on paper, only 30 percent performed at a "passing" level, but when they wrote on computers (without access to word processing tools such as spell check or grammar check), 67 percent "passed."

"The size of the effects was substantial," said CSTEEP Research Associate Mike Russell, author of both studies. "For the average student accustomed to working on computer, this difference could easily raise his or her score on the MCAS [Massachusetts Comprehensive Assessment System] test from the 'needs improvement' to the 'proficient' level."

Using items from the MCAS and the National Assessment of Educational Progress, the most recent study focused on language arts, science and math tests administered to approximately 230 eighth grade students with different levels of computer skill at two Worcester, Mass., schools. In addition, information on these students' prior computer use and keyboarding speed was collected. The earlier study had examined approximately 100 students participating in a computer-intensive project at the Advanced Learning Laboratory (ALL School) in Worcester (which also participated in the second study) and was published by Education Policy Analysis Archives in 1997.

The results of both studies suggest that the mode of test administration may have a substantial effect on students' performance on open-ended items -- findings that the researchers believe have significant implications for two critical areas: the increasing integration of technology into student learning, and overall educational assessment efforts.

Nearly 10 million students nationwide, about half of whom use computers in school, take some form of written state test each year. The researchers believe these study findings indicate that state paper-and-pencil tests may be underestimating the abilities of 2 million to 3 million of these students annually.

And yet, the researchers noted, students, teachers and schools are increasingly held accountable for student learning as gauged by handwritten test results -- which has pressured some school administrations into rather extreme measures. At the ALL School, for example, the earlier results prompted administrators to increase the amount of time students spent writing on paper and to decrease their time using computers. According to the researchers, that action is comparable to asking modern-day mathematicians to abandon calculators for slide rules so that they can perform better on tests that allow only slide rules.

"It's an understandable, but unfortunate, reaction to some important findings," said Professor Walt Haney of CSTEEP. "But, we need to ask, 'What's more important here, that students use traditional writing methods, or that tests measure their abilities regardless of the method the student prefers for writing?'"

Russell notes that there are several options available to schools to improve the situation. "The most logical solution in the short term is to simply recognize that there is a problem," he says, "and that scores from high-stakes state tests are not necessarily a good measure of a student's ability."

Adds Haney, "It's important to take other measures into consideration, such as transcripts and portfolio assessments."

[Coverage of the study appeared in the July 7 Boston Globe, and the study's authors wrote an opinion piece on the subject in the July 1 Christian Science Monitor.]
