
Feb. 17, 2005 • Volume 13 Number 11

Lynch School of Education researcher Michael Russell directed the USEIT study of computer use among schoolchildren.

Mixed Marks for Computers

BC, UMass-Lowell study finds computers don't always help kids on tests

By Patricia Delaney
Associate Director of Public Affairs

Students who use computers more for schoolwork than pleasure are likely to perform better on standardized tests, according to a new study by researchers at Boston College and the University of Massachusetts at Lowell.

Furthermore, students' class time spent using technology to create multimedia projects does not translate to better test performance, say the study's authors.

Analyzing the test performance and computer use of 986 fourth-grade students from 55 classrooms in nine Massachusetts school districts, the study found that the more regularly students used computers to write papers for school, the better they performed on the Massachusetts Comprehensive Assessment System (MCAS) English/Language Arts exam. This positive effect occurred even though students were not allowed to use computers for the test.

Conversely, the study found that students' recreational use of computers at home to play games, explore the Internet for fun, or chat with friends had a negative effect on their MCAS reading scores. Similarly, students' use of computers in class to create PowerPoint presentations or other multimedia projects was negatively associated with MCAS writing scores.

This study of students' MCAS performance is part of the three-year "Use, Support and Effect of Instructional Technology" (USEIT) study conducted by the Technology and Assessment Study Collaborative of the Lynch School of Education at Boston College. Funded by the US Department of Education, USEIT assesses educational technology across 22 Massachusetts school districts.

The MCAS achievement component of USEIT is the most sophisticated analysis to date of the relationships between students' computer use and test performance. Addressing several shortcomings of past research on this topic, the study collected detailed measures of a variety of student computer uses in and out of school while accounting for differences in home learning environments, in students' prior achievement, and in teachers' instructional practices.
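
For readers curious what such an analysis looks like in practice, the sketch below fits a mixed-effects regression of the general kind the paragraph describes: test scores modeled as a function of different computer uses, with controls for prior achievement and home environment, and with students grouped by classroom. It is a minimal illustration only, not the authors' actual model; all variable names and data are hypothetical placeholders, not the study's.

    # A minimal sketch, not the USEIT authors' model: students' scores regressed
    # on different computer uses, controlling for prior achievement and home
    # environment, with students nested in classrooms. Data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 986  # same number of students as the study; the values are made up
    df = pd.DataFrame({
        "classroom": rng.integers(0, 55, n),        # students nested in 55 classrooms
        "school_writing_use": rng.normal(size=n),   # computer use for writing schoolwork
        "recreational_use": rng.normal(size=n),     # games, web surfing, chat at home
        "prior_achievement": rng.normal(size=n),    # control variable
        "home_environment": rng.normal(size=n),     # control variable
    })
    # Arbitrary coefficients, used only so the synthetic outcome depends on the predictors.
    df["mcas_ela"] = (0.3 * df["school_writing_use"] - 0.2 * df["recreational_use"]
                      + 0.5 * df["prior_achievement"] + rng.normal(size=n))

    # Mixed-effects regression: random intercepts for classrooms capture the fact
    # that students who share a classroom are not independent observations.
    model = smf.mixedlm(
        "mcas_ela ~ school_writing_use + recreational_use"
        " + prior_achievement + home_environment",
        data=df,
        groups=df["classroom"],
    )
    print(model.fit().summary())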

At a time when standardized testing plays an increasingly important role in shaping students' learning experiences and teachers' instructional practices, the USEIT researchers believe this study provides evidence that students' computer use does have an impact on student achievement as measured by tests like MCAS. More importantly, they say, the study demonstrates that access to computers alone does not translate to better academic performance: Different uses of computers have different effects on student learning.

"These findings are important for two reasons," said the study's director, Michael Russell of BC's Lynch School of Education. "First, at a time when schools are under increased pressure to raise test scores, yet are also facing budget shortfalls, this study provides evidence that investments in computers can have positive effects on student achievement. Second, it shows that teachers and students must be thoughtful about how computers are used and what types of learning they expect to impact."

Students' use of computers throughout the writing process had a statistically significant positive effect on MCAS writing scores, Russell said. "Using computers simply to type in final drafts of essays, however, had no effect on students' test performance. These findings are consistent with past research and demonstrate the importance of allowing students to use computers to produce rough drafts, edit their papers, and to produce final drafts."

The study's authors speculate that students who use computers more for recreational purposes may fare worse on tests, particularly in reading, because they simply spend more time at the keyboard than with a book.

Similarly, the study found that use of computers in school to create presentations was negatively associated with writing test scores. According to the researchers, this negative relationship may result from students spending less time writing during class time and more time creating and revising multimedia projects that contain relatively small amounts of written work. In essence, time spent creating presentations may detract from time available during class to develop students' writing skills.

"When examining the effect of computer use on student learning, it is important to consider how well a specific use is aligned with the measure of learning," added the study's lead author, Laura O'Dwyer, a former Boston College researcher now with the Graduate School of Education at UMass Lowell. "While this study found that use of computers to create presentations was negatively associated with writing scores, it does not mean that students should not be creating presentations with computers. Creating presentations may be a positive learning experience, but such effects are not captured by a test like MCAS that measures reading and writing skills."

Adds BC researcher Damian Bebell, the study's third author, "Although this study finds some interesting effects of students' use of computers, teachers in this study generally did not use technology to teach. As more and more schools, districts, and states provide teachers and students with their own laptops, it will be interesting to see if teachers are able to use technology more in the classroom and if these uses add to the effects of student technology use."

The full report is available on-line at www.bc.edu/research/intasc/studies/USEIT/pdf/USEIT_r10.pdf.
