
Computer-Based Assessment

 

Results

Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance
Jennifer Higgins, Michael Russell & Thomas Hoffmann. 41 pages. Download PDF

Examining the Feasibility and Effect of a Computer-Based Read-Aloud Accommodation on Mathematics Test Performance
Helena Miranda, Michael Russell & Thomas Hoffmann. 42 pages. Download PDF

Examining the Effect of Text Editor and Robust Word Processor on Student Writing Test Performance
Michael Russell, Jennifer Higgins & Thomas Hoffmann. 73 pages. Download PDF

Examining the Feasibility and Effect of Computer-Based Verbal Response to Open-Ended Reading Comprehension Test Items
Helena Miranda, Michael Russell, Kevon Seeley & Thomas Hoffmann. 39 pages. Download PDF

Demonstrations of Instruments

Below are files for the Macintosh and Windows operating systems. Please note that these files were designed to run in a controlled testing environment on Macintosh iBook G3 800 MHz computers with 384 MB of RAM and Mac OS X 10.3, using a single-button mouse. The files provided use a Macromedia Flash Projector, so no plug-ins or other applications are needed to try them. All materials are the copyright of inTASC at Boston College, 2004.

For Macintosh (StuffIt):

 

For Windows (Zipped):

 

Paper Documents (PDF)

 

The Enhanced Assessment Project

The New England Compact's Enhanced Assessment Project was a federally funded, 18-month project that supported the New England Compact states (Maine, New Hampshire, Rhode Island, and Vermont) in their development of assessments that increase access for students with special needs. In conjunction with the Education Development Center (EDC), the Center for Applied Special Technology (CAST), and the four New England Compact states, researchers at inTASC conducted four studies to understand how technology can be used to increase assessment validity and access. Two of the studies (writing and text presentation) examined how the mode and presentation of test delivery and response affect the test scores of all students. The remaining two studies (read aloud and verbal response) specifically addressed accommodations for English language learners and students with special needs. These large-scale, multi-state research projects provided the New England Compact states with evidence of the effectiveness of applying computer-based testing to state assessments and informed the development of future assessment instruments.

Writing

The purpose of the writing study was to replicate prior experiments that examined the effect of computer use during writing tests, particularly for students accustomed to writing with computers. The study examined the effects of two "accommodations": a) use of a word processor with automatic editing tools disabled, and b) use of a word processor with automatic editing tools enabled. To examine the effects of word processing with and without editing tools on student performance, a randomized experimental design was employed. A total of 1,800 students were randomly assigned to one of three groups: word processor with editing tools, word processor without editing tools, and paper and pencil. A short questionnaire and a keyboarding test were administered to all students in the study to collect background and computer literacy information. After completing the questionnaire and keyboarding test, all students were given a released state writing prompt. Students were asked to compose an essay either on a word processor (with or without editing tools) or on paper and pencil. Answers composed on paper were transcribed to computer text to control for mode-of-scoring effects. All responses were double-scored.
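As a rough illustration of the randomized design described above, the short Python sketch below shows one way students could be dealt into the three conditions. It is a minimal sketch only; the group labels, function name, and seed are illustrative and are not taken from the study materials.

import random

CONDITIONS = [
    "word_processor_with_editing_tools",
    "word_processor_without_editing_tools",
    "paper_and_pencil",
]

def assign_conditions(student_ids, seed=2004):
    """Shuffle the roster, then deal students round-robin into equal-sized groups."""
    rng = random.Random(seed)   # fixed seed makes the assignment reproducible
    roster = list(student_ids)
    rng.shuffle(roster)
    return {sid: CONDITIONS[i % len(CONDITIONS)] for i, sid in enumerate(roster)}

# Example: 1,800 students split into three groups of 600.
assignment = assign_conditions(range(1, 1801))
print(sum(1 for c in assignment.values() if c == "paper_and_pencil"))   # 600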

See Examining the Effect of Text Editor and Robust Word Processor on Student Writing Test Performance (PDF) for results.

Text Presentation

The presentation of reading passages in computer-based comprehension tests can be problematic. The length of a passage often requires students to scroll down after reading part of it. While students who are experienced with using the Internet or reading text passages on a computer may be able to navigate through a reading passage with scrolling, students who are less familiar with technology may be negatively affected by this format. The purpose of this study was to examine the difference in performance when two different text formats are used to present reading passages. A total of 420 students were randomly assigned to one of three groups: paper-and-pencil delivery and response, computer delivery with scrolling text and computer response, and computer delivery with whole-page text and computer response. Students in all three groups took one test containing four reading passages and twelve multiple-choice items. All passages and items were taken from previously administered fourth-grade state assessments.

See Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance (PDF) for results.

Read Aloud

Reading a test aloud is the most common assessment accommodation provided to students on large-scale tests. Because the accommodation is typically provided by a human who reads the test aloud to a group of students, it may significantly limit many students' ability to proceed independently through the test. This study examined how a digitally recorded human voice presented in a computer-based test format compared to the standard paper-based read-aloud accommodation.

For this study, one test form containing mathematics items was delivered in three formats: read aloud using a human proctor, computer-based without read aloud, and computer-based with read aloud.

See Examining the Feasibility and Effect of a Computer-Based Read-Aloud Accommodation on Mathematics Test Performance (PDF) for results.

Verbal Response

Reading comprehension is most often measured by asking students to write a response to a prompt after reading a passage. However, students who are not able to express themselves in writing may be unable to show that they comprehend what they have read. The purpose of this study was to examine the difference in performance when verbal response to open-ended items is used to measure comprehension. Specifically, the study examined the effects of allowing students to respond to items about a reading passage using: a) paper delivery and paper-and-pencil written response; b) computer delivery and word-processed response; and c) computer delivery and verbal response.

For this study, one test form, containing four reading passages and eight open-response items, was administered to all students. All items were delivered via the computer. A total of 280 students were randomly assigned to a response condition. Students in the verbal response (VR) group responded orally to the computer, students in the written response (WR) group responded using paper and pencil, and students in the computer response (TR) group typed their responses directly on the computer. All items were taken from previously administered state assessments. Questionnaires and interviews were used to gain a deeper understanding of student response preferences. Analysis of student performance, questionnaires, and student interviews provided evidence for determining whether there are differences in student performance based on whether the student response is oral or written, and whether English language learners and students with disabilities benefit from the oral response accommodation more than regular education students.
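As a rough illustration of how scored responses could be summarized by response mode, the Python sketch below computes a mean total score for each group. It is a minimal sketch under assumed inputs; the scores shown are made-up placeholder values and do not reflect any study results.

from statistics import mean

# Each record: (response mode, scores on the eight open-response items).
# Values here are placeholders for illustration only.
records = [
    ("VR", [3, 2, 4, 3, 2, 3, 4, 3]),
    ("WR", [2, 2, 3, 3, 2, 2, 3, 2]),
    ("TR", [3, 3, 3, 4, 2, 3, 3, 3]),
]

def group_means(rows):
    """Return the mean total test score for each response mode."""
    totals = {}
    for group, item_scores in rows:
        totals.setdefault(group, []).append(sum(item_scores))
    return {group: mean(vals) for group, vals in totals.items()}

print(group_means(records))   # one mean total score per response mode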

See Examining the Feasibility and Effect of Computer-Based Verbal Response to Open-Ended Reading Comprehension Test Items (PDF) for results.

 

© Boston College. All rights reserved. inTASC is affiliated with the Center for the Study of Testing, Evaluation and Educational Policy (CSTEEP) in the Lynch School of Education. Email us at inTASC@bc.edu.