USING ALGEBRA WORD-PROBLEMS TO ASSESS QUANTITATIVE ABILITY - ATTRIBUTES, STRATEGIES, AND ERRORS

Citation
M. M. Sebrechts et al., USING ALGEBRA WORD-PROBLEMS TO ASSESS QUANTITATIVE ABILITY - ATTRIBUTES, STRATEGIES, AND ERRORS, Cognition and Instruction, 14(3), 1996, pp. 285-343
Number of citations
63
Subject categories
Psychology, Educational","Psychology, Experimental
Journal title
Cognition and Instruction
ISSN journal
07370008
Volume
14
Issue
3
Year of publication
1996
Pages
285 - 343
Database
ISI
SICI code
0737-0008(1996)14:3<285:UAWTAQ>2.0.ZU;2-J
Abstract
Changing goals in mathematics education have encouraged more open-ended problem solving in assessment. However, the use of these less constrained approaches has been limited by a lack of demonstrated relations between the underlying cognitive models and measurement consequences. In order to begin to characterize the cognitive basis for this emerging approach to measurement in a small domain - algebra word problems - detailed analyses of solutions to 20 problems that had appeared on the Graduate Record Examination General Test were collected from 51 undergraduates. Problems were characterized in terms of their major attributes, and solutions were described by students' strategies and errors. Regression analyses indicated that models including attributes such as the need to apply algebraic concepts, problem complexity, and problem content could account for 37% to 62% of the variance in problem difficulty. Protocol analyses identified four major solution strategies - equation formulation, ratio setup, simulation, and other (unsystematic) approaches - as well as a number of collateral strategies, including the use of pictures, formulae, and verbal descriptions. Higher achieving students used more equation strategies, more collateral strategies, and fewer unsystematic approaches than lower achieving students. Student errors tended to be idiosyncratic but could be classified into six principal categories that were used to identify sources of performance. Overall, the results support the notion that constructed responses capture strategy formulation and high-level planning, as do more traditional measures of quantitative reasoning. At the same time, constructed responses are more sensitive to individual problem characteristics and procedural errors that may be helpful in instruction but are a potential source of bias in assessment. A preliminary theoretical framework for describing performance on algebra word problems is proposed, and its usefulness for instruction and for more systematic design of tests is discussed.
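To make the attribute-regression result concrete, the sketch below shows one way item difficulty could be regressed on binary problem attributes and the variance explained (R^2) computed. It is an illustration only, not the authors' code or data: the attribute coding, difficulty values, and coefficients are hypothetical stand-ins for the kind of model the abstract reports as accounting for 37% to 62% of the variance in problem difficulty.

```python
import numpy as np

# Hypothetical example: 20 problems coded on three binary attributes
# (algebraic-concept demand, complexity, content type) and a simulated
# difficulty score; all values are invented for illustration.
rng = np.random.default_rng(0)
n_problems = 20
X = rng.integers(0, 2, size=(n_problems, 3)).astype(float)
y = 0.2 + 0.3 * X[:, 0] + 0.2 * X[:, 1] + 0.1 * X[:, 2] \
    + rng.normal(0, 0.1, n_problems)

# Ordinary least squares with an intercept term.
X_design = np.column_stack([np.ones(n_problems), X])
coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)

# Proportion of variance in difficulty explained by the attribute model.
y_hat = X_design @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.2f}")
```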