This paper describes an automated scorer for assessing students' Creative Problem-Solving (CPS) abilities by modeling the internal structure of essays in which students describe their thoughts on solving particular problems. The scorer grades students' open-ended responses to an essay-question-type CPS ability test, rather than the typical Likert-type or multiple-choice questions that may be inadequate for assessing the creative dimension of human problem-solving. The scorer differs from most generic automated essay scoring systems in that a bipartite graph-based representation is explicitly built for the pairwise relations between a student's ideas and the self-explained reasons given for a CPS task. This design enables several analytical approaches to CPS, such as quantitative scoring and qualitative diagnosis. A preliminary empirical evaluation with data from 20 students shows that, on the quantitative scoring task, the scorer's results are satisfactory and highly correlated with those of human experts (Pearson's r = .67 to .82). The approach provides a promising solution to support large-scale studies of human creativity and may further enable CPS-aware personalization systems.
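To illustrate the kind of representation the abstract refers to, the following is a minimal sketch of a bipartite idea–reason graph with a toy scoring rule. All names and the scoring rule itself are illustrative assumptions for exposition; the paper's actual representation and scoring method are not reproduced here.

```python
# Hypothetical sketch of a bipartite idea-reason representation for a
# CPS response. The class names, fields, and scoring rule are
# illustrative assumptions, not the paper's actual method.
from dataclasses import dataclass, field


@dataclass
class CPSResponse:
    """A student's response: ideas, reasons, and idea->reason links."""
    ideas: list = field(default_factory=list)
    reasons: list = field(default_factory=list)
    edges: set = field(default_factory=set)   # (idea_idx, reason_idx) pairs

    def add_pair(self, idea: str, reason: str) -> None:
        """Record an idea together with one self-explained reason."""
        if idea not in self.ideas:
            self.ideas.append(idea)
        if reason not in self.reasons:
            self.reasons.append(reason)
        self.edges.add((self.ideas.index(idea), self.reasons.index(reason)))

    def supported_ideas(self) -> list:
        """Ideas linked to at least one reason (usable for diagnosis)."""
        linked = {i for i, _ in self.edges}
        return [self.ideas[i] for i in sorted(linked)]

    def score(self) -> float:
        """Toy quantitative score: fraction of ideas backed by a reason."""
        if not self.ideas:
            return 0.0
        return len({i for i, _ in self.edges}) / len(self.ideas)


r = CPSResponse()
r.add_pair("reuse rainwater", "reduces water cost")
r.add_pair("reuse rainwater", "environmentally friendly")
r.ideas.append("repaint the roof")   # an idea with no stated reason
print(r.score())                     # one of two ideas is supported
```

A pairing like this makes both analyses in the abstract concrete: the score aggregates over edges (quantitative), while unlinked ideas can be flagged back to the student (qualitative diagnosis).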
International Conference on Computers in Education (ICCE), pp. 524-531