Investigating the suitability of automatically generated test items for real tests

Authors

  • Margit Hoefler, Graz University of Technology
  • Mohammad AL-Smadi, Graz University of Technology
  • Christian Guetl, Graz University of Technology

Abstract

Recent research has shown increasing interest in tools that can automatically generate test items from learning content. In this paper, we describe our experience with the Enhanced Automatic Question Creator (EAQC), a tool developed at the Institute for Information Systems and Computer Media (IICM) at Graz University of Technology. Up to now, we have focused on evaluating the formal and content quality of the test items generated by the EAQC. We have shown that these test items mainly assess knowledge and understanding of the learning content. In addition, they are comparable to manually generated test items with regard to relevance and difficulty. The EAQC also provides appropriate answers for the test items. From these results we concluded that EAQC-based test items may effectively support learners in self-directed and informal learning settings. Here we go one step further and investigate the suitability of EAQC-based test items from the instructor's viewpoint. In general, generating test items for written tests and exams is a time-consuming and challenging task. In this paper, we discuss requirements for such test items and present the results of two studies in which test items generated by the EAQC were used in real written tests. Results showed that students performed equally well on the EAQC-based test items as on manually generated ones. Overall test performance also reflected progress in learning. Thus, test items provided by the EAQC might support not only self-directed learners but also instructors in their work. Nevertheless, our findings also revealed some room for improvement and point towards directions for future research.

Downloads