SPARST: Assessing the Science Process and Reasoning Skills of Undergraduate Science Majors

Abstract

We are creating an online, valid, and reliable multiple-choice test to assess undergraduate life science majors' mastery of the science process and reasoning skills of experimental design, data analysis, graphing, and science communication. The diagnostic test, SPARST, can be used by faculty and departments to assess student progress through the major as well as to assess the effectiveness of current and transformed pedagogy.

We have surveyed faculty to identify learning outcomes for the four science skills, written questions to address each outcome, vetted the questions with expert faculty to establish content validity, and used student focus groups to determine the readability of the test. Item Response Theory (IRT) analysis of initial data sets shows that the test can discriminate between the academic abilities of freshman and senior biology majors.
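The discrimination reported from the IRT analysis can be illustrated with a minimal sketch of the two-parameter logistic (2PL) IRT model. The item parameters and ability levels below are purely hypothetical, not values from the SPARST data set; the point is only to show how higher latent ability translates into a higher expected test score.

```python
import numpy as np

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of a correct
    response given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item parameters for a short four-item subtest.
a = np.array([1.2, 0.8, 1.5, 1.0])   # discrimination
b = np.array([-0.5, 0.0, 0.5, 1.0])  # difficulty

# Illustrative ability (theta) values for two student groups.
theta_freshman, theta_senior = -0.5, 0.7

# Expected number-correct score is the sum of item probabilities.
expected_freshman = irt_2pl(theta_freshman, a, b).sum()
expected_senior = irt_2pl(theta_senior, a, b).sum()
print(expected_freshman < expected_senior)  # higher ability -> higher expected score
```

In practice, the item parameters themselves are estimated from student response data, and a well-discriminating item (large `a`) is one whose probability of a correct answer rises steeply with ability.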

We will present the method we used to develop SPARST, list the target learning outcomes that guided the development of questions on each of the four SPARST subtests, and show results from the initial IRT analysis. An unexpected impact of the project is the increased awareness and deeper understanding of how to both teach and assess science process skills that the faculty on our advisory boards gain by generating and vetting the SPARST questions.