How similar to experts are introductory physics students’ cognitive structures upon completing the course?
Study Overview
This study, which extends the work of Neiles et al. (2016), explores students' structural knowledge (i.e., how students cognitively store and retrieve information) in introductory physics. Researchers often measure student understanding of course material through exams, concept inventories, and concept mapping, but these measures may not provide a full picture of a student's understanding. For instance, on a multiple-choice test, a student may answer a question correctly but for the wrong reason. Similarly, while concept mapping provides a more detailed picture of a student's understanding, it is more subjective than Pathfinder analysis.
Our study uses the Pathfinder program, which provides a quantitative measure of how similar a student's cognitive structure (i.e., "knowledge structure") is to that of an expert. Pathfinder also creates a qualitative, visual representation of a student's cognitive structure of the course concepts. This visual map shows which connections students are making, which misconceptions they may hold about the material, and which connections between key concepts they are missing.
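To illustrate the kind of quantitative comparison involved, the sketch below computes a simple link-overlap index between a student network and an expert network. This is an assumption for illustration only; the closeness measure reported by the Pathfinder software may differ in detail, and the concept names here are hypothetical.

```python
# Illustrative sketch only: compare two Pathfinder-style networks by the
# proportion of links they share. This is a generic link-overlap index, not
# necessarily the exact closeness measure used in the study.

def network_similarity(student_links, expert_links):
    """Proportion of links shared by two undirected concept networks.

    Each network is a set of frozenset({concept_a, concept_b}) links.
    Returns 1.0 for identical link structures, 0.0 for no links in common.
    """
    union = student_links | expert_links
    if not union:
        return 0.0
    return len(student_links & expert_links) / len(union)


# Hypothetical example:
student = {frozenset({"Velocity", "Angular Velocity"}),
           frozenset({"Velocity", "Acceleration"})}
expert = {frozenset({"Velocity", "Angular Velocity"}),
          frozenset({"Acceleration", "Force"})}
print(network_similarity(student, expert))  # 1 shared link of 3 total, about 0.33
```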
We administered a post-course survey in both semesters of introductory physics, asking students to rate how similar or dissimilar 15 key physics concepts were to each other, covering all 105 possible concept pairs. Each pair is rated on a scale of 1–7, where 1 represents highly dissimilar concepts and 7 represents highly similar concepts; for instance, students judge how similar Velocity and Angular Velocity are to each other. The similarity ratings are entered into the Pathfinder program to produce a knowledge structure for each student, giving us a visual network of the relatedness of these key terms. This knowledge structure can be compared to the instructor's knowledge structure (derived from the instructor's ratings of the same concept pairs) to gauge how expert-like the student's knowledge is.
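As a concrete sketch of this procedure, the code below shows how pairwise ratings can be reduced to a Pathfinder-style network, assuming the commonly used parameters r = infinity and q = n − 1 (which yield the sparsest network). The concept names and ratings are placeholders with only four concepts; the study itself used 15 concepts (105 pairs) and the Pathfinder program rather than this code.

```python
# Hedged sketch: deriving a Pathfinder-style network from pairwise similarity
# ratings, assuming r = infinity and q = n - 1. Ratings below are placeholders.

from itertools import combinations

concepts = ["Velocity", "Angular Velocity", "Acceleration", "Force"]
pairs = list(combinations(concepts, 2))      # all unordered concept pairs

# Similarity ratings on a 1-7 scale (7 = highly similar), converted to
# distances so that more-similar concepts are "closer".
ratings = {("Velocity", "Angular Velocity"): 6,
           ("Velocity", "Acceleration"): 5,
           ("Velocity", "Force"): 3,
           ("Angular Velocity", "Acceleration"): 4,
           ("Angular Velocity", "Force"): 2,
           ("Acceleration", "Force"): 6}
dist = {pair: 8 - ratings[pair] for pair in pairs}

n = len(concepts)
idx = {c: i for i, c in enumerate(concepts)}
INF = float("inf")

# Minimax path distances (Floyd-Warshall variant): m[i][j] is the smallest
# possible "worst edge" over all paths from i to j.
m = [[INF] * n for _ in range(n)]
for (a, b), d in dist.items():
    m[idx[a]][idx[b]] = m[idx[b]][idx[a]] = d
for i in range(n):
    m[i][i] = 0
for k in range(n):
    for i in range(n):
        for j in range(n):
            m[i][j] = min(m[i][j], max(m[i][k], m[k][j]))

# With r = infinity and q = n - 1, a direct link survives only if no indirect
# path beats it in the minimax sense.
links = {pair for pair, d in dist.items() if d <= m[idx[pair[0]]][idx[pair[1]]]}
print(links)
```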
Our overall goal is to see whether Pathfinder can predict student performance. We examine how the similarity of a student's knowledge structure to the expert's correlates with their overall course grade, average exam score, concept-inventory score, and a problem-solving assessment we have developed.
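A minimal sketch of that correlation step might look like the following; all numbers are made up for illustration and are not data from the study.

```python
# Hypothetical sketch of the planned analysis: correlate each student's
# network-similarity score with a course outcome. Placeholder data only.

from scipy.stats import pearsonr

similarity_to_expert = [0.42, 0.55, 0.31, 0.60, 0.48]  # student-vs-expert closeness
final_course_grades = [78.0, 88.5, 70.0, 91.0, 82.5]   # e.g., overall grade (%)

r, p = pearsonr(similarity_to_expert, final_course_grades)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```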
Collaborators
Mark McDaniel (CIRCLE, Psychological & Brain Sciences)
Gina Frey (CIRCLE, Chemistry)
Siera Stoen (CIRCLE, Physics)
Mike Cahill (CIRCLE)
Mairin Hynes (Physics)
Funding
This study was supported by an AAU STEM Initiative grant and the Washington University Moog Fund.
References
Neiles, K. Y., Todd, I., & Bunce, D. M. (2016). Establishing the Validity of Using Network Analysis Software for Measuring Students' Mental Storage of Chemistry Concepts. Journal of Chemical Education, 93(5), 821–831.