Physics

Overview

Our research focuses on introductory college-level physics labs. What do students learn? How can we help them learn better? And what does learning in the lab offer that learning in lecture does not? We have been developing labs that introduce students to the nature of scientific measurement while helping them build a conceptual understanding of measurement and uncertainty, a procedural toolbox for handling and analyzing data, and critical thinking behaviors for reasoning about data scientifically. We are broadly interested in how student epistemologies interact with their learning in the lab and in how to foster more productive epistemologies and attitudes. In the future, we plan to examine how the pedagogy we have developed transfers to other disciplines, to higher-level lab courses, and to undergraduate research experiences. We are also interested in gender and diversity issues in the lab, given that students spend much of their lab time working on computers and hands-on experiments.

Affiliated Researchers

Faculty

Post-doctoral researchers

Graduate Students

  • Meagan Sundstrom
  • Emily Stump
  • Rebeckah Fussell
  • Matthew Dew

Physics Lab Inventory of Critical thinking (PLIC)

The Physics Lab Inventory of Critical thinking (PLIC) is a closed-response survey designed to assess how students critically evaluate experimental methods, data, and models. We define critical thinking as the ways in which one decides what to do and what to trust. In a scientific context, this decision-making is grounded in evidence that includes data, models, analyses, prior literature, and more.

Want to try out the PLIC? Please click here to access the "expert" version of the survey!

Interested in using the PLIC? Please click here to access the course information survey!

Need to edit your start/end dates for the survey? Visit this form.

Want to view your data after the course ends? Get your unique course ID (emailed to you) and click here.

Design and Validation of the PLIC

The original conception of the PLIC came from analysis of student decision-making and reasoning in lab notes [1]: how well students interpreted data, whether they attempted to improve their experiments, and whether they identified and explained disagreements between data and models. To facilitate broad assessment (rather than painstaking analysis of lab notes), we created a context-rich, closed-response assessment. The PLIC context focuses on two groups completing a mass-on-a-spring experiment to test the relationship between the period of oscillation and the mass on the spring. The PLIC poses questions asking students to: a) interpret and evaluate the sample data, b) evaluate the methods, and c) suggest what the group should do next. From a free-response version, we collected common student responses and aggregated them into a closed-response version. The questions include a mix of single-option Likert-scale questions (e.g., How well did the group's method evaluate the model?) and choose-many explanation questions (e.g., Which of the following best support your reasoning?). Students may select no more than three options for each choose-many question.
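
For reference, the canonical textbook model for this system (an ideal, massless spring with no damping) predicts that the period grows as the square root of the mass:

    T = 2π √(m/k)

where k is the spring constant. The PLIC's questions revolve around how well the groups' data support or contradict this kind of model prediction.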

Responses were collected from over 50 expert physicists at multiple institutions. These responses were used to improve the assessment and to generate the scoring scheme. Students' scores, therefore, reflect how closely their responses match those of experts. Details of the scoring scheme will be included on this site soon.
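
In the meantime, here is a rough sketch of what expert-referenced scoring of a choose-many question could look like (our own illustration with made-up numbers, not the published PLIC scheme): each selected option earns credit in proportion to the fraction of experts who chose it, normalized by the best score attainable with the same number of selections.

    # Hypothetical sketch of expert-referenced scoring for a single
    # choose-many question; NOT the published PLIC scoring scheme.
    def score_choose_many(student_choices, expert_fractions):
        """Credit each selected option by how often experts selected it."""
        if not student_choices:
            return 0.0
        earned = sum(expert_fractions.get(opt, 0.0) for opt in student_choices)
        # Normalize by the best score attainable with the same number of picks.
        best = sum(sorted(expert_fractions.values(), reverse=True)[:len(student_choices)])
        return earned / best if best > 0 else 0.0

    # Made-up expert selection frequencies for a four-option question.
    experts = {"A": 0.80, "B": 0.10, "C": 0.65, "D": 0.05}
    print(score_choose_many({"A", "C"}, experts))  # 1.00: matches the expert consensus
    print(score_choose_many({"B", "D"}, experts))  # ~0.10: rarely-chosen options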

Administering the PLIC

We have set up an automated system for signing up and administering the PLIC, based on work [2] by the Lewandowski Group at the University of Colorado Boulder. Any instructor interested in using the PLIC should follow the steps below.

1. To get started, instructors should fill out the Course Information Survey (CIS) available here. The survey includes questions about the course, instructor contact information, and when the instructor would like the pre- and post-surveys to close.

2. The instructor is sent an email containing a link to the pre-instruction survey, to share with students whenever and however they would like.

3. Four days before the pre-survey is set to close, the instructor is sent an email reminder letting them know how many students have completed the pre-survey. If no one has filled it out at that point, the survey deadline is extended by three days.

4. Two days before the post-survey is set to open, the instructor is sent an email informing them that the post-survey link is on its way.

5. Two weeks before the post-survey is set to close, the instructor is sent an email containing a link to the post-instruction survey, which should again be shared with students.

6. Four days before the post-survey is set to close, the instructor is sent an email reminder letting them know how many students have completed the post-survey. If no one has filled it out at that point, the survey deadline is extended by three days.

7. After the post-survey closes, the instructor is sent a report email including the names of students who completed the survey and a summary of their class’s performance.

Bonus: Instructors are able to change the date that they would like a survey (either pre or post) to close by completing a separate course date change form available here.
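
For those curious how steps like these fit together, the workflow reduces to a date-driven email scheduler. The sketch below is our own illustration of that logic; the course-record fields and function names are hypothetical, not the actual system's implementation (see [2] for the administration model it is based on).

    # Illustrative sketch of the date-driven reminder logic described above.
    # Field names are hypothetical, not the real system's schema.
    from datetime import date, timedelta

    def reminders_due(course, today):
        """Return the automated emails due today for one course record."""
        due = []
        if today == course["pre_close"] - timedelta(days=4):
            due.append(f"pre-survey reminder ({course['pre_responses']} responses so far)")
            if course["pre_responses"] == 0:
                course["pre_close"] += timedelta(days=3)  # automatic 3-day extension
        if today == course["post_close"] - timedelta(weeks=2):
            due.append("post-survey link")
        if today == course["post_close"] - timedelta(days=4):
            due.append("post-survey reminder")
        return due

    course = {"pre_close": date(2024, 9, 6), "post_close": date(2024, 12, 6),
              "pre_responses": 0}
    print(reminders_due(course, date(2024, 9, 2)))  # reminder sent; deadline extended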

Publications

[1] Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. PNAS, 112(36), 11199–11204. https://doi.org/10.1073/pnas.1505329112

[2] Wilcox, B. R., Zwickl, B. M., Hobbs, R. D., Aiken, J. M., Welch, N. M., & Lewandowski, H. J. (2016). Alternative model for administration and analysis of research-based assessments. Physical Review Physics Education Research, 12(1), 010139. https://doi.org/10.1103/PhysRevPhysEducRes.12.010139

Walsh, C., Quinn, K. N., Wieman, C., & Holmes, N. G. (2019). Quantifying critical thinking: Development and validation of the physics lab inventory of critical thinking. Physical Review Physics Education Research, 15(1), 010135. https://doi.org/10.1103/PhysRevPhysEducRes.15.010135

Quinn, K. N., Wieman, C. E., & Holmes, N. G. (2018). Interview Validation of the Physics Lab Inventory of Critical thinking (PLIC). In 2017 Physics Education Research Conference Proceedings (pp. 324–327). American Association of Physics Teachers. https://doi.org/10.1119/perc.2017.pr.076

Holmes, N. G., & Wieman, C. E. (2016). Preliminary development and validation of a diagnostic of critical thinking for introductory physics labs. In 2016 Physics Education Research Conference Proceedings (pp. 156–159). American Association of Physics Teachers. https://doi.org/10.1119/perc.2016.pr.034

Holmes, N. G., & Wieman, C. E. (2015). Assessing modeling in the lab: Uncertainty and measurement. In M. Eblen-Zayas, E. Behringer, & J. Kozminski (Eds.), 2015 Conference on Laboratory Instruction Beyond the First Year of College (pp. 44–47). College Park, MD. https://doi.org/10.1119/bfy.2015.pr.011

Holmes, N. G., Olsen, J., Thomas, J. L., & Wieman, C. E. (2017). Value added or misattributed? A multi-institution study on the educational benefit of labs for reinforcing physics content. Physical Review Physics Education Research, 13(1), 010129. https://doi.org/10.1103/PhysRevPhysEducRes.13.010129

This material is based upon work supported by the National Science Foundation under Grant No. 1611482.

Rethinking Introductory Physics Lab Courses

With support from the Cornell University College of Arts and Sciences Active Learning Initiative, we are revising the lab courses associated with two introductory physics sequences at Cornell University (six courses over three years). The transformation of these labs follows the model pioneered by the Science Education Initiatives (see cwsei.ubc.ca and colorado.edu/sei): establish learning objectives, evaluate what students are learning, and identify instructional strategies that improve student learning.

Establishing learning objectives and course materials: 

The learning objectives for these courses were established through interviews and focus groups with faculty and instructors in the physics department at Cornell, consultation of the literature, and discussions with colleagues at other institutions. The learning objectives were then iteratively revised in further consultation with the faculty and instructors. They fall under five main goals:

By the end of the three-course intro lab sequence, students should be able to:

  1. Collect data and revise an experimental procedure iteratively and reflectively,
  2. Evaluate the process and outcomes of an experiment quantitatively and qualitatively,
  3. Extend the scope of an investigation whether or not results come out as expected,
  4. Communicate the process and outcomes of an experiment, and
  5. Conduct an experiment collaboratively and ethically.

The specific learning objectives that fall under these five overarching goals can be accessed through this link.

All of our lab materials are available at PhysPort.org/curricula/thinkingcritically.

Evaluating student learning:

A number of additional projects are underway related to the efficacy of labs. For example, how do labs help or hurt student understanding of measurement and uncertainty? How does this understanding shape how students think about measurement and uncertainty in later lab courses, or even in quantum mechanics? What do students gain from research experiences that they could be getting from lab courses earlier? What is the benefit of a hands-on activity compared with a simulation or even virtual reality?

Other ongoing projects include:

  • Studies of student attitudes towards labs, both quantitative (surveys) and qualitative (video and interviews)
  • Studies of student behaviors in lab, especially how lab framing can impact responsible (and irresponsible) conduct of research
  • Studies of the impact of learning from hands-on activities compared with technology such as computer simulations and virtual reality (the role of embodied cognition)

Equity in Physics Lab Courses

Through support from a PCCW Affinito-Stewart grant, we studied the degree to which group work in labs is equitable. Comparing traditional and transformed labs, graduate student Katherine Quinn drew on her work with Jim Sethna to apply clustering algorithms to students' behaviors during lab. The research team found no differences in roles between male and female students in the highly structured lab course, but gendered divisions of roles in the new, less-structured labs. We are continuing the work on equitable groups in labs to explore whether these experiences affect students' plans to pursue physics. With support from Prof. Eleanor Sayre of Kansas State University, the study will combine observations of students in lab with reflective interviews about their perceptions of their experiences and their intentions to pursue physics. This work will generate testable strategies for improving equity in lab work and students' persistence in physics, leading to subsequent grant proposals and projects.
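
To give a flavor of this kind of analysis (a minimal sketch with fabricated observation data, not the study's actual pipeline), students' coded behaviors can be represented as vectors of time fractions and grouped with a standard algorithm such as k-means:

    # Minimal k-means sketch on fabricated behavioral data; not the
    # study's actual pipeline or observations.
    import numpy as np
    from sklearn.cluster import KMeans

    # Rows: students. Columns: fraction of observed time intervals coded
    # as (handling equipment, using computer, taking notes, other).
    behavior = np.array([
        [0.70, 0.15, 0.10, 0.05],
        [0.10, 0.65, 0.20, 0.05],
        [0.65, 0.20, 0.10, 0.05],
        [0.05, 0.70, 0.20, 0.05],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(behavior)
    print(labels)  # e.g. [0 1 0 1]: an "equipment" role and a "computer" role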

This material is based upon work supported by the National Science Foundation under Grant No. 1836617.

Publications

Quinn, K. N., McGill, K. L., Kelley, M. M., Smith, E. M., & Holmes, N. G. (2018). Who does what now? How physics lab instruction impacts student behaviors. In A. Traxler, Y. Cao, & S. Wolf (Eds.), Physics Education Research Conference 2018. Washington, D.C. https://doi.org/10.1119/perc.2018.pr.Quinn

Student understanding of measurement and uncertainty in classical and quantum mechanics

We are collaborating with Dr. Gina Passante at California State University-Fullerton to study how student reasoning about measurement and uncertainty differs between classical and quantum mechanical measurements. Preliminary results indicate that students describe measurement in the two contexts quite distinctly; we will test possible explanations in coming semesters. More broadly, this work will contribute to future evaluations of student understanding of the nature of scientific measurement.

This material is based upon work supported by the National Science Foundation under Grant No. 1808945.
