In the department of Ecology and Evolutionary Biology, current research focuses on developing assessment tools that explore student thinking in biology, enhancing diversity and inclusion in the classroom through instructional changes to include more active learning, evaluating the long-term impact of different kinds of teaching on student retention and professional development in STEM, and creating faculty communities to explore issues such as helping students transition from high school to college STEM courses. These research areas are explored using classroom-based assessments, interviews, observation protocols, and surveys analyzed through quantitative and qualitative methods.
Cornell has a growing and enthusiastic community of scholars who are engaged in Discipline-Based Education Research. Our research programs engage undergraduate and graduate students, postdocs, and university faculty. We collaborate with Discipline-Based Education Research scholars in Physics, the Active Learning Initiative community, and the Center for Teaching Innovation. Members of our group participate in weekly journal clubs and research group meetings, and often sponsor events to engage the larger Cornell teaching community. We also collaborate with Discipline-Based Education Researchers at several other institutions, strengthening the questions we can ask and the generalizability of the results.
Biology Measuring Achievement and Progression in Science, or Bio-MAPS, is a suite of diagnostic assessments that aim to measure student understanding across a degree program and are aligned with the nationally validated set of core biology concepts in Vision and Change (AAAS, 2011), further elaborated in the BioCore Guide (Brownell et al., 2014).
All of the assessments have been response-validated through standard methodologies, including student interviews, expert feedback, and pilot testing at multiple institutions (NRC, 2011; Bass et al., 2016; Adams and Wieman, 2011). Each question presents a scenario, and students respond to a set of statements by marking each as true/false or likely/unlikely to be true.
More information about the development of the Bio-MAPS assessments and how they can be used can be found here: Tools for Change: Measuring Student Conceptual Understanding Across Undergraduate Biology Programs Using Bio-MAPS Assessments
We recommend that Bio-MAPS assessments be given to students at several time points throughout the undergraduate major: at the beginning, after the introductory course series, and just before graduation.
All of the Bio-MAPS assessments include questions addressing the five Vision and Change core concepts. However, each assessment focuses on a different area of biology. The assessments are:
- EcoEvo-MAPS (Ecology and Evolution) Summers et al., 2018; Smith et al., 2019
- Molecular Biology Capstone Couch et al., 2017
- Phys-MAPS (Physiology) Semsar et al., 2019
- GenBio-MAPS (General Biology) Couch et al., 2019
We have set up an automated system for administering the Bio-MAPS assessments here.
Want to view data from your course? Locate the course ID in your email (starts with "R_") and click here.
Publishing Active Learning Lessons
We publish many of our active learning classroom lessons in peer-reviewed journals such as CourseSource because doing so:
- demonstrates a commitment to high-quality teaching
- fosters opportunities to collaborate with colleagues
- provides a unique opportunity to receive meaningful feedback on instructional materials from peers through the review process
- can help other instructors overcome barriers to using active learning
Here are several published active learning lessons from the CDER group and Cornell faculty:
Understanding the High School to College Transition for STEM Students
The overarching goal of the project is to conduct fundamental research on how faculty members’ first-hand knowledge of differences in STEM instruction at the college and high school levels can be used to develop the infrastructure, capacity, resources, and expertise needed to work toward a more seamless transition for incoming first-year students in gateway STEM courses. To accomplish this, we survey students about their experiences and host Faculty Learning Communities where gateway instructors can support each other in making changes that align with student needs.
We have an in-press paper about student expectations regarding classroom instructional practices:
Meaders C, Toth E, Lane AK, Shuman JK, Couch BA, Stains M, Stetzer MR, Vinson E, Smith MK. What will I experience in my college STEM courses? An investigation of student predictions about instructional practices in introductory courses. CBE-Life Sciences Education. 2019 (in press).
Biology Lab Inventory of Critical Thinking for Ecology (Eco-BLIC)
The Biology Lab Inventory of Critical Thinking for Ecology (Eco-BLIC) is a closed-response survey designed to assess students’ critical thinking and reasoning about ecology. We define critical thinking as the ways in which one decides what to do and what to trust. In a scientific context, this decision-making is based on evidence, including data, analyses, and prior literature.
Want to try out the Eco-BLIC and provide helpful feedback about the instrument based on your disciplinary expertise? Please click here to access the "expert" version of the survey! Your answers will help us learn how the questions are functioning, how to design the scoring scheme, and where future adjustments are needed; this process should take approximately 15 minutes. We have also added a comment box at the end of the survey where you can provide additional feedback.
Interested in sharing the Eco-BLIC with your students this fall? Email Ash Heim (firstname.lastname@example.org)! The assessment can be taken online, and we will send you 1) the link to give to your students and 2) email text and lecture slides that you can use to advertise the assessment in your course. We encourage instructors to offer a few participation points to incentivize student participation. When your students have completed the assessment, we will send you a list of who took it, along with their student IDs. At the start of the spring term, we will share a report with you about your students’ answers.
Design and Validation of the Eco-BLIC
The conception of the Eco-BLIC was inspired by the Physics Lab Inventory of Critical Thinking (PLIC); more information on the PLIC can be found here: https://cder.as.cornell.edu/physics#physics-lab-inventory-of-critical-thinking. We are using a validation process for the Eco-BLIC similar to the one used during development of the PLIC.
The Eco-BLIC presents two scenarios in which two research groups attempt to answer ecological research questions about predator-prey relationships; in each scenario, one research group conducts its study in a laboratory setting, while the other conducts its study in a field (i.e., observational) setting. The first scenario focuses on smallmouth bass and mayflies, and the second on great horned owls and house mice.
The Eco-BLIC poses questions asking students to: a) interpret and evaluate the sample data, b) evaluate the methods, and c) suggest what the group should do next. From a free-response version, we collected common student responses and aggregated them into a closed-response version. The questions include a mix of single-option group-comparison questions (e.g., Which of the groups was more effective in their study setup?) and choose-many next-step questions (e.g., Which of the following steps should the group take next?). Students may select no more than three options on each choose-many question.
As of Summer 2021, we have interviewed several students about the Eco-BLIC questions and piloted the assessment in 26 courses at 11 institutions. Further, responses have been collected from 40 expert ecologists at multiple institutions. Expert responses are used to improve the assessment and will ultimately be used to generate the scoring scheme, so students' scores will reflect how closely their responses match expert consensus. Details of the scoring scheme will be shared on this site once available.
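One plausible way to implement an expert-consensus scoring scheme is to weight each answer option by the fraction of experts who selected it, then normalize a student's total by the best score achievable under the three-option cap. This is only an illustrative sketch, not the actual Eco-BLIC rubric; the option labels and data below are invented.

```python
# Hypothetical sketch of expert-consensus scoring for a choose-many question.
# Not the actual Eco-BLIC rubric: option labels, data, and weights are invented.

def consensus_weights(expert_responses):
    """Return the fraction of experts who selected each option."""
    n_experts = len(expert_responses)
    counts = {}
    for selections in expert_responses:
        for option in selections:
            counts[option] = counts.get(option, 0) + 1
    return {option: count / n_experts for option, count in counts.items()}

def score_response(student_selections, weights, max_choices=3):
    """Sum the consensus weight of each chosen option, normalized by the
    best achievable total (the top max_choices weights), so scores fall
    between 0 and 1."""
    best = sum(sorted(weights.values(), reverse=True)[:max_choices])
    raw = sum(weights.get(option, 0.0) for option in student_selections)
    return raw / best if best else 0.0

# Invented example: 4 experts responding to a question with options A-D.
experts = [{"A", "B"}, {"A", "C"}, {"A", "B", "C"}, {"B"}]
w = consensus_weights(experts)  # A: 0.75, B: 0.75, C: 0.5
print(round(score_response({"A", "B"}, w), 2))  # 0.75 (1.5 / 2.0)
```

Under this kind of scheme, a student earns more credit for selecting options that most experts also selected, which matches the idea that scores reflect closeness to expert responses.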
Administering the Eco-BLIC
Any instructor interested in using the Eco-BLIC should follow the steps below.
1. To get started, instructors should email Ash Heim (email@example.com) to express interest in sharing the Eco-BLIC. Information such as when the instructor would like the pre- and post-surveys to close, as well as the course start date, would be helpful to mention.
2. The instructor receives an email containing a link to the pre-instruction survey, which they can share with students whenever and however they would like. Once the instructor confirms the date the pre-survey should close, they are sent a list of participating students (e.g., if points are being offered for completing the assessment).
3. When the instructor indicates they are ready to distribute the post-survey, they receive an email containing a link to the post-instruction survey, which they again share with students. Once the instructor confirms the date the post-survey should close, they are sent a list of participating students.
4. At the start of the following academic term, the instructor is sent a report including a summary of their class’s performance.
Walsh, C., Quinn, K. N., Wieman, C., & Holmes, N. G. (2019). Quantifying critical thinking: Development and validation of the physics lab inventory of critical thinking. Physical Review Physics Education Research, 15(1), 010135. https://doi.org/10.1103/PhysRevPhysEducRes.15.010135
Eco-BLIC Research Team
Ash Heim, Post-doctoral Associate, Ecology and Evolutionary Biology
David Esparza, Graduate Student, Ecology and Evolutionary Biology
Natasha Holmes, Assistant Professor, Physics
Michelle K. Smith, Associate Professor, Ecology and Evolutionary Biology
Cole Walsh, Graduate Student, Physics