Marta R. Stoeckel

Physics teacher and STEM education PhD student into student thinking @MartaStoeckel


Gender, Self-Assessment, and Classroom Experiences in AP Physics 1

This post originally appeared as an article in The Physics Teacher:

Stoeckel, M. (2020). Gender, self-assessment, and classroom experiences in AP Physics 1. The Physics Teacher, 58, 399-401.

One of the many ways issues of underrepresentation appear in the physics classroom is that female students frequently have a lower perception of their performance and ability than their male peers1, 2, 3, 4. Understanding how classroom experiences impact students’ confidence, especially for underrepresented students, can provide an important guide to designing physics classrooms where every student sees themselves as capable of learning and doing physics. To explore these issues in my AP Physics 1 classroom, I started asking my students to self-assess as part of my assessment process, allowing me to collect data comparing students’ perceptions to their actual performance. I also conducted interviews and collected student reflections to gain insights into the classroom experiences that impacted students’ confidence in physics. My students made it clear that discovering concepts in the lab contributed to their confidence. Girls also built confidence from teacher feedback, even on assessments where they scored poorly, while boys saw peer interactions as a source of confidence.

Confidence & Why It Matters

Confidence describes a student’s perception of their performance relative to their actual achievement and is often a precursor to self-efficacy5. Self-efficacy refers to a student’s beliefs about their ability to achieve particular goals and is shaped by four major types of experiences: performance accomplishments, where the individual demonstrates mastery; vicarious experiences, where the individual watches someone they relate to demonstrate mastery; verbal persuasion, where someone else expresses their belief in the individual’s abilities; and emotional arousal, which describes the individual’s mental and emotional state during a task6.

Self-efficacy and confidence are important not only because they correlate with academic success6, but also because they appear to be connected to issues of underrepresentation in physics. Women in introductory physics courses tend to have much lower confidence than men1, 2, 3, 4. Marshman et al. also found that while the confidence of both men and women declined during an introductory physics course, the decline was much greater for women4, suggesting that understanding how classroom experiences impact confidence is an important piece of understanding issues of underrepresentation.

Quantitative Data Collection & Results

This study focuses on AP Physics 1 at a suburban high school. AP Physics 1 is a year-long elective taken almost exclusively by seniors. The curriculum for the course is loosely based on Modeling Instruction7. Typically, around 40 students per year, approximately 10% of a graduating class, enroll in AP Physics 1. 31% of the students in the course are girls, which is in stark contrast to both the school’s non-AP physics course and to other Advanced Placement courses in the school, including AP Chemistry, where around 50% of the students are girls, suggesting there is an issue in the school unique to AP Physics 1.

In the course, students take assessments approximately once per week where they receive a score on a scale of 2 to 5 for each learning target assessed. At the end of each assessment, I ask students to predict their score for each learning target, then complete a short written reflection, as shown in figure 1. Over the course of two years, I recorded the scores students predicted, along with their actual scores for each learning target. I collected this data for a total of 92 students, 29 of whom were girls.

Figure 1: Sample self-assessment

I put these scores and self-assessments into a framework called the CCL Confidence Achievement Window5. This framework compares students’ confidence and actual achievement to sort them into four profiles: public, with high confidence and high achievement; underestimating, with low confidence and high achievement; unknown, with low confidence and low achievement; and overestimating, with high confidence and low achievement. The public and unknown profiles are considered to have good calibration between students’ achievement and confidence, while the underestimating and overestimating profiles indicate poor calibration.

For each student, I calculated total actual scores and total self-assessment scores as a fraction of the possible points. I used the self-assessment values as a measure of confidence and the actual score values as a measure of achievement in order to plot each student onto a CCL Confidence Achievement Window5, as shown in figure 2. The majority of students had fairly good calibration between their self-assessment and actual scores, falling into the public and unknown profiles. In addition, boys and girls fell into each profile at similar rates, suggesting boys and girls in this classroom had similar degrees of overall confidence. 
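As a concrete illustration, the sorting into profiles described above can be sketched in a few lines of code. This is a hypothetical sketch: the `ccl_profile` function and the 0.7 cutoff separating “high” from “low” are assumptions for illustration, as the article does not specify the exact boundary used.

```python
# Hypothetical sketch of sorting one student into a CCL Confidence
# Achievement Window profile. The 0.7 cutoff is an assumed value for
# illustration; the article does not state the boundary it used.

def ccl_profile(self_assessed, actual, possible, cutoff=0.7):
    """Classify a student from total self-assessed and actual scores."""
    confidence = self_assessed / possible   # fraction of possible points
    achievement = actual / possible
    high_conf = confidence >= cutoff
    high_ach = achievement >= cutoff
    if high_conf and high_ach:
        return "public"          # good calibration, high scores
    if not high_conf and not high_ach:
        return "unknown"         # good calibration, low scores
    if high_conf:
        return "overestimating"  # poor calibration
    return "underestimating"     # poor calibration

# Example with made-up scores out of 20 possible points:
print(ccl_profile(self_assessed=18, actual=17, possible=20))  # public
print(ccl_profile(self_assessed=10, actual=17, possible=20))  # underestimating
```

Plotting each student’s (achievement, confidence) pair then places them in one of the four quadrants of the window shown in figure 2.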

Figure 2: CCL Confidence-Achievement Window

What Affected Confidence?

To understand how my students developed such a well-calibrated sense of their achievement, regardless of gender, I also recorded responses to the open-ended prompt included on the self-assessments (such as the one in figure 1) from the 52 students enrolled during the second year of data collection. In addition, I interviewed ten student volunteers at the end of the second year about the experiences that affected their confidence. Three key themes emerged from the qualitative data: labs, peer interactions during whiteboarding, and assessment feedback.


Labs

On nearly every assessment reflection, students, regardless of gender, consistently mentioned labs as helping them achieve mastery, usually citing a specific lab done as part of the preceding unit. The interviews revealed what it was about the labs that helped students develop their sense of confidence. In the Modeling7 approach, a new topic typically begins with a guided inquiry lab followed by a whole-class discussion of the results that allows students to develop conceptual and mathematical models for the new topic. The students I interviewed described this approach as giving them a sense of ownership of the material and showing them that they could discover new concepts, suggesting these labs were an opportunity for performance accomplishments, where students developed self-efficacy by demonstrating their own mastery of key skills6. As one girl put it:

“I think the self-discovery thing, like when you figure it out yourself, that’s always really good. Cause it makes you feel like you’re doing it yourself and you’re this scientist that knows everything.”

Students in the interviews also described making the transition from a lab to written problems as an important moment. Figuring out for themselves how to apply what they discovered in the lab to a new type of problem was another performance accomplishment that helped students see themselves as capable. In the words of one boy:

“I think when, not the lab, but after a lab that we do. So we do a lab that hammers at different the way that physics works and we get a problem set the day after. That it’s the same–they’re not the same thing, but it’s like the same concept. And then it’s like I semi-understand what we did yesterday and then we practice it and all the sudden, I just really understand the problems.”

Interestingly, during the interviews, several students also talked about labs as detrimental to their confidence; one boy even specifically described labs as having both a positive and negative effect on his confidence. Most students had minimal exposure to guided inquiry prior to AP Physics 1, which resulted in some frustration. In interviews, students interpreted the confusion, mistakes, and other issues that are a normal part of guided inquiry as evidence they were not good at physics, especially if they had done well in previous science courses. This suggests it is critical to foster a classroom culture that normalizes confusion as part of the learning process to maintain the positive impact of labs on confidence.

Whiteboarding & Peer Interactions

The other major activity students mentioned on their self-assessments was whiteboarding, where students work in small groups to prepare a whiteboard with the solution to a problem, which is then presented to the class. For boys, these activities were also an opportunity to build self-efficacy through verbal persuasion6. In the interviews, several boys brought up peer responses to their input during these activities, typically recalling specific problems and exchanges, often from several months prior, suggesting peer interactions had a lasting impact on students.

By contrast, while girls also said whiteboarding helped them master the content, only one girl spoke about peer interactions during the interviews. She recalled a specific exchange, in which her all-male group responded positively to her input, but she interpreted it as evidence she was fooling her peers, rather than an affirmation of her abilities:

“I think it’s one of those things where I’m generally a smart person, so they’d just assume that I kinda know what I’m doing, but they’re all super good at physics so I think they overestimate my abilities almost.”

This raises the question of what contributed to the very different recollections of peer interactions. Did boys have more interactions than girls where peers affirmed their abilities, or did girls have other experiences that led them to view those interactions with greater distrust?

Assessment Feedback

During the interviews, both boys and girls talked about in-class assessments, especially when I asked whether they think I believe they are good at physics. Most of the boys brought up specific assessments where they earned high scores as evidence of their physics ability. However, the girls talked about assessments very differently. Rather than talking about assessments where they had done well, the girls tended to talk about assessments where they had done poorly. The girls saw the kind of feedback I wrote on their assessments, along with a course policy encouraging retakes, as evidence that I saw them as capable of mastering the material, regardless of their initial score. As one girl put it:

“[You] offer constructive criticism when needed and it’s really helpful when trying to understand what I did incorrectly on quizzes and labs. So I believe that feedback really shows that [you] believe that I can do the course.”

For these girls, verbal persuasion in the form of my feedback did not have to be paired with a performance accomplishment to have a positive impact on their confidence. 


Conclusion

Confidence is shaped by students’ experiences in the classroom, and understanding those experiences is particularly important for addressing issues of underrepresentation. In this study, students saw discovering a new concept in the lab and figuring out how to apply it to problems as particularly important opportunities for performance accomplishment, with girls in particular reporting less confidence on topics where they had fewer of these opportunities. Students of both genders responded to verbal persuasion, though boys focused on peer interactions and girls focused on teacher feedback, especially on assessments where they performed poorly. Designing a classroom where all students have the opportunity to develop a sense of confidence and self-efficacy means ensuring that all students have access to these kinds of experiences. It also means listening to students to better understand not only the kind of activities but the critical elements of activities that enable students to see themselves as good at physics.


  1. Emily M. Marshman, Z. Yasmin Kalender, Timothy Nokes-Malach, Christian Schunn, & Chandralekha Singh, “Female students with A’s have similar physics self-efficacy as male students with C’s in introductory courses: A cause for alarm?” Phys. Rev. Phys. Educ. Res. 14, 020123 (December 2018).
  2. Tamjid Mujtaba & Michael J. Reiss, “Inequality in experiences of physics education: Secondary school girls’ and boys’ perceptions of their physics education and intentions to continue with physics after the age of 16,” International Journal of Science Education 35, 1824-1845 (July 2013).
  3. Jayson M. Nissen & Jonathan T. Shemwell, “Gender, experience, and self-efficacy in introductory physics,” Phys. Rev. Phys. Educ. Res. 12, 020105 (August 2016).
  4. Emily M. Marshman, Z. Yasmin Kalender, Christian Schunn, Timothy Nokes-Malach, & Chandralekha Singh, “A longitudinal analysis of students’ motivational characteristics in introductory physics courses: Gender differences,” Canadian Journal of Physics 96, 391-405 (May 2017).
  5. Lesa M. Covington Clarkson, Quintin U. Love, & Forster D. Ntow, “How confidence relates to mathematics achievement: A new framework,” Mathematics Education and Life at Times of Crisis, 441-451 (April 2017).
  6. Albert Bandura, “Self-efficacy: Toward a unifying theory of behavioral change,” Psychological Review 84, 191-215 (January 1977).
  7. Jane Jackson, Larry Dukerich, & David Hestenes, “Modeling instruction: An effective model for science education,” Science Educator 17, 10-17 (Spring 2008).


Self-Assessment & Underrepresentation in AP Physics 1 (NARST 2020 Presentation)




Getting Students On Board with Active Engagement

Over the past few weeks, I keep finding myself in conversations about navigating pushback from students and parents when using student-centered, active-engagement instruction, such as Modeling Instruction. Brian Frank tweeted a thread this fall on the fact that while the frustration and misery that lead to pushback are common, they aren’t inevitable.

When I used a more teacher-centered, traditional approach, building relationships with my students was enough to make kids comfortable in my classroom. But, when I started using Modeling Instruction, I found myself dealing with angry students, fielding phone calls and e-mails from frustrated parents, and even meeting with administrators when a few parents escalated upstairs. I eventually figured out the point Brian made in his thread: you can reduce the misery by planning the right kind of classroom environment. While I certainly haven’t eliminated my students’ frustration, I now find my students are mostly on board with the instructional approach I use, and there are some particular steps that have been especially impactful.

Keep Students’ Perspective in Mind

Kids don’t get frustrated because they’re lazy or disinterested in learning; I have yet to meet a student that isn’t curious, hard-working, and persistent under the right circumstances. I think part of the reason students don’t always bring those traits into the classroom is school more often rewards students for compliance and reproducing procedures or reciting knowledge provided by the teacher. Students, especially older ones, have gotten familiar, and even comfortable, with seeing those actions rewarded. Expecting and rewarding something different feels like changing the rules of the game; for students who’ve done well in school, the change can even feel threatening since you’re changing the rules of a game they’ve been winning. Heidi Carlone found students may even see what they’re asked to do in a reformed science class as in tension with their identity as a “good student”. Keeping this in mind isn’t enough to prevent students’ resistance to active learning, but it helps me approach student resistance from a place of empathy, rather than frustration or judgement, and empathy is a much better place to build a classroom climate from.

Tell Students What I Want From Them

During my first year of Modeling, I saw a lot of students working hard in my class, but still struggling with the content because they were working in unproductive ways. Telling these students they needed to participate or put more effort into the class only made their frustration worse, because they were already participating. As I gained empathy for my students’ perspectives, I realized students were engaging in ways that are usually rewarded in school, like memorizing answers or focusing on what’s right during a lab, rather than what actually happened. I started spending more time talking about what productive engagement looks like during different kinds of activities. Since directions like “focus on sense-making” or “collaborate well” are too vague to be useful, I’ve been working on ways to make what I’m looking for more concrete, like using group roles to give students a clear target for good collaboration.

I also keep in mind that high school students have a lot of experience with getting rewarded for superficial engagement. Just telling them those approaches won’t work in physics is almost never enough to overcome years of experience as a student. Students need space to try more familiar approaches, reflect on whether they are working, and the chance to improve. For me, this has meant copious opportunities for reflection on the course and a generous retake policy so early mistakes don’t stick with students the rest of the term.

Finally, every fall, I remind myself to be very patient in September and October (and sometimes even longer) as I give my students the time and tools they need to develop the skills and mindsets necessary for active engagement. Even once students recognize they need to learn how to collaborate or how to have good discussions, they need practice to develop those skills, so a lot of the lessons and activities those first few weeks of the year are rocky. I have to remember that just because my classroom doesn’t look the way I want it to in September doesn’t mean my students and I won’t get there.

Listen to Students

I remember a day during my first year of Modeling where I started class by sketching a graph on the whiteboard of panic in physics vs. time. Students laughed, and we made some jokes about it, but it gave students a much-needed opportunity to be open about their frustrations with the course, as well as to talk about what in particular was frustrating them. The relief in the classroom was palpable; naming and normalizing what students were feeling made their frustration feel smaller and more manageable. I now spend a lot of time listening to students when they are frustrated and having conversations about how physics is different than other courses they’ve taken. I focus on listening to where they’re at, validating their discomfort with my class, and assuring students it is something I will help them work through. The release students get from these conversations doesn’t prevent them from getting frustrated, but it keeps their frustration from festering into something worse. It also gives me the opportunity to help students find ways to channel their frustrations and engage productively in the class, which leads to less frustration down the line.

Share My Purpose

If students are going to sit with their discomfort and take risks, they need to know there’s a reason for what I’m asking of them. Rather than asking students to trust I have a purpose, I talk to students about the reasoning for my instructional choices. We talk about my decision-making both when it comes to the course as a whole and when it comes to individual lessons, which makes it clear to students that I know where I’m taking them. I think teacher-centered instruction feels to students a little like hiking a well-marked trail while following a guide who has a map; even if the trail is unfamiliar, you can see the direction you’re going and you know the guide will keep you on the right path. Active engagement feels more like trying to find your own way in the deep woods without a map or trail. Students need to be reminded that even when I’m hanging back, I know where we’re going, I have a plan to get us there, and I will intervene before anyone gets too far off track. Explaining my choices gives students that reminder, and helps them feel safer, which makes it less likely their discomfort will become frustration. Making it routine to explain my decisions also means I get a lot of benefit of the doubt from students when I make a move I haven’t justified; students trust I have a purpose and either ask about my goals long before they get upset or simply go along with what I’m asking of them.

Make Sure Students See Their Progress

I’ve found that students don’t always recognize how much they are learning when they are constructing knowledge themselves. I give short assessments almost weekly, rather than big unit tests every few weeks. This means students get frequent reminders that they are learning new physics content and making progress towards mastery. When students have evidence they are learning, they are more willing to go along with what I’m asking of them.

The deeper skills, like collaboration and science practices, are harder to track. On a regular basis, I take class time for students to reflect on their growth on these skills. I also try to notice when I’ve been quieter than usual during a discussion, when I manage to stay out of the way during a lab, or when I hear high-quality discourse, so I can point it out and contrast with where the class was early in the year. When students see what they’re gaining, their discomfort feels worth it.

Build Student-to-Student Relationships

If revealing ignorance in front of a teacher is nerve-wracking, revealing ignorance in front of a peer is downright terrifying. If students are going to try out ideas, offer an answer before they know what’s right, and take other intellectual risks, it isn’t enough for students to trust me; they have to trust each other, as well. I wrote about some of the concrete strategies I use in a previous post, but the most important piece has been a shift in the relationships I’m thinking about in my classroom. Previously, the main relationships I paid attention to were the ones between myself and my students; now, I work to cultivate positive relationships between students, as well. My students don’t all need to like each other, but they do need to be able to trust and support each other while they are in my room.

Teach Collaboration

Even when students trust each other, collaborating well is a skill, and a complex one. When I started Modeling, I underestimated how difficult it is for students to collaborate effectively, which meant my students spent huge amounts of time in ineffective groups, feeling frustrated and miserable, unsure how to improve their situation. Then I read Cohen and Lotan’s book Designing Groupwork: Strategies for the Heterogeneous Classroom and started using group roles, reflections on collaboration, and other strategies to teach my students how to work well together (I talk more about these things in the same post where I addressed student relationships). It turns out that when you teach students a skill, they get better at it, and when more students are in high-functioning groups, fewer students feel frustrated.

Give It Time

When I first switched to Modeling Instruction, I wasn’t comfortable or skilled with the instructional approaches, and my teaching was often clumsy at best. My first post-lab board meeting flopped and the first round of mistakes whiteboarding was a disaster. As the year went on, I forced myself to try again and gradually got more skilled at facilitating active engagement. It was incredibly uncomfortable to work through those lessons that went poorly, but I needed to fail to figure out how to get better. I spent a lot of time reflecting, I asked for help, and I latched onto evidence that my students were still learning physics during those lessons that felt very rough. With time, I got more skilled with this kind of teaching and more lessons got the results I wanted.

Time is also important for shifting students’ expectations about the course. Kids talk to each other and have heard what physics was like for older friends and siblings, so it felt like a bait and switch to them when I started Modeling. The second year, most of my students had heard about how I teach, so they registered for the course knowing it would have lots of group work and minimal lecture. By year four, I’d been using active engagement for as long as they’d been in the high school, which may as well be forever. At this point, even when students are frustrated, it doesn’t seem to occur to them that there is another way to teach physics, which makes for very different conversations than I had those first years.

Final Thoughts

Frustration, pushback, and other misery are common reactions to active engagement, but they aren’t inevitable. Creating a space where students feel safe, both with me and with each other, takes effort, but it means that students can accept, and even enjoy, the challenges of active engagement. It’s also worth noting that nothing I talked about here is one-and-done; these are things I work on from September all the way through May. This work isn’t easy, but seeing students not only rising to the challenge but enjoying themselves while they do it is well worth the effort.


Building a Whole-Class Culture

Going into this school year, I decided my biggest goal in regular physics would be to be intentional about the kind of class culture I was building. From a pedagogical perspective, I want the kind of classroom where students feel comfortable participating and taking intellectual risks. From an equity perspective, I want a classroom where students value working with diverse groups and every student is valued as they are. At the end of the year, my students let me know I’d made some important progress in this area when, on the last day of school, students talked about how much they would miss being in their particular physics class and the sense of community they felt with their peers. I don’t think there is any one thing I can attribute this success to; part of the credit certainly goes to the personality of this senior class, but there are a few things I did that I think played an important role.

Daily Check-Ins

For a few years now, I’ve had the very simple routine of stopping by each table while students are working in small groups and asking everyone how they are today. I didn’t have any intention or thought behind this habit until I had a student who wouldn’t let me have any other interaction with her group until I’d done the check-in. She also came to class every day with a plan for what she was going to tell me, so the ritual was clearly important to her. Since then, I get several notes from students each year that specifically comment on how much they love my routine of asking how they are each day and the way it makes them feel safe in my classroom. On my end, I really enjoy that I have a low-stakes, positive interaction with every student every day and I get to hear about what’s important to my students. If that makes them more comfortable letting me know when they have a question or when they need something, all the better.

Randomly Assigned Groups

Kelly O’Shea convinced me to try assigning visibly random groups that change frequently. She uses the list function on, but I ended up putting my roster into a spreadsheet made by Scott Lotze, the other physics teacher at my school. Making new groups almost daily ended up being one of the most impactful aspects of this strategy. The usual complaints about assigned groups and requests to switch groups disappeared very quickly since students recognized they only had to manage a challenging group for a day or two. In addition, my school is big enough that I usually have students in the same section who don’t even know the names of most of their classmates but, this year, within a few weeks, every student felt like they knew everyone else in the class at least a little bit. This made a huge difference in whole class discussions; without any changes to how I ran whole class discussions, students were more engaged, more willing to speak up, and more willing to question each other than in previous years. Students told me they felt more comfortable speaking up in physics than in other classes because they actually knew everyone in the room.
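The spirit of the spreadsheet described above can be sketched in a few lines of code. This is a hypothetical illustration, not the actual tool: the `make_groups` function and the sample roster are assumptions, shuffling a class list and dealing students into groups of a chosen size.

```python
# Hypothetical sketch of visibly random group assignment (not the actual
# spreadsheet mentioned above): shuffle the roster, then deal students
# into consecutive groups of a chosen size.
import random

def make_groups(roster, group_size=3, seed=None):
    """Return a list of randomly assigned groups; the last may be smaller."""
    rng = random.Random(seed)     # a seed makes the assignment reproducible
    shuffled = roster[:]          # copy so the original roster is untouched
    rng.shuffle(shuffled)
    return [shuffled[i:i + group_size]
            for i in range(0, len(shuffled), group_size)]

# Made-up roster for illustration:
roster = ["Ada", "Ben", "Cal", "Dee", "Eli", "Fay", "Gus"]
for group in make_groups(roster, group_size=3, seed=1):
    print(group)
```

Rerunning with a different seed (or none) produces a fresh set of groups, which is what makes near-daily regrouping cheap to do.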

Students also learned more when the groups changed frequently, especially when students started working problems with one group, then prepared a whiteboard with a new one. Inevitably, within the first few minutes of moving to the new groups, someone would ask the rest of the group “How did you do our problem?” which led to great discussions comparing different strategies and finding each other’s mistakes. While this mirrored some of the discussion that happened as a whole class during mistakes whiteboarding, this small-group discourse drew in every student in a way that is not possible in a whole-class discussion with a class of 33.

Group Roles

In my licensure coursework and in PD I’ve done over the years, I’ve been exposed to group roles numerous times, but always dismissed them as something unnecessary and a little silly for the older students I teach. Reading Cohen & Lotan’s Designing Groupwork: Strategies for the Heterogeneous Classroom finally shifted my thinking; they discuss the ways that group roles set the tone for what it means to contribute to a group and can disrupt patterns in who is granted status by their peers, which strikes me as especially important when thinking about the experiences of underrepresented students.

I developed a set of group roles based on conversations with Kelly O’Shea, the group roles from the University of Minnesota’s PER group, and the needs I saw in my classroom. I printed the roles on laminated cards so that students could have a description of their role, including some suggested sentence starters, on the table in front of them while working.

At the start of the year, I used the roles most days, sometimes letting groups decide who did what and sometimes assigning roles randomly. Regardless of how the roles were assigned, they served two important purposes. First, they communicated a clear expectation that every group member was involved and actively contributing to the task. Second, none of the roles required any physics knowledge, which made explicit that there are important ways to contribute to a group besides being able to tell everyone else the answer. Ultimately, these messages were more important than the roles themselves. An instructional coach observed me on the first day of a term, before I introduced the roles, and a day or two later when I’d assigned roles to students. He commented that while we saw very little evidence that students were using the official roles, students were much more engaged and collaborating more effectively during the second observation.

I don’t feel the need to use the roles all the time. I used them quite a bit the first two weeks of the school year, then less and less until the end of the first month, when I retired them for the term. At the end of each trimester, around half of the students in regular physics not only switch between hours, but switch between teachers, which tends to reset the class culture. To help with this transition, I had students go back to using the roles for a week or so at the start of each new trimester to make sure each new mix of students had the shared expectations that came from using the group roles built into their class culture.

Valuing Diverse Abilities

There are a lot of different skills and abilities that are critical to success in science, but students often have a limited view of what it means to be good at science. To try to shift that, I used a simple exercise from Cohen & Lotan’s Designing Groupwork: Strategies for the Heterogeneous Classroom where, after an activity, we did a debrief in which students identified some of the skills the task required and described how those skills were demonstrated by someone in their group. In those debriefs, it became apparent that it would be unreasonable to expect any one individual to have all of the skills required, which led naturally into a discussion of why it was useful to do the task in groups and encouraged students to consider how to take advantage of their peers’ strengths on future activities. It also gave students who see their strengths as incompatible with being a “science person” the opportunity to recognize the value they bring to a group.

During the first month of school, I picked one activity per week that we’d debrief, usually selecting one that I expected to generate a diverse list of required abilities. Like the group roles, this helped set a tone in the class, but became less necessary as students settled in. Similar to the group roles, I picked a few activities to debrief again at the start of each trimester when students moved between hours and between teachers, again ensuring that all students had certain shared expectations and beliefs about collaboration in my classroom.

In the future, I’d like to connect the skills students are identifying to something like Eugenia Etkina’s scientific abilities, Kelly O’Shea’s scientific competencies, or the science practices used in NGSS or AP sciences. There is a lot of overlap between each of these lists and the skills and abilities my students have identified in our debrief discussions this year, and I wonder if connecting what my students see as important to a list that feels more formal would give additional weight to their value in my classroom.

Frequent Reflection

Collaboration is a skill, and part of how you get better at any skill is evaluating your strengths and weaknesses so you can make a plan to improve. With that in mind, I had students complete some kind of reflection almost weekly. Some weeks, the questions were about using the group roles; some weeks I asked students to reflect on a list of things effective groups do that I originally got from Scot Hovan and posted at each lab table; and some weeks I used Colleen Nyeggen’s participation goals. All of the reflections were completed during class to ensure students saw the value I placed on them and, on the first few reflections of each term, I took the time to respond to something each student wrote to make it clear I was reading and thinking about what they had to say. Because it was clear that I valued the reflections, most of my students took them seriously, writing insightful comments and having meaningful conversations with their peers. With all of the reflections I used, I was able to get information about what was and was not going well with group work, and students were consistently thinking about how to be better members of their groups in physics.

What’s Next?

I mostly used these strategies in my regular physics classes, partly because I fall into the trap of thinking my AP students don’t need the same support; they come into my class more skilled at collaboration and more comfortable with each other. My AP classes also have very few students who switch between hours, and all of them stay with me all year. In spite of those advantages, by the end of the year, my regular physics classes were much tighter knit and typically had higher-functioning groups than my AP classes. That tells me it’s worth making the time to bring these strategies into my AP classes next year.

I also know there is more room to put equity at the forefront of my classroom. It’s fairly easy for students to drop courses at the end of a trimester and white girls and students of color drop the regular physics course at a higher rate than white boys. Next year, the other physics teacher and I are planning to use our PLC time to take a critical look at our classrooms to think about what in our classroom cultures reinforces this pattern and find changes we need to make.

My colleague and I also want to work on building a classroom culture where students value challenge. Most of the students who drop say the course is “too hard”, even when they are getting good grades. If we want to reduce our drop rate, one piece may be building a classroom culture where challenge is seen as something positive.


Linking Kinesthetic and Quantitative with Pivot Interactives

Peter Bohacek shared an interesting article with me that found students who’d had a kinesthetic experience with a bicycle wheel gyroscope not only performed better on an angular momentum assessment, but fMRI scans showed the sensorimotor parts of their brain became active while thinking about angular momentum. This validates my gut instincts that students should have lots of hands-on experiences, and I feel like I do a pretty good job of that in physics, but what does a kinesthetic experience look like in chemistry? I teach a basic chemistry course where concrete experiences are critical in developing student understanding and I think students could especially benefit from the kinds of kinesthetic experiences described in the article.


Gas laws ended up being a great place for me to start thinking about kinesthetic experiences in chemistry. Last year, I started doing a lab where students play with a sealed syringe, including heating it up in a water bath and manually changing the volume. Throughout, students are able to feel the pressure difference as the plunger pushes or pulls against their fingers, giving a great kinesthetic experience we can refer back to throughout the unit.

The trick has been connecting this experience to the equations. Feeling the plunger push back when they held it at the same volume in a hot water bath was enough to convince students that pressure goes up with temperature, but a lot of them struggle enough with math that they had a hard time seeing how the qualitative relationship from the lab fit with PV = nRT; the inverse relationship for volume was even tougher for students to make sense of! My students needed more of a bridge between the kinesthetic, qualitative experience and the math.

That’s where Pivot Interactives came in this year. As part of the Chemistry Fellows program, I’ve been piloting their new chemistry resources in my classroom and this seemed like a perfect opportunity. Since Pivot Interactives has several activities where students can collect data for the ideal gas laws and we’ve been working a lot on interpreting graphs this year, I was hoping that collecting their own data could serve as a bridge between the kinesthetic activity and the math.

After some discussion on the qualitative results with the syringes, including developing an operational definition of pressure, we fired up the computers to collect some pressure and temperature data in Pivot Interactives. Students got a nice, linear graph and I had them turn the slope into a “for every” statement to describe how much the pressure went up for every 1 degree of temperature increase. We also had a lot of discussion about how these results fit with what they’d observed previously with the syringes. By the end of the hour, students were on board that P = “stuff” x T and could clearly explain how their experience with the syringes supported that result.
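As a sketch of the arithmetic behind a “for every” statement, a slope from a linear fit reads directly as a rate of change. The temperature and pressure values below are invented for illustration, not data from the actual activity.

```python
# Sketch: turning a linear pressure-temperature fit into a "for every" statement.
# The (temperature, pressure) pairs below are invented for illustration.
temps = [20, 40, 60, 80, 100]                     # degrees Celsius
pressures = [107.0, 114.0, 121.0, 128.0, 135.0]   # kPa, perfectly linear here

# Ordinary least-squares slope: sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
n = len(temps)
t_bar = sum(temps) / n
p_bar = sum(pressures) / n
slope = sum((t - t_bar) * (p - p_bar) for t, p in zip(temps, pressures)) \
        / sum((t - t_bar) ** 2 for t in temps)

# The slope IS the "for every" statement.
print(f"For every 1 degree C increase, pressure rises by about {slope:.2f} kPa")
```

With real data the points won’t fall exactly on a line, but the fitted slope still gives students a concrete number to attach to “pressure goes up with temperature.”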

Volume was a little trickier. A lot of my students haven’t taken geometry and finding the volume of a cylinder was a big barrier for a lot of them on a lab earlier this year, so I was nervous about having them find the volume of the bubble. We did some whole-class discussion on what we could measure that would tell us about the volume of the bubble, and students readily settled on the diameter as a good option. The graph of pressure vs. volume still looked pretty inverse.

The discussion was also trickier. Students had felt the changes in pressure as they changed the volume of their syringe, so we had to spend some time working through how that connects to the Pivot Interactives video showing changes in volume as the pressure drops. It took some time, but students were eventually able to make the connection. It also took a little more work for my students to make sense of the graph. Since we don’t do linearization in my chemistry course, we weren’t able to make a “for every” statement about the graph, but students were able to recognize that as pressure went down, volume went up, and eventually got to V = “stuff” / P.
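One low-math check on an inverse relationship, short of linearizing, is noticing that the product P × V stays roughly constant; each product is the same “stuff” in V = “stuff” / P. The numbers below are invented for illustration.

```python
# Sketch: checking P * V ~ constant as a stand-in for linearization.
# These (pressure, volume) pairs are invented for illustration.
data = [(50, 2.0), (100, 1.0), (200, 0.5), (400, 0.25)]  # (kPa, L)

products = [p * v for p, v in data]

# If the products cluster around one value, the inverse model fits.
mean = sum(products) / len(products)
spread = max(products) - min(products)
print(f"P*V is about {mean:.1f} kPa*L (spread {spread:.1f})")
```

With messy measured data the products will scatter a bit, but seeing the same rough value repeat can make “inverse” feel as concrete as the linear slope did.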

After this series of labs, it was time to start working some problems. Last year, students struggled through gas law calculations and had a very difficult time reasoning through whether their answers made sense. This year, students frequently talked about their experiences with the syringes when making sense of a problem and were able to breeze through the calculations. I also saw the difference in much higher scores on the end-of-unit assessment.

Using the kinesthetic lab to introduce gas laws wasn’t new to me, but Pivot Interactives gave me new tools to build a bridge between what students experienced directly and what the calculations described. This proved to be an important piece in developing my students’ understanding of the material.


Kontra, C., Lyons, D. J., Fischer, S. M., & Beilock, S. L. (2015). Physical experience enhances science learning. Psychological Science, 26(6), 737-749.


Pivot Interactives for Make-Up Labs

This year, I’ve been able to pilot some of the new Pivot Interactives chemistry activities in my Chemistry Essentials course as part of their chemistry fellowship program. There is a much higher absence rate in Chemistry Essentials than in our other chemistry courses and one of the challenges I’ve been able to tackle with Pivot Interactives has been finding an approach for make-up labs that balances equity with a meaningful lab experience.

First, a little background on the course. My district offers four different chemistry courses, and Chemistry Essentials is designed to meet the minimum graduation requirements. Many of my students have seen limited success either in science in particular or in school in general and one of my challenges as a teacher is to make sure my students see my class as an opportunity to change the patterns they’ve experienced in other courses.

In my department, the standard approach when a student is absent from a lab has been to have them come in before or after school to complete it. The trick is that many of the same issues that keep a student from coming to class, such as obligations outside of school or transportation issues, can also make it difficult for them to come in outside of the school day. Even if I’m willing to bend for a student who talks to me, how many never do because they see coming in outside of school as just one more immovable barrier they face? This is doubly frustrating for students who have a study hall or similar space in the school day where they could make up the lab, but the lack of available space or staff to monitor lab safety means I can’t give students that opportunity.

My go-to has been to provide a make-up version of the lab with the data already filled in. While it gets away from requiring students to come in outside the school day, the data often feels like meaningless numbers when students don’t have any connection to how it was collected. Students also miss out on a lot of science practices, such as designing the experiment, using the necessary tools accurately, and the countless decisions that come with collecting your own data. While I think a student can make progress on these skills missing a lab here or there, a student who is gone frequently can easily miss out on a crucial part of the course.

Pivot Interactives has allowed me to give students something in between these two approaches. While it can’t completely replace the kinesthetic experiences that happen in an apparatus-based lab, students can still make qualitative visual observations and develop a clear understanding of where the measurements come from since they are seeing the experiment and taking the data themselves. I can also easily write a make-up version of the lab that includes similar experimental design and data collection decisions that students had to make in the classroom. At the same time, students can complete the lab when and where it works for them, rather than having to make a small window of time work. As a result, many of this year’s make-up labs have felt more to students like an actual lab experience rather than a box to check using disembodied data.


Where Does the Energy Go?: Using Evidence-Based Reasoning to Connect Energy and Motion

This post appears as an article in the January 2018 issue of The Science Teacher.

Stoeckel, M. (2018). Where does the energy go?: Using evidence-based reasoning to connect energy and motion. The Science Teacher, 85(1), 19-25.

PDF download


Defining Electric Potential Difference by Moving a Multimeter’s Ground Probe

This post appears as an article in the January 2018 issue of The Physics Teacher.

Stoeckel, M. (2018). Moving multimeter ground to define electric potential difference. The Physics Teacher, 54(24), 24-25.

PDF download


Making the Move to Standards-Based Grading

This spring, I’ve spent a lot of time analyzing how the year went and trying to identify my biggest frustrations. My goal isn’t to wallow in negativity; I’m much more interested in figuring out what I can do differently next year to reduce or eliminate those frustrations. As I reflected on the year, I identified my two biggest frustrations:

  • My students, at least near the start of the year, are very focused on points and this makes it difficult for them to take risks or try something they don’t have step-by-step directions for. This isn’t a surprise since most of my students are 12th graders who’ve done very well in school by focusing on points. While most students got comfortable not always knowing the answer immediately by the end of the year, I’d like to make that transition faster and less painful.
  • Many of my students did a brain dump after each test and at the end of each term. I quickly found that when I wanted students to build on concepts from a previous unit or see connections to a topic from last trimester, I had to build in time to review the earlier concept. Like the focus on points, this serves students very well in most classes, including ones I’ve taught, and the majority of students eventually made the necessary shifts, but I’d like to help them make the jump much sooner.

Both of these frustrations are promoted by my grading system. I’ve used a fairly traditional gradebook where I record scores for selected labs and problem sets in one category and scores for large unit tests in a separate category (with a larger weight). Of course when my grading system is built on accruing points students will focus on points! Of course when we have a unit test on some arbitrary date, then move on to a brand new topic with its own big test students will mentally move on, as well! Clearly, it is time for me to take a new approach for grading.

I decided to dive into standards-based grading (SBG). The key idea is that instead of receiving scores on specific assignments (such as a unit 1 test or chapter 12 test), students receive scores on specific objectives. This steers the focus away from points in the traditional sense and toward what students truly need to know. Another key feature is that students have the opportunity to reassess standards, usually with the new score replacing the old one. This dramatically lowers the stakes for students. They can take a risk, trying a new approach to a problem or a lab, knowing that if they fail, they can always try again. In addition, students can’t get away with forgetting what they learned, since every standard will be assessed multiple times. In many cases, teachers record only the most recent score for a standard, even if it goes down, with the goal of making a student’s final grade represent their knowledge and skills at the end of the course.

This is also good timing for a shift in my grading practices. My building has had a group of teachers studying the issue of grading for a few years, and they have arrived at several grading practices that every teacher in the building will need to follow next year, which means I’ll be making some changes no matter what, so I may as well make some big ones. The first task, however, is to make sure I can fit SBG into next year’s requirements.

Requirement 1: Grades will have three weighted categories: summative (75%), formative (15%), and cumulative final (10%).

A major tenet of SBG is that students should have the opportunity to practice and master content without being penalized for mistakes, so the formative category isn’t in line with SBG, but I think I know how I’d like to approach this requirement. The summative category is where I’ll place the course objectives. To keep things simple, I’ll update scores on each objective every time it is assessed so that only the most recent score affects a student’s grade. The formative category is where I’ll record scores for the formal lab reports I have students write (usually two per trimester). I see the lab reports as addressing overarching skills, such as scientific practices and communication, that I would like to include in grades but are much broader than the typical content objective. At this point, I’m comfortable placing the lab reports in the formative category in order to give those skills more weight than a single objective.

Requirement 2: In-progress and final grades will be reported as a percentage and mapped to a traditional letter grade.

Our gradebook software reports student percentages to two decimal places, a level of precision I don’t think I’m capable of as a grader. But it’s what we have, and percentages aren’t going away in my district any time soon, so I need to figure out how I’m going to work within those confines. For now, my plan is to simply make each objective an assignment worth whatever maximum I set my scale to. The summative category will then be worth points equal to the number of objectives x the maximum possible score on each objective. The software will then take an average that it uses as a student’s grade in the summative category.
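The arithmetic above can be sketched out; the objective names, scores, 4-point maximum, and the formative and final values below are all invented for illustration, while the 75/15/10 weights come from the building requirement.

```python
# Sketch of the gradebook arithmetic described above.
# Objective names, scores, and the 4-point maximum are invented for illustration.
MAX_SCORE = 4
objective_scores = {"constant velocity": 4, "constant acceleration": 3,
                    "balanced forces": 2, "unbalanced forces": 3}

# Each objective is an "assignment" worth MAX_SCORE points; the summative
# category percentage is total earned over total possible.
summative_pct = sum(objective_scores.values()) / (len(objective_scores) * MAX_SCORE)

# Other category scores (invented values for illustration).
formative_pct = 0.90   # e.g., formal lab reports
final_pct = 0.80       # cumulative final

# Category weights from the building-wide requirement (75/15/10).
overall = 0.75 * summative_pct + 0.15 * formative_pct + 0.10 * final_pct
print(f"Summative: {summative_pct:.2%}, overall grade: {overall:.2%}")
```

This is also where the hiding problem shows up: an averaged summative category can look respectable even when one objective score is very low.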

The main issue I have with this approach is that a student could conceivably get a respectable grade with no progress toward mastery on some objectives (this can happen just as easily in a traditional grading system; it’s just easier to hide). For this year, I want to keep things simple, so I’m planning to just keep this in the back of my mind; I doubt I’ll see a significant number of students who do well overall but ignore a few key standards. Down the line, I may try conjunctive SBG, where certain standards are required to earn a passing grade. I may also consider giving certain standards more weight in the gradebook, either because they are more complex or more crucial to future learning than the other standards.

Requirement 3: Every class will have a culminating activity during the final exam period.

While I haven’t found much on final exams in the SBG materials I’ve read so far, giving significant weight to what a student does during a certain 90-minute period seems to go against much of the thinking behind SBG. Ideally, what I’d like to do is move away from a traditional written exam, where students do an assortment of problems from throughout the trimester, and toward a more authentic assessment. One option would be an open-ended project, such as Casey Rutherford’s final project, where students must come up with a physics question, then collect data to answer it. Another option would be to follow my district’s STEM integration efforts and develop an engineering design challenge where students must apply physics to solve some kind of real-world problem. The trick here would be to come up with something where students would truly have to apply their physics knowledge in a meaningful way.

Requirement 4: Scores no lower than 50% will be recorded for any summative assessment students attempt.

I absolutely agree with this requirement; it makes no sense that most grades cover a range of 10 percentage points (less if you count grades with a + or -) while an F covers 60 percentage points. The main trick is what it will look like to follow this guideline using SBG. My plan is to give students a numerical score for each standard, and have the floor at half the points. For example, many teachers who use SBG give their students a 1, 2, or 3 on each standard they attempt. I will probably give a 2, 3, or 4, instead.

Requirement 5: Students will have at least one reassessment opportunity on all summative assessments.

This requirement is very much in line with SBG; the only question is how I want to manage reassessments. In a good physics class, there is some spiraling of content that happens naturally, and I plan to treat that as one option for reassessment. For example, I had some students this year who did poorly on linear constant acceleration but, by the time we finished projectile motion, were nailing complex problems that used the same skills. In those cases, I would have no problem updating a student’s scores for constant acceleration objectives.

I also want to offer more explicit reassessment opportunities. I am a fan of Sam Shah’s reassessment application and am planning to modify it for out of class reassessment. I really like that he forces students to reflect on what got in their way and to articulate what they’ve done to improve, rather than allowing students to take the all-too-familiar approach of just trying again on the assumption that it will go better.

I’m also considering Kelly O’Shea’s “test menus” for in-class assessments. It sounds fairly easy to manage for a large number of students (which is important, since my average class size will be somewhere above 30 next year) while still providing significant student choice in their assessment.

What’s Next

I feel like I’ve got the broad strokes in place for next year, but there are still a lot of details to work out. My next big task will be to revise my objectives. My district has been using learning targets (a certain flavor of objective) for a few years, but we didn’t have much dedicated time to work on objectives, so mine are, at best, mediocre. If they are going to become the basis of my gradebook, I need to put in the time to write clearer, more precise learning targets.

My other big summer task will be to finalize (at least for now) some of the details for how I want to grade and revise my syllabus accordingly. While I fully expect to revise my syllabus and details of my grading system as the year progresses, I need to have some of the structure worked out before the fall to help students feel some sense of security in this new adventure.



Engineering to Learn Science

I teach a one-trimester 9th grade course called Engineering & Physical Science.  For the engineering standards, I fell into the same kind of engineering projects that I’ve seen many science teachers fall into.  My students did a short straw tower project that did an okay job of teaching the nature of engineering standards from the Minnesota Science Standards and was something the students enjoyed, but connected to science concepts at a superficial level, at best.  I was well aware that, since students were not able to apply their knowledge in a meaningful way to design their towers, the project was really just tinkering.

This summer, thanks to a combination of my district’s participation in the University of Minnesota EngrTEAMS project and a generous grant from 3M, I was able to not only get some professional development over what good engineering instruction looks like, but I got the significant curriculum writing time and the materials budget that developing more meaningful engineering instruction takes.  The past two weeks in my 9th grade classroom, I had the rewarding experience of implementing the unit I developed with a teacher from St. Paul Public Schools and an instructional coach from EngrTEAMS.

The unit began with instruction over Newton’s Laws to prepare students to design a cargo carrier that would protect an egg in a head-on collision after rolling down a ramp, a variation on the classic egg drop project.  To keep students focused on the cargo carriers, where they could apply Newton’s Laws most directly, we provided cars the carrier could Velcro to.  The cars also had a spot on the front to attach a Vernier Dual-Range Force Sensor to measure the impact force when the vehicle crashed.


A student project ready to head down the ramp

Realistically, students could create an effective cargo carrier without knowing anything about Newton’s Laws, so a major instructional task has been to give students a reason to make the connection. Next week, students will be delivering presentations where they make a pitch for their design, which must include references to Newton’s Laws to justify design decisions, and they will have a chance to share what they’ve been thinking with the entire class. To prepare students for this task, I’ve been spending a lot of time going from group to group to ask them to explain what they are doing, and I’m excited about the results. Students who are normally checked out were not only able to articulate connections between Newton’s Laws and their designs, but some even started participating in class discussions intended to extend their understanding to other contexts. Even when I just listened in, rather than asking about connections to Newton’s Laws, students had a lot of great conversations about how to use Newton’s Laws to improve their designs.

The past week and a half, while students have been designing, building, and testing, my classroom has been chaos, filled with noise and mess and activity.  Because that chaos is a result of students who are engaged and excited about their work, I was glad to embrace it.  My challenge now is to find ways to bring some of that same energy and ownership into other topics in the 9th grade course.
