Category Archives: teaching

Making Progress

My class made some predictions about car data, without seeing it, and came up with 3 claims:

  1. The heavier the car, the lower the MPG.
  2. Electric cars will have a lower curb weight (than non-electric cars).
  3. Gas powered vehicles will have higher highway MPG than electric or hybrid vehicles. (We think this was written incorrectly, but didn’t catch the error, so decided to go with it.)

We focused on claim 1 first. Students easily produced the scatter plot …

[Image: scatter plot of highway MPG vs. curb weight, all cars]

and concluded that there didn’t appear to be much of a relationship between highway MPG and curb weight. But they wanted to quantify it – evidence has to be clear, after all.

[Image: the same scatter plot with a fitted regression line]

Because of the viewing window, the line looks kind of steep. But the slope of the line is -0.01 highway MPG per pound, so it’s really not very steep at all. And the correlation coefficient is -0.164, so that’s a pretty weak relationship when we group cars of all fuel types together.
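If you want to reproduce this kind of check yourself, here’s a minimal sketch, assuming a CSV with hypothetical column names curb_weight (pounds) and highway_mpg – the post doesn’t include our actual dataset or software.

    # Minimal sketch: fit a least-squares line and report slope and r.
    # "cars.csv", "curb_weight", and "highway_mpg" are hypothetical names.
    import pandas as pd
    from scipy import stats

    cars = pd.read_csv("cars.csv")
    fit = stats.linregress(cars["curb_weight"], cars["highway_mpg"])
    print(f"slope: {fit.slope:.4f} MPG per pound")  # we found about -0.01
    print(f"r: {fit.rvalue:.3f}")                   # we found about -0.164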

Are there different relationships for the different fuel types?

[Image: scatter plots of highway MPG vs. curb weight, separated by fuel type]

Turns out, yeah.
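A sketch of that check, using the same hypothetical file and column names as above (including a fuel_type column that may not match what any real dataset calls it):

    # Sketch: recompute the correlation within each fuel-type group.
    import pandas as pd

    cars = pd.read_csv("cars.csv")  # hypothetical file, as above
    by_fuel = cars.groupby("fuel_type").apply(
        lambda g: g["curb_weight"].corr(g["highway_mpg"])
    )
    print(by_fuel)  # one correlation coefficient per fuel type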

After some individual analysis, some discussion, and a scaffold to help organize their work, students shared their claim-evidence-reasoning (CER) paragraphs refuting claim 1.

Working on the quality

Step one was getting my students to write these CER paragraphs. (I’ve written about this before and how disastrous my efforts were.) Step two is improving the quality. I shared a rubric with my students.

[Image: CER quality rubric]

We all sat around a table (it’s a small class) and reviewed all of the paragraphs together. They talked, I listened and asked clarifying questions. They assessed each paragraph. They decided that most of their paragraphs were below target. They said things like:

  • “That’s some good reasoning, but there’s no evidence to support it.”
  • “I’d like to see some actual numbers to support the claim.”
  • “I really like how clearly this is stated.”

Even though it took time to review, it was worth it.


Filed under BMTN, teaching

Learning from Failures

Continuous improvement in my practice is about identifying a specific process that can be improved, applying a change idea, collecting data, and analyzing the results. This term, I am attempting to apply change ideas in my Statistical Analysis class. This is a twelve-week introductory class focusing mostly on descriptive statistics. My goal is to have my students reason more about what the statistics are telling them and to justify their claims with evidence. Our 9th grade team has put an emphasis on the structure of claim-evidence-reasoning across the content areas, meaning that students are using this structure in humanities, science, and math. I wanted to continue that structure with my 10th graders in this statistics class. So I revamped my approach to the course.

My idea was to use claims to drive the data analysis. It started off well enough. I created some claims and used a Pear Deck to ask students to consider the kind of data that they might need to collect and analyze. (Pear Deck allows them to think individually and respond collaboratively.) Here are the claims:

  • Women who win the “Best Actress” Academy Award are typically younger than men who win the “Best Actor” Academy Award.
  • Sales of vinyl records are rising and will soon overtake the number of digital downloads.
  • Opening box office for sequels in movie franchises (for example, Captain America, Star Wars, Harry Potter, Hunger Games) is typically higher than for other movie openings.
  • LeBron James is the best professional basketball player of all time.
  • For-profit colleges are more likely to recruit low-income individuals for admission.
  • More African American males are incarcerated than any other group of Americans.

Conversation around these claims also included predictions about whether or not the students thought they were true.

Remember, though, the goal was to use the structure of claim-evidence-reasoning, and my kids needed a model. So I gave them this one. After a conversation with a humanities colleague, the students analyzed my example using the techniques they learned in humanities class (highlighting claims and evidence in two different colors). This led us to create “criteria for success” and a structure for a five-paragraph essay. The analysis showed me that my example could be improved, so I came back after Christmas break with a second draft. We had some discussion about what had changed and whether the second draft was an improvement. Seemed like all was well. Time for them to “have at it.”

But I wanted them to practice with a single, agreed-upon class claim first. So we brainstormed lots of different claims they could research and settled on:

Original films are typically better than newer entries or sequels.

They had this document to remind them about what to write, and off they went to collect whatever data they thought was relevant. And then snow season began. During the first 3 weeks of January we had only 6 classes, thanks to holidays, workshop days, snow days, and a broken boiler (no heat). Even though we ask kids to do work during snow days, my students were making very little progress on this assignment. Colossal failure. I gave them too much all at once. They were wallowing in the data collection.

I regrouped. I looked at all of the data that they had collected and gave them this data set to analyze and this document to write their essays. Problem solved, right? Wrong, again. Still too much. At the end of week 4 of this “practice” assignment (interrupted by two more snow days), and after talking with my Better Math Teaching Network colleagues and my humanities colleague, I realized that I had never actually taught them how to write a paragraph that interprets a specific kind of statistic (even though they had examples).

So, at the end of January, I tackled how to write those body paragraphs. We started with writing about means and medians. Given these box plots of average critics’ ratings, I asked students to write what the medians say about the claim.

[Image: box plots of average critics’ ratings, originals vs. sequels]
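(A rough sketch of that setup, for anyone following along at home – the ratings below are invented placeholders, since our data set isn’t posted here:)

    # Sketch: medians and box plots of critic ratings by film type.
    # The numbers are invented, not our class data.
    import pandas as pd
    import matplotlib.pyplot as plt

    ratings = pd.DataFrame({
        "film_type": ["original"] * 5 + ["sequel"] * 5,
        "critic_rating": [81, 74, 90, 68, 77, 62, 70, 55, 83, 60],
    })

    print(ratings.groupby("film_type")["critic_rating"].median())
    ratings.boxplot(column="critic_rating", by="film_type")
    plt.show()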

I figured it would take them about 5 minutes to write, so I thought we’d be able to critique the paragraphs before the end of class. Wrong, again. But we were able to take a look at what they wrote during the next class. (It’s a very small class.)

I called on my humanities colleague once more and she helped me to create some scaffolding to help them organize their thoughts. This time, the focus was on variability. Each group of two received one of the variables to analyze and organize a paragraph around. Once again, we shared the paragraphs they wrote for each measure. I’m not sure how I feel about this, since all of the paragraphs are basically the same. But I guess the point was to focus on the statistics that they included as evidence and not the specific language used. Were the paragraphs “quality”? Here’s a first draft of a rubric to measure that.

As January turned into February, and the snow-making machine really kicked in, I cried uncle on this, feeling like I had eventually learned something – along with my students – and decided to move on. (We only have 12 weeks, after all.) I’m not sure if this is one iteration, two iterations, or three iterations of my change idea. However many iterations it is, it led me to a slightly different approach with scatterplot analysis.

But that’s another blog post.



Filed under BMTN, teaching

“I’m not good at math.”

I can’t tell you how many times I’ve heard this from students. I guess I was lucky. Growing up, nobody ever told me that I wasn’t good at math (or anything else, really). More importantly, nobody ever made it okay for me not to be good at math, or school, or whatever I was interested in learning about. But not all of my students have the family support that I did (and continue to have). So part of nurturing their talent falls to me. I’ve always told my students that I want them to be fearless problem solvers – to face life unafraid of what lies ahead. To nurture this, I have to allow space within my classroom for some of the “messy stuff”: playing around with patterns and numbers, wondering about data and statistics, or building seemingly impossible 3D geometric structures. And then I point out how they just did math – and they were good at it.

You see, when my students say, “I’m not good at math,” they really mean, “I’m not good at quickly recalling math facts when I think everyone is looking at me and waiting for me to respond in some brilliant way.” They equate math with arithmetic and math facts and speed. I try to point out the difference between math and arithmetic (math facts), which sometimes helps. I tell them how bad I am at subtracting numbers quickly in my head.

So what do I do to develop fearless problem solvers? I pose a problem for my students to solve. Then I step back. I observe. I listen. I ask questions. I make them all think before anyone is allowed to speak. I make them talk to me about what they’re thinking and I make them talk to each other, especially to each other. That way I get to listen more. I practice wait time, sometimes for awkwardly long, silent moments. Eventually, I no longer hear, “I’m not good at math.” Except when they want to compute something quickly, on the spot, in the moment, and it isn’t working. And then they turn and say, “Sorry, arithmetic.”


Filed under MTBoS Challenge, teaching

#LessonClose versions 1.1 & 1.2

WordPress tells me that I created this draft 3 months ago. I had every intention of updating along the journey of my Lesson Close adventure. Alas, that didn’t happen. Here’s what did happen …

I found it very difficult to decide, in the moment, which survey to send to students. So, I edited the survey to allow the students to choose what they wanted to tell me about the class – what they learned and how they learned it. I used the same survey structure as before, but this time students made a choice. I honestly thought that, given a choice of what to reflect on, students would engage more. Wrong.

I asked them what happened: too many choices, completing it electronically was too much of a hassle, and there wasn’t enough time at the end of class to complete it.

Enter version 1.2: paper option, fewer choices, a few extra minutes. Still didn’t work. So I asked again: Still too many choices, still not enough time. One student said, “Even though the posting reminder came up with 5 minutes to go, our conversations about the math were so engaging that we didn’t want to stop to do a survey.” Another said, “The first question was fine, but I really didn’t want to take the time to write stuff for the second question.” This was the general sentiment.

When I reflected on this sequence of events with my colleagues at the Better Math Teaching Network, one teacher (who also has several years of teaching experience) said, “I feel like exit slips are just data for someone else who isn’t in my classroom. I know what my kids know and what they don’t know because I talk with them.” And I thought, she’s absolutely right. Here I was, trying to do something with exit polls – trying to get my students to reflect on the class, to be meta-cognitive about their learning. They were telling me through their actions and class engagement that they were learning just fine, thank you.

I have lots of formative assessment strategies, but this is the last time that I try to implement exit slips for the sake of implementing exit slips. I know what my kids know because I talk to them.


Filed under BMTN, teaching

Tidbits from the Week

I tried my #LessonClose a couple of times this week in my 3D geometry class. The first time I used the Collaboration poll and the second time I used a new Learning from Mistakes poll. Both polls were given while working on the question: “Which Platonic solids can tessellate space?” It was clear that cubes would work, but there was some disagreement about the tetrahedron.
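(An aside for readers, not part of the class discussion: one standard way to settle the tetrahedron question is with dihedral angles. A regular tetrahedron’s dihedral angle is

    arccos(1/3) ≈ 70.53°,

which doesn’t divide evenly into 360°, so regular tetrahedra can’t close up around a shared edge on their own. The cube’s 90° dihedral angle does: exactly four cubes fit around an edge, which is why cubes fill space.)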

Student comments about how they collaborated included:

  • I was part of the conversation when we were brainstorming answers
  • I participated but did not lead
  • I argued about shapes that would or wouldn’t work
  • Everyone’s voice was heard

Student comments about Learning from Mistakes included:

  • I assumed an angle stayed the same when rotated on the plane. This turned out to be false, and I had to later revise my answer.
  • I forgot Trig, and I may have messed up a little bit with the answer of #1
  • A few times, we started to follow a faulty train of logic, based on angle assumptions, that messed us up until we figured it out.
  • I wasn’t exactly wrong just wasn’t very sure. My group had made a prediction that was actually true.

I’m finding it difficult to decide on the specific poll to give. I might create a new poll that lets the student select which aspect they would like to give me some information about. This is the heart of the PDSA cycle – learning quickly and making changes to improve.


Earlier this week, Dan Meyer wrote this post about explanations and mathematical zombies, using z-scores as an example. In the comments I shared an activity that I’ve used, one that I posted about last year. It so happened that it was time for that activity again this week. In both classes, students were able to develop the reasoning behind the formula through discussion. One student even described what we were doing as a transformation to the standard normal distribution. Never once did we write a formula.
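For readers who want the formula we never wrote: the transformation the student described is standardization, which re-expresses a value x drawn from a normal distribution with mean μ and standard deviation σ as

    z = (x − μ) / σ

so that the resulting z-scores follow the standard normal distribution (mean 0, standard deviation 1).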


Once again my Flex Friday work connects me with students who are new to Baxter Academy. This year we are teaching them skills through workshops. This Friday’s morning workshop employed design thinking: asking and answering “How Might We … ” (HMW) questions. (For more about this approach check out The Teachers Guild or the d.school at Stanford.) Once you have a bunch of HMW questions, you attempt to answer each one in as many ways as you can think of. My colleague called this “thinking around the question” and illustrated it with the question, “What’s half of thirteen (or 13)?” And here’s what we came up with.

[Photo: whiteboard of answers to “What’s half of thirteen?”]



Filed under Flex Friday, problem solving, teaching

Lesson Closure & Exit Polls – Images

I’ve received a couple of requests for some larger images from the last post on Lesson Closure. Here’s my attempt at providing them.

First, the process map.

[Image: exit poll process map]

A few exit poll examples.

[Images: sample exit polls]

I have a few other exit polls, but you get the idea. One question to rate the day and one question to elaborate a little bit.


Filed under teaching

Lesson Closure & Exit Polls

At the end of June I wrote about Continuous Improvement and promised that I would share updates throughout the upcoming school year. Well, here’s update #1, thanks to a great post by @druinok about Closure and Exit Slips.

Just like the post says, I, too, have always struggled with wrapping up lessons before the bell rings. Okay, we don’t have bells, but there comes a time when the students have to move on to another class. Too often, it seems like we are all so involved that the time just creeps up on us and off we go. That means that I have to rely on my gut instincts to plan for the next day. After so many years of teaching, it seems to work, at least from my perspective, but am I really serving my students in the best possible way that I can?

As a member of the Better Math Teaching Network, I had to come up with a plan – something in my practice that I can tweak, test, and adjust with ease. So, I decided to focus on class closure. Since I don’t have an actual process for this, I had to think intentionally about what I might be able to do. I created this process map:

[Image: exit poll process map]

I focused on the final 10 minutes of class. Who knows if this is appropriate or not. That will be one of the adjustments that I will have to make, I’m sure. But, I have created a set of Google Forms that are designed to solicit some focused feedback that I’ve designated as “process” or “content” oriented. Here is a sampling of “process” Exit Polls I’ve created:

[Images: sample “process” exit polls]

And for “content”:

[Images: sample “content” exit polls]

What I like about the Google Form is that I anticipate it will be easy for students to access (most have smart phones, all have laptops) and I can post a link on the Google Classroom.

I am hopeful that this process, this structure, will push me to gather deliberate and intentional data from my students so that I am able to plan better each day. Time will tell.


Filed under teaching

Continuous Improvement

How do teachers improve their practice? This is a question I have been asking for my entire career (over 25 years). During the past year, I was involved with a group of high school teachers, coaches, administrators, and researchers working on how to study improvement scientifically. In our case, the focus was on improving student engagement, specifically in Algebra 1. Since this course is seen as such a gateway into high school mathematics, if we cannot help students to engage, we are narrowing their future opportunities. So we tried this new (to me) approach called PDSA (Plan, Do, Study, Act). You set a goal, decide how you will measure your progress toward the goal, make some predictions, collect the data and analyze it, then revise. These cycles are meant to be short, 1 to 2 weeks.

What did we do?

My small group focused on student communication. Students often seem reluctant to share their thinking, so we devised a protocol called “Structured Math Talk,” during which students were given a task to work on individually for a few minutes and then turned to talk with a partner. The partner talk was timed and taken in turns: one partner talked while the other listened, and then they switched. This is our first PDSA form. It turned out to be quite challenging to gather this data. We were teaching under different circumstances: some of us had 55 minute classes that met every day, some had 80 minute classes that met every other day, and others had 90 minute classes that met every day. Trying to figure out the right amount of time that constituted that 1 to 2 week cycle was a challenge. (Plus, I often forgot to have students complete the exit slips.) But, it was clear that our students were compliant. We asked them to talk about math and they did. We were concerned, however, that they were only talking to each other because of the structure we imposed. Would they continue to share their thinking with each other even when we weren’t watching? This was our revision for PDSA cycle 2.

Our data was showing so much success that we questioned our entire process. Are we asking the right questions on the exit slip? Do our students understand the questions on the exit slip? Are we using the right kinds of tasks? Are we asking our students to engage in meaningful mathematics? So, we paused. We went to the ATMNE 2015 Fall Conference together. We read. We learned. We regrouped and refocused on the idea of productive struggle. That would feed the conversations, get our students to persevere, and push us to make sure that we were providing meaningful mathematical tasks.

What did I learn from this experience?

  • It’s difficult to document the small adjustments that teachers make every day, all the time. It’s difficult to be scientific about those small changes that happen in the moment. It’s important to develop a mindset of doing this, however, because that is how we can help each other improve.
  • I’m not sure we were asking the right questions. Not the right question to study, not the right questions of our students, and not the right questions to help us learn.
  • My students are generally willing to engage in whatever task I throw at them. It was never a problem for me to get them to talk to each other or to try something that they had never done before.
  • This process is an adaptation of W. Edwards Deming‘s process cycle. My brother has done this work for 30+ years and is an expert in Lean management techniques.

What’s next?

The small group has expanded and we’re now known as the Better Math Teaching Network. Our first meeting is in July, a 4-day institute where I hope to share my new learning with others and learn better techniques for meaningful data collection. The trick, I think, will be to ask the right questions.


Filed under teaching

More 3D Geometry


After my last post, Mike Lawler gave me all of these awesome ideas for my 3D geometry class. Considering that my class has been working on nets, I was most fascinated by the dodecahedron that folds into a cube, which came from Simon Gregg.

When I first watched the gif animation, I just couldn’t figure out what was going on. I thought, “I’ve got to show this to my students!” Thursday was that day. I tasked them with a build challenge. Of course 55 minutes wasn’t enough time to complete anything, but students had drawings (which gave us insight into the construction), a CAD rendering (completed during a snow day), and a previously constructed dodecahedron that had been re-purposed (completed during lunch).

So, thanks Mike, for the inspiration, and thanks #MTBoS for being there helping us to support each other.


Filed under MTBoS Challenge, teaching

“How Do We Know That?”


I’m teaching this 12 week geometry class focusing on 3-dimensional figures. It’s a brand new class, like many at Baxter Academy, so I get to make it up as I go. Since our focus is on 3-dimensional figures, I thought I would begin with some Platonic solids. So I found some nets of the solids that my students could cut and fold. Once they had them constructed, there was a lot of recognition of the different shapes and, even though I was calling them tetrahedron, octahedron, and so on, many of my students began referring to them as if they were dice: D4, D8, D12, D20. Anyway, I must have made some statement about there only being 5 Platonic solids, and they now had the complete set. One student asked, “How do we know that? How do we know that there are only 5?” Great question, right?

I really felt that before we could go down the road of answering that question, my students needed a bit more knowledge and exploration around these shapes, and maybe some thinking around tiling the plane would help, too. So we spent some time trying to draw them, counting faces, edges, and vertices, visualizing what they might look like with vertices cut off, unfolding them into nets, and wondering why regular hexagons tiled a plane, but regular pentagons did not. We played around with the sides – a lot – and even talked about this thing called vertex angle defect. Then we returned to the question of why only five. Students were able to connect the need for some defect (angles totaling less than 360 degrees) and the ability to create a 3-dimensional figure. Through the investigation, they were able to see that the only combinations of regular polygons that worked (by sharing a vertex) would be 3, 4, and 5 equilateral triangles, 3 squares, and 3 regular pentagons. They could give solid reasons why 6 triangles, 4 squares, 4 pentagons, and any number of other regular polygons could not be used to create a new Platonic solid.
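(If you want to verify their reasoning quickly, the whole argument fits in a few lines of code – my sketch, not a class artifact:)

    # Why only five: put k copies of a regular n-gon around a vertex and
    # keep combinations whose angles total less than 360 degrees.
    for n in range(3, 7):              # triangle, square, pentagon, hexagon
        interior = 180 * (n - 2) / n   # interior angle of a regular n-gon
        for k in range(3, 7):          # number of faces meeting at the vertex
            if k * interior < 360:
                print(f"{k} regular {n}-gons: defect {360 - k * interior:.0f} degrees")
    # Prints exactly five combinations: 3, 4, or 5 triangles; 3 squares; 3 pentagons.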

I had not anticipated this question, and had not included it in my plans. But, because it was asked, thankfully, by a student, it pushed us into thinking more deeply about these shapes (and their definition). And, ultimately, my students were able to answer the “why only five” question for themselves.


Filed under MTBoS Challenge, problem solving, teaching