Monthly Archives: July 2017

Claim, Evidence, Reasoning: Final results

In this final post about the Claim, Evidence, Reasoning approach to teaching statistics, I will share some student results. The fundamental questions in the PDSA approach to reflecting on and improving practice are:

  • Will students engage?
  • Will students learn what I am attempting to teach?
  • Will students produce quality work?

Nearly all of the students in these two classes had prior experience with statistics, which allowed me the freedom to find a new approach. That said, there were definitely times when it became clear that some content instruction was needed, especially when we got into correlation and linear regression. But instead of trying to front-load all of the content, I waited until the need arose. For example, in looking at what students wrote about the class data, it became clear that some instruction about regression lines and correlation coefficients was needed.
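
For readers who want the nuts and bolts, here is a minimal sketch of those two ideas, using hypothetical height and arm-span values rather than the actual class data:

```python
# Illustrative only: hypothetical paired measurements, not the actual class data.
import numpy as np

height = np.array([150, 158, 163, 170, 175, 181])    # cm
arm_span = np.array([148, 157, 165, 171, 173, 184])  # cm

# Correlation coefficient: strength and direction of the linear relationship.
r = np.corrcoef(height, arm_span)[0, 1]

# Least-squares regression line: slope and intercept of the best-fit line.
slope, intercept = np.polyfit(height, arm_span, 1)

print(f"r = {r:.2f}")
print(f"predicted arm span = {slope:.2f} * height + {intercept:.1f}")
```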

Now, to answer those questions.

Will students engage?

They didn’t at first – in that disastrous failure only 10% completed the first assignment. But I certainly learned from that experience, regrouped and restructured my approach. And then they engaged. My data show that 100% of my students engaged with the class, process, and content at some point and that 90% engaged consistently by the end of the term.

Will students learn what I am attempting to teach?

I was attempting to teach my students how to take the claim, evidence, reasoning process that they had previously learned in humanities and science and apply it to statistics. Reviewing work against the rubric helped to build an understanding of what quality looks like. It also kept us focused on the goal of claim, evidence, reasoning. By the end of the class, 95% of students were able to review statements through this lens and identify whether or not they were on target.

Will students produce quality work?

This is the big question, right? It’s great if they engage – that’s the first step – but if they aren’t producing quality work, then what have they actually learned? Here are some representative examples of student work.

Analyzing movie data – This assignment followed the best actor/actress investigation.

Education vs unemployment, Vinyl vs digital album sales, Juvenile incarceration rates – This was the final assignment of the univariate data unit. Students had their choice of data to analyze.

Analyzing cars – This assignment followed the class data investigation and included the opportunity for students to revise their work following feedback.

Fast food nutrition, 1919 World Series – This was the final assignment of the bivariate data unit. Students had their choice of data to analyze.

I will leave the question of whether these examples represent quality work to you, the reader. I hope you will let me know what you think.



Filed under BMTN, teaching

Claim, Evidence, Reasoning: About the data

In the last post, I shared the general process that I developed to teach statistics through the lens of Claim, Evidence, Reasoning. This process was tested and refined through several iterations. The data that I chose for these assignments & iterations was critical to student engagement and learning.

How do I know what kind of data is going to be interesting to students? Well, I ask them. I’ve been asking them for a lot of years. Not every data set is going to be interesting to every student, but overall I have been able to identify and collect pretty good data sets.

In the spring term I used these data sets (and the associated class-devised claims):

  • Minutes between blast times for Old Faithful (Claim: The time between blasts will be 90 minutes plus or minus 20 minutes.)
  • Ages of Best Actress and Best Actor Oscar winners (Claim: The ages of the Best Actress Oscar winners are typically less than the ages of the Best Actor Oscar winners.)
  • Box office (opening weekend, domestic, worldwide), critics & audience ratings for “original” movies and their sequels (Claim: Original movies are better than sequels.)
  • Juvenile detention/incarceration rates for various types of crimes by sex and race (Claim: African-American males are incarcerated at a higher rate than any other subgroup.)
  • Education level and unemployment rates (Claim: People with a higher level of education have lower unemployment rates.)
  • Sales of vinyl records and digital album downloads (Claim: Sales of vinyl records will soon overtake digital album downloads.)
  • Class measurements such as height, arm span, kneeling height, forearm length, hand span, etc. (Claim: Human body measurements are related in a predictable way.)
  • Car data including curb weight, highway mpg, fuel type, and engine size (Claim: Highway mpg depends the most on fuel type.)
  • Fast food burger nutrition including calories, fat, protein, carbohydrates, etc. (Claim: Fast food burgers are unhealthy.)
  • Baseball data from the 1919 Chicago White Sox (Claim: The evidence supports the decisions made about the accused players in the 1919 World Series.)

Even with all of these options, students added their own:

  • Skateboarding data including ages and birthplaces of known skaters and number of skate parks in a state (Claim: Professional skateboarders are most likely to come from California.)
  • Olympic male swimming data (Claim: Michael Phelps is the best Olympic swimmer of all time.)

What’s important about all of these data sets?

They all provide multiple variables and opportunities for comparison. They offer students multiple ways to investigate the claims. They allow students to create different representations to support their reasoning. So, the lesson here is that the data sets used must be robust enough for students to really dig into.
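
As one concrete illustration, here is a minimal sketch of checking the Old Faithful claim above against a data set, using made-up interval values rather than the actual data:

```python
# Illustrative only: made-up interval values, not the actual Old Faithful data.
import numpy as np

intervals = np.array([76, 88, 95, 102, 61, 93, 84, 110, 97, 79])  # minutes between blasts

within_claim = (intervals >= 70) & (intervals <= 110)  # 90 minutes plus or minus 20

print(f"median interval: {np.median(intervals):.0f} minutes")
print(f"fraction of intervals within 90 ± 20 minutes: {within_claim.mean():.0%}")
```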

Imagine what could happen if the course were integrated with science or social studies.

Next post: The results


Filed under BMTN, teaching

Claim, Evidence, Reasoning: Starting fresh

I’m not an AP Stats teacher. I did that once. What’s important to me is that all students who graduate from high school have an opportunity to think and reason about real data in a deep and meaningful way. AP Stats is typically reserved for a few juniors or seniors. Maybe they don’t want to do AP Calc, or maybe they’ve already done it. Both of these reasons are unacceptable to me. Data literacy needs to have a higher profile – it needs to be more important than being able to simplify rational expressions. Our students need to be able to reason about data that’s presented to them in the press, or on social media, or by our elected officials. That’s my personal crusade.

Since last December I’ve been on this journey to improve my statistics teaching and my students’ learning. I shared my catastrophic first attempt and the progress made with that group. One of the beautiful things about our trimester schedule is that it allows me to immediately apply new learning to a new group – as long as I am teaching a new section of the same course, I don’t have to wait a whole year to apply what I’ve learned. Luckily, that was the case this year. So, in late March I was able to begin anew, armed with what I had learned during the previous term.

My spring term class was also a small group, but quite different from the winter class. More than 50% of the students in this new class struggled with writing. Since the focus of our work would be “claim, evidence, reasoning,” I would have to find alternative ways for these students to share their learning and their arguments. I wrote my new PDSA form and jumped in, hoping that I had learned enough from the winter term to be somewhat successful this time. (For information about PDSA cycles, see here and here.)

In general, I used this process to introduce concepts:

  • Tell students about the data, usually on paper and verbally, and give them time to make predictions about what they expect from the data. Students do not have access to the data yet. Have some discussion about those predictions. Write them on the board (or some other medium). These predictions become claims to investigate.
  • Give students access to the data & some graphical representations, usually on paper, and have them think about how the data might or might not support the claims that they made. Then ask them to discuss the data with a partner and determine whether or not the data support their claim.
  • Ask them to write a statement about whether or not the data support the claim and why. The why is important – it’s the evidence and the reasoning piece of the “claim, evidence, reasoning” approach.
  • Collect students’ statements, collate them into one document, then have students assess the statements according to the rubric. The focus here is on formulating an argument, not on calculating statistics or representing data. That comes later.

I completed this cycle twice, with two sets of data: minutes between blast times for Old Faithful and ages of winners of Best Actor and Best Actress Oscars.

These are the scaffolds that I provided for the first couple of steps with the Oscar data: predictions & analysis. Remember, the objective at this point is to make an argument, not to calculate statistics or create representations. Taking that piece out of the mix allowed students to focus on finding evidence and formulating reasoning for the claim that we had produced as a class. The next step was to collectively look at the statements that the students produced and assess where they fell on the rubric. This was the second time that we reviewed student work against the rubric. All of this introduction was treated as formative, so although the assignment (and whether or not it was completed) went into the grade book, no grade was attached.

The process for practicing was similar, but included less scaffolding and did not include the step of reviewing student statements. It generally went like this:

  • Tell students about the data, usually on paper and verbally, and give them time to make predictions about what they expect from the data. Students do not have access to the data yet. These predictions become claims to investigate.
  • Give students access to the data, generally in digital form, and a template to help them organize their thinking.
  • Have students calculate statistics and create representations to provide evidence to support or refute their claims.
  • Have students paste their representations into the template and write a statement or paragraph explaining the evidence (this is the reasoning step).

I did this cycle twice for our unit on univariate data: once using data about movies and their sequels and again using a variety of data from which students could choose. By the 4th cycle, this is what the assignment directions and template looked like. This was the end-of-unit assignment for the spring term.
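
To make the “calculate statistics and create representations” step concrete, here is a minimal sketch of the kind of evidence a student might produce for the originals-versus-sequels claim, using hypothetical audience ratings rather than the actual movie data set:

```python
# Illustrative only: hypothetical audience ratings, not the actual movie data set.
import numpy as np
import matplotlib.pyplot as plt

originals = np.array([88, 92, 75, 81, 94, 70, 86])  # audience rating, %
sequels   = np.array([64, 85, 52, 77, 90, 48, 60])  # audience rating, %

# Summary statistics as evidence for or against the claim.
for label, ratings in (("Originals", originals), ("Sequels", sequels)):
    iqr = np.percentile(ratings, 75) - np.percentile(ratings, 25)
    print(f"{label}: median {np.median(ratings):.0f}, IQR {iqr:.0f}")

# One representation a student could paste into the template: side-by-side boxplots.
plt.boxplot([originals, sequels])
plt.xticks([1, 2], ["Originals", "Sequels"])
plt.ylabel("Audience rating (%)")
plt.savefig("originals_vs_sequels.png")
```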

At the beginning of this post I mentioned that more than 50% of this particular class had been identified as having difficulties with writing. So, what did I do? I pushed them to write something – at least one statement (or, in some cases, two) – and then offered to let them talk through their evidence and reasoning with me. I knew that there was good reasoning happening, and I wasn’t assessing their writing anyway. So, why not make the necessary accommodations?

Next post: The importance of data choices.


Filed under BMTN, teaching