Welcome to the Explore the MTBoS 2017 Blogging Initiative! With the start of a new year, there is no better time to start a new blog! For those of you who have blogs, it is also the perfect time to get inspired to write again! Please join us to participate in this year's blogging initiative! […]
My advisory students are now seniors. We started together four years ago, along with the school. It’s a humbling journey to spend four years with the same group of students, helping them navigate through high school, getting them ready for whatever adventure follows.
We do a lot of work in advisory – research about “Life after Baxter,” prepping for student-led conferences, creating and maintaining digital portfolios, keeping track of academic progress, and completing any required paperwork, for starters. Even though we meet three times a week for about 35 minutes each time, we still have some “down” time.
There’s lots of talk out there, and especially in New England, about standards-based education. Whatever you think about standards-based, or proficiency-based, or competency-based education (they are all the same to me – just using some different words), the bottom line is that we teachers are now supposed to be able to certify that, regardless of any other factors beyond our control, our students are able to _________. Fill in the blank with your skill or habit of choice. This is tricky business. The tricky part is
- not to distill learning into a checklist of discrete items that have no connection to each other.
- to maintain a cohesive, robust curriculum with a clear scope and sequence.
- to develop cross-curricular, integrated courses that give students rich opportunities to build those skills.
- to build an assessment system that students, teachers, and parents have a common understanding of.
My school has put a lot of energy into creating a standards-based assessment (and reporting) system. Since we are still a new school, there is nothing to change except our own perceptions. We started out using the old 1-2-3-4 system, but ran into trouble with different interpretations of what those numbers represented and what levels students could actually achieve. Some teachers maintained that standards in a course were global, so there was little chance for a 9th grader to demonstrate at a level higher than a 2. Other teachers defined course standards as local, so that students could earn a 3 or even a 4 on the standards within that class. Clearly, this was a problem.
The other problem is that any time grades are represented using numbers, people want to operate with them, or break them down further (using 2.3, for example). But those numbers represent discrete categories of performance or understanding. A 2.3 doesn’t make any sense if it isn’t defined. So we had to create a brand new system.
Each reporting standard – those big things like Algebra & Functions – has indicators that are connected to each level on the big scale toward graduation benchmarks. These are defined in a rubric. For any given course, we identify what the “target” knowledge & skills are, that is, which level of the rubric we are targeting. For example, in the Modeling in Math class, the target level is Entering.
During a course, we report if a student is “below target,” “on target,” or “above target” for an assessment on a particular indicator of a reporting standard. This way a student can be “on target” – meaning that the student is making solid progress and is doing what is expected in the course – but still not be at the graduation benchmark for that standard. After all, Modeling in Math is the first course that our 9th graders take. It’s unlikely that they will meet the graduation benchmark after just this one twelve-week class.
Report cards and transcripts report the big picture status toward graduation. So that 9th grader who was “on target” during the class has made progress toward graduation, but still has work to do to meet that benchmark. And that work could happen in a series of courses or through some combination of courses and portfolio, giving the student control over her education.
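For readers who like to see the mechanics, the target-relative reporting described above can be sketched as a small lookup. This is only an illustration: the rubric level names beyond “Entering” and all the function and variable names here are hypothetical, not our school’s actual system.

```python
# A minimal sketch of target-relative reporting, assuming an ordered rubric.
# "Entering" as the Modeling in Math target comes from the post; the other
# level names and the data layout are invented for illustration.

RUBRIC_LEVELS = ["Entering", "Developing", "Advancing", "Benchmark"]  # hypothetical ordering

COURSE_TARGETS = {
    "Modeling in Math": "Entering",  # first 9th-grade course targets the lowest level
}

def report_status(course: str, demonstrated_level: str) -> str:
    """Compare a demonstrated rubric level to the course's target level."""
    target = RUBRIC_LEVELS.index(COURSE_TARGETS[course])
    shown = RUBRIC_LEVELS.index(demonstrated_level)
    if shown < target:
        return "below target"
    if shown == target:
        return "on target"
    return "above target"

print(report_status("Modeling in Math", "Entering"))    # on target
print(report_status("Modeling in Math", "Developing"))  # above target
```

The point of the design is visible in the code: “on target” is defined relative to the course, while progress toward graduation is read off the absolute rubric level, so the two can legitimately disagree.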
WordPress tells me that I created this draft 3 months ago. I had every intention of updating along the journey of my Lesson Close adventure. Alas, that didn’t happen. Here’s what did happen …
I found it very difficult to decide, in the moment, which survey to send to students. So, I edited the survey to allow the students to choose what they wanted to tell me about the class – what they learned, how they learned it. I used the same survey structure as before, but this time students made a choice. I honestly thought that given a choice of what to reflect on, students would engage more. Wrong.
I asked them what happened: Too many choices, completing it electronically was too much of a hassle, there wasn’t enough time at the end of class to complete it.
Enter version 1.2: paper option, fewer choices, a few extra minutes. Still didn’t work. So I asked again: Still too many choices, still not enough time. One student said, “Even though the posting reminder came up with 5 minutes to go, our conversations about the math were so engaging that we didn’t want to stop to do a survey.” Another said, “The first question was fine, but I really didn’t want to take the time to write stuff for the second question.” This was the general sentiment.
When I reflected on this sequence of events with my colleagues at the Better Math Teaching Network, one teacher (who also has several years of teaching experience) said, “I feel like exit slips are just data for someone else who isn’t in my classroom. I know what my kids know and what they don’t know because I talk with them.” And I thought, she’s absolutely right. Here I was, trying to do something with exit polls – trying to get my students to reflect on the class, to be meta-cognitive about their learning. They were telling me through their actions and class engagement that they were learning just fine, thank you.
I have lots of formative assessment strategies, but this is the last time that I try to implement exit slips for the sake of implementing exit slips. I know what my kids know because I talk to them.
Hey everyone! I just felt like I should email all of you to say hi, and to assure you that I haven’t forgotten Baxter, and to remind myself to stay in touch, and to tell you that [college] is great and almost everything is going really well! I’m actually kind of tearing up writing this, which surprised me since that’s not something that happens to me very often. I’ll try to remember to stop by the school if it’s in session while I’m visiting home!
Teaching is about building relationships with our students.
I tried my #LessonClose a couple of times this week in my 3D geometry class. The first time I used the Collaboration poll and the second time I used a new Learning from Mistakes poll. Both polls were given while working on the question: “Which Platonic solids can tessellate space?” It was clear that cubes would work, but there was some disagreement about the tetrahedron.
Student comments about how they collaborated included:
- I was part of the conversation when we were brainstorming answers
- I participated but did not lead
- I argued about shapes that would or wouldn’t work
- Everyone’s voice was heard
Student comments about Learning from Mistakes included:
- I assumed an angle stayed the same when rotated on the plane. This turned out to be false, and I had to later revise my answer.
- I forgot Trig, and I may have messed up a little bit with the answer of #1
- A few times, we started to follow a faulty train of logic, based on angle assumptions, that messed us up until we figured it out.
- I wasn’t exactly wrong, just wasn’t very sure. My group had made a prediction that was actually true.
I’m finding it difficult to decide on the specific poll to give. I might create a new poll that lets the student select which aspect they would like to give me some information about. This is the heart of the PDSA cycle – learning quickly and making changes to improve.
Earlier this week, Dan Meyer wrote this post about explanations and mathematical zombies using z-scores as an example. In the comments I shared an activity that I’ve used, one that I posted about last year. It so happened that it was time for that activity again this week. In both classes, students were able to develop the reasoning behind the formula through discussion. One student even described what we were doing as a transformation to the standard normal distribution. Never once did we write a formula.
Once again my Flex Friday work connects me with students who are new to Baxter Academy. This year we are teaching them skills through workshops. This Friday’s morning workshop employed design thinking: asking and answering “How Might We … ” (HMW) questions. (For more about this approach check out The Teachers Guild or the d.school at Stanford.) Once you have a bunch of HMW questions, you attempt to answer each one in as many ways as you can think of. My colleague called this “thinking around the question” and illustrated it with the question, “What’s half of thirteen (or 13)?” And here’s what we came up with.
I’ve received a couple of requests for some larger images from the last post on Lesson Closure. Here’s my attempt at providing them.
First, the process map.
A few exit poll examples.
I have a few other exit polls, but you get the idea. One question to rate the day and one question to elaborate a little bit.