If you’re just tuning in, consider checking out the first post in the series, or the most recent post.

We’re drifting into a little section of my Algebra 1 curriculum that I’m at least a little bit ashamed of. I have no one to blame but myself since I put the textbook on the shelf and created lessons, practice, and assessments from scratch. Big plans for improvement in the months ahead, but for now, warts and all…

Here are the two-and-only questions from Form A of my old Topic 4 assessment:

Oh, the shame! I’ll talk more about the gap between this assessment and all-that-is-decent-in-this-world in a moment. For now I’ll just remark that what these questions *actually* demand of students is so far below what I originally intended that they are essentially useless as an assessment tool in Algebra 1.

The first major flaw in the original assessment is that the questions appear out of nowhere and drift away from our attention just as suddenly. So I replaced these two unrelated questions with a sequence of five questions related to a single scenario. Here are the questions from the new Form A:

My initial intention was for students to use expressions and/or equations to answer the questions on the assessment. On the original version, almost none of my students approached the problems algebraically. Many were able to answer the questions (and many were not), but nearly everyone who answered correctly did so with nothing more than some numerical tinkering.

While I’m not opposed to numerical tinkering (quite the contrary; I think it’s a fantastic practice for students), in this class and on this assessment I was hoping to see whether they could write an expression to model a situation and use the expression to answer another question or two in an efficient manner.
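To make that concrete with a made-up example (the scenario, numbers, and function name here are my own stand-ins, not the assessment's):

```latex
% Hypothetical scenario: a gym charges a $25 sign-up fee plus $4 per visit.
C(n) = 25 + 4n               % total cost after n visits
C(12) = 25 + 4(12) = 73      % using the model to answer "what do 12 visits cost?"
```

Once the expression is written, follow-up questions become one-line substitutions instead of fresh rounds of tinkering.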

With the original assessment, this was a lost cause. With the updated version (particularly #3-5) I was able to measure at least part of what I set out to measure.

While my current assessment is an improvement over the first version, it still strikes me as terribly inadequate. Here we are, at the end of a unit on linear modeling, and there’s a massive void when it comes to two hugely important things: (1) At no point is any connection made between the verbal/numerical/algebraic representations and a graphical one (and it would be so easy to fix this!), and (2) The scenario is decidedly boring and contrived.

I have ideas for how to address #1, but am at a loss for how to remedy #2 in the space of a single-page assessment. More to think about for the next round of revisions.

One additional minor/medium flaw I see in the updated version is this: At no point do I ask students to explain their reasoning, justify their thinking, etc. (And word on the street is that those are cool things to do.)

Until next time…


The first post in the series is here. The previous post (Topic 2, Part 2) is here.

When I first drew up this assessment, my goals were to evaluate students’ ability to simplify linear expressions and solve linear equations. Here’s what the two questions of Form A looked like:

I had the same all-my-eggs-in-one-basket problem with this original Topic 3 assessment as I did in an earlier assessment. If students aced these two questions, I knew they were capable of what they ought to be able to do. However, if they missed one or both, I was stuck without much information. There was no gradation in the all-or-nothing results.

Another issue: the assessment focused entirely on procedural skills and demanded nothing from students in terms of demonstrating deeper conceptual understanding.

As was the case with Topic 2 (detailed in posts here and here), I addressed the above concerns by lengthening the assessment quite a bit. The revised Topic 3 assessment weighs in at two pages and a total of ten questions.

In the first three questions, I try to get a read on whether students understand conceptually what a solution of an equation is. (For the record, what I’m looking for is something along the lines of “this value does/does not satisfy the equation,” along with numerical support—via substitution—of that claim.)
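As a made-up illustration of the kind of response I’m after (not an actual assessment item):

```latex
% Is x = 4 a solution of 3x - 5 = 7?
3(4) - 5 = 12 - 5 = 7 \quad\checkmark
% The value x = 4 satisfies the equation, so it is a solution.
% By contrast, x = 2 gives 3(2) - 5 = 1 \neq 7, so x = 2 is not a solution.
```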

After that, students move through a series of four increasingly difficult linear equations, giving me the leveled progression the original assessment lacked and helping me distinguish the “almost there” from the “completely lost.”
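For a sense of what such a progression might look like (illustrative stand-ins of my own, not the actual items):

```latex
x + 5 = 12                            % one-step
3x - 4 = 11                           % two-step
2(x + 3) = 5x - 9                     % distribution, variables on both sides
\frac{x}{2} - 1 = \frac{x + 4}{3}     % fractional coefficients
```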

Next up, an error-analysis/explain-your-reasoning style question:

And to close, two more “solve” questions (including one at the same level of difficulty as the original Topic 3 assessment):

The net result of these changes is a much stronger assessment, with improvements in at least two categories. The new assessment (1) provides me with more specific insight about student strengths and weaknesses, and (2) demands more of students in the way of critical thinking and clear communication.

I fully expect that this new assessment could be improved in half a dozen ways. Part of the beauty of teaching (and writing many of my own lessons and all of my own assessments) is the opportunity for continual improvement over the years. This job will never leave me bored!

Is there anything in particular you liked about the improvements I already made to my Topic 3 assessment? Do you have a few more ideas for making it even better? Share away!


It all started here. In the last post, I looked at additive and multiplicative inverses. Onward!

The second half of my original Topic 2 assessment assessed whether students were able to evaluate expressions involving integers and various operations (including radicals, rational exponents, and a few other things). My original approach included a single question, with everything all smashed together:

For those who were able to evaluate the expression correctly, I got precisely the information I needed (“Johnny can do this, that, and the other thing.”). But for those who answered the question incorrectly… Was it because they were lost on everything? Or because they struggled with one skill in particular? While a close look at their work would often reveal the answer to that latter question, I find that I’ve stripped one of the benefits of SBG (specific insight into specific strengths and weaknesses) right out of the question.
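By way of illustration (my own stand-in, not the actual item), the original question was in the spirit of one everything-at-once expression like:

```latex
(-2)^{3} + 8^{2/3} \cdot \sqrt{49} - \sqrt{9}\cdot\sqrt{4}
  = -8 + 4 \cdot 7 - 3 \cdot 2
  = -8 + 28 - 6
  = 14
```

One wrong value anywhere in that chain sinks the whole answer, with no record of which skill failed.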

To address that weakness, I bumped this section of the assessment up from a single question to several (three, in fact):

I lose a minute or two more of class time to administer the assessment, though I gain a quick and clear sense of who’s struggling with exponentiation, rational exponents, and simplifying expressions involving multiple radicals. Note that while grouping symbols are entirely absent from #9 above, they make an appearance in some of the other assessment forms, including this one:

Even with this more discrete-ified set of questions—which I view as an improvement over the original—I still feel like this assessment is short on critical thinking and “explaining your reasoning.” A nice quick addition might be to present students with an expression (similar to #9 above) with two (or three) incorrect step-by-step approaches (each of which has exactly *one* error). Ask the students to identify the error in each approach and then show their own (100% correct) step-by-step solution. Here’s a quick mockup:

I’ve now written four of these “better-assessments-in-sixty-seconds” posts. Since I’ve taken two posts to address each topic (the content fell rather naturally into four categories, rather than only two), I might want to consider breaking these apart for the purpose of grade book entries. I might even leave the assessment handout itself unchanged, but the idea of more refined grade book categories for tracking student mastery certainly has its appeal.

Thoughts on that last thought? Comments on something else? You know what to do.

Cheers!

I’m terrible at coming up with imaginary student names for my handouts. So I often use my students’ names or my kids’ names (I have lots to choose from in this second category, now!). Today I borrowed some names from a list of fictional butlers. Oh, I also have a preference for names to follow an A, B, C, etc., pattern.


Last time in this quick-look-at-improving-assessment series (which began here) I shared my attempt at improving the questions related to distribution on an Algebra 1 assessment. As always, you can check out the topic list here (or here, if you want “I can…” statements as well).

This time we’ll take a look at a series of questions related to operations on numbers. Here’s the rubbish version (from the original Form A):

I was trying to get a read on whether students understood what additive and multiplicative inverses are. For reasons similar to those shared in the first post in the series, this question type wasn’t particularly effective. Also, there’s the issue of “What am I actually trying to accomplish with these questions?” I don’t think I had that settled in my mind when I wrote the original assessment, and that led to the lackluster questions shown above.

If this assessment was going to improve at all, I first needed to nail down what I wanted to accomplish. Then I needed to work on better ways to ask questions (even just spicing up the originals with “explain your reasoning” or “defend your answer” would have been a nice start).

At any rate, I decided on three goals, so I wrote three mini-sections of the assessment. Here they are:

And here’s how I attempt to measure that on the new-and-hopefully-improved assessment:

Simple, but to the point. On to the next one…

Here’s how I tried to assess that skill:

I decided that this was actually the main reason we were exploring additive and multiplicative inverses in the first place, so a rather direct assessment question seemed appropriate. On to the third goal related to inverses…
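A quick sketch of what I mean by that “main reason” (my own example, not the assessment question): solving a linear equation is essentially two applications of inverses.

```latex
2x + 7 = 19
2x + 7 + (-7) = 19 + (-7)             % add the additive inverse of 7
2x = 12
\tfrac{1}{2}(2x) = \tfrac{1}{2}(12)   % multiply by the multiplicative inverse of 2
x = 6
```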

The content isn’t profound or complex, so I thought it might provide a nice opportunity for students to create their first “mathematical” argument, one with complete sentences and mathematical “evidence.” With these two questions, I’m really trying to pave the way for more complex arguments students will make in Geometry, Algebra 2, Precalculus, and Calculus.

Now that I’ve written three of these posts, I’m wondering if I should add student work. I don’t have anything for the original versions, but for some of the revamped assessments I took pictures of strong and weak responses in order to facilitate in-class discussions the following day. If I can dig those images up, would they be worth posting? Share your thoughts (on this last question, or in general) below.

Cheers!


Two posts and this is officially a series, right? Off we go!

Let’s spend a little more time (how about 60 seconds?) looking at the second half of that old, filthy Topic 1 assessment:

That’s it. One measly simplification question. Nothing inherently wrong with the question itself, but…

With the updated Topic 1 assessment, I wanted to address two potential weaknesses in the approach I took on the original. First, I felt like one question wasn’t enough to get a sense of student mastery of distribution. For students who have mastered this, each question takes between 5 and 20 seconds, so time wasn’t an issue. With that in mind, I added two more “standard” distribution questions and scaled their difficulty:
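To give a flavor of the scaling (these are stand-ins of my own, not the actual Form A items):

```latex
3(x + 4) = 3x + 12                     % positive integer coefficient
-2(5x - 7) = -10x + 14                 % negative coefficient, subtraction inside
\tfrac{1}{2}(6x - 8) - 3(x + 1) = -7   % fraction, plus combining like terms
```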

(If the “compact form” comment doesn’t make sense *and* you wish it did—rare combination, possibly—let me know in the comments.)

Next, I wanted to add a question that required students to think outside the box, even if just a little, and at the same time would provide them with an opportunity to explain their thinking. It’s not terribly profound, but since we didn’t even hint at factoring during the lessons leading up to this assessment, I think this rather bland factoring problem, when appearing in this context, actually demands some critical thinking on the students’ part. Enough yammering, here’s the question:

Thoughts? Comments? Questions? Outrage? Share any and all in the comments!

The “original” revamped version of the Topic 1 assessment only included 7 questions. I added #8 just to spice up the blog post. Actually, in thinking about writing the post, I had an idea for how to—potentially—make this assessment even less terrible for next year. So it’s sort of a lie, and sort of not, with an emphasis on the latter, at least in/from the future. Errrr… Time travel. Brain hurts. 12:12 am. Time for bed.


Here’s an idea: I’ll write a post. It will take me a few minutes, or more.

Next, you’ll read the post. It’ll only take you a minute. It’ll be about assessment. Specifically, me describing how I took a terrible assessment question and made it less terrible.

Ready? Here we go!

In my previous post I linked to the Algebra 1 SBG assessments I wrote in 2011-2012. Largely, they stink. Here’s an example of a terrible question from the Topic 1 assessment (full list-o-topics is here):

That was Form A, and I’ve created about a million forms (okay, more like 5-10 forms) for each assessment (in every class, though, so the total really is pretty close to a million). Here’s a similar question from Form B:

I was trying to write questions that assess whether my students understand the commutative, associative, and distributive properties. In particular, I wanted to see if they could name the properties based on an algebraic or numerical example. I was also hopeful that they knew which operations are commutative and associative (and which are not).

Well, what I ended up with in my first attempt were some miserable true/false questions that don’t really accomplish any of what I was hoping for. An especially unfortunate consequence of the way I wrote the questions was that students who might otherwise have explained their reasoning quickly learned that this problem demanded no such thing. A one-word answer for each part is all that was called for. Worse yet, because I failed to require any record of thinking on the page, the majority of my students resolved to do no thinking at all in their minds. It became a guessing game, and one that they’re not particularly skilled at.

This year I’ve set about rewriting my Algebra 1 assessments. They’re not perfect, and I’ll probably want to run them through a revision cycle again next year (and the year after, and so on forever), but there are a few questions here and there that strike me as significant improvements over their original counterparts.

Here’s what the corresponding question looks like on the current Topic 1 assessment:

Just below #2 I ask whether multiplication is associative (#3) and whether division is associative (#4). It’s immediately more demanding, doesn’t let students off the hook, doesn’t tempt them to do less thinking than they might naturally do, and gives me a fairly clear sense of whether students know which property is which, and which operations are commutative/associative.
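For instance (my own examples, not the assessment’s), a supporting example for #3 and a counterexample for #4 might look like:

```latex
(2 \cdot 3) \cdot 4 = 24 = 2 \cdot (3 \cdot 4)        % multiplication is associative
(12 \div 6) \div 2 = 1 \neq 4 = 12 \div (6 \div 2)    % division is not
```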

In writing this up, the only immediate change that I’d like to make to the question is to throw a third sentence in between the two already there. “Explain.” It’s implied, but it would be nice to state it explicitly. So then I would have something like, “Is addition commutative? Explain. Support your answer/explanation with an example.”

So, was that terribly longer than a minute? Or was it simply a terrible minute? Let me know what you think about this feature in general and/or this question comparison in particular in the comments.

Cheers!


In August 2011, inspired by this post from Dan Meyer, I introduced SBG into my Algebra 1 class. There were some bumps along the way, but overall I was thrilled with the change. I loved being able to answer student and parent questions along the lines of, “What can Johnny do to improve his grade?” with specific comments about strengths, weaknesses, and areas to attack. (Of course, the improve-your-grade-by-improving-your-understanding-and-then-demonstrating-this-improved-understanding approach is already a huge upgrade over the days of old when kids—and sometimes parents—would ask/beg for extra credit. But I digress…)

I’m somewhat embarrassed by the quality of the assessments I wrote then, but it was a start. For anyone brave enough to look at that first wave of SBG assessments I wrote, enter at your own risk. (BTW, I’ve abandoned my favorite app in the history of the world—Dropbox—for the new kid on the block—Copy—because of the oodles and oodles of free space, and the seemingly comparable features to Dropbox. After a few referrals I’m at 80GB of free storage. If you’re interested in signing up—15GB + 5GB for using the referral link—go here.)

This year I’ve decided that my assessments need a major improvement in quality, so I’ve set about to accomplish as much of that as I’m capable of (while still creating resources and preparing for all of my other classes). The I’m-totally-not-finished-yet-but-have-a-look-if-you-want second wave of Algebra 1 SBG assessments live here.

I’ll continue writing new assessments for those 24 topics—the same topics I sketched out in August 2011—through the rest of this semester. Originally I just had a list of topics; since then I’ve added “I can…” statements to the course outline. However, I’ll scrap both of those in the next year or two (even with the “new-and-improved” assessments I’m writing this year) as we transition to an integrated Common Core sequence. I expect some of the topics, lessons, and assessments will slide into our new Grade 8 course, while others will make their way into the high school’s Math 1 course. Nevertheless, one major and lasting benefit I’ll carry forward from this Algebra 1 SBG experience is (I hope) a better understanding of how to write decent assessments. At the very least, I’m learning to identify what I don’t like about many of the assessments I’ve written.

That’s all for now. I’ll write again soon about “Phase 1.5,” my experience applying SBG to AP Calculus AB midway through the 2012-2013 school year.


CCSSM Grade 7 Concepts and Skills List

Earlier this year I described my schedule, assumptions, goals, and game plan as they all relate to my school’s transition to CCSSM. Here’s an update on that process.

I haven’t spent as much time on this task as I originally intended in the first twelve weeks of school. I originally thought about 95% of my prep time would be devoted to the CCSSM transition project. Instead, about 60% of that time has been dedicated to helping teachers integrate technology into their classrooms in meaningful ways.

The emphasis has been on **students** using technology to **create** and **collaborate**, rather than students watching teachers use technology, or students using technology simply to receive/consume content.

It’s been an enjoyable experience, one in which I’ve learned a lot as we make mid-stream adjustments to our in-house technology training plan. We recently put together a “Technology Leadership Team” with teachers from various grade levels and disciplines throughout our K-12 school. With this team in place, I have an opportunity to shift a bit more of my attention back to the CCSSM transition.

My original goal was to transition our entire 7-12 math program to Common Core next year. After some wise words from the other member of the math department, and an okay from our superintendent, we’ve decided to transition middle school in 2014-2015 and high school in 2015-2016.

It’s only a tiny piece of the whole puzzle, but this week I finished writing a CCSSM Grade 7 Concepts and Skills List.

If you open up the document, you’ll find a number of comments I’ve written in the margin describing my rationale for certain things, my uncertainty about others, and my game plan for (in many cases) weaving content throughout the course (rather than forcing a set of topics—e.g., geometry—to stay confined to a particular time period).

I hope you’ll add your own comments and questions. Let me know if you want a duplicate copy of the document so you can put your own spin on sequence, emphasis, etc.

In the next couple of weeks I plan to do the same thing for our CCSSM Grade 8 course. At that point I’ll turn my attention to writing brief, SBG-style assessments for each topic on the lists.

Further down the to-do list: Performance assessments for the big ideas in each course (we’ll probably start with one per quarter) and rich anchor tasks/lessons for each unit.

Much further down the to-do list: Individual lessons/activities/tasks/practice to fill out each unit.

And beyond that… The same process for our high school courses.


Simple slide stealing in three flavors: Keynote, PDF, PowerPoint.

Attended the workshop? Let me know what you thought.

**Estimation 180**

A great resource for developing students’ number sense, estimation skills, unit sense, and ability to explain their reasoning in concise, specific ways.

**Statistics Learning Centre**

http://learnandteachstatistics.wordpress.com/

A blog all about teaching and learning statistics from ~~Middle Earth~~ New Zealand.

**Illustrative Mathematics**

http://www.illustrativemathematics.org/

A free online source of rich tasks illustrating the Common Core mathematics standards.

**Progressions (Tools for the Common Core)**

General website: http://commoncoretools.me/

Progressions category: http://commoncoretools.me/category/progressions/ or http://ime.math.arizona.edu/progressions/

HS Statistics and Probability document: Here

**Emergent Math’s PrBL Curriculum Maps**

http://emergentmath.com/my-problem-based-curriculum-maps/

“Geoff Krall Combs The Internet For Lesson Plans So You Don’t Have To”

Join Twitter, follow some of these people, and check out their blogs.


Here they are:

P.S. The links and comments are worth their weight in gold.
