# Transforming a Math Program, Pt. 3: Assessing what we value

“Assess what you value; value what you assess” – Grant Wiggins

When we last left off, I had set out to institutionalize problem-based learning (PrBL) in my charter network’s math program, with the goal of more deliberately fostering students’ conceptual understanding. Of the many ways this could be done, I started with the easiest approach, the one with the smallest ripple effect: I selected a few rich math problems and worked them into our existing project-based learning structure. As explained in my previous post, this approach seemed to be an improvement, but it wasn’t sufficient.

One reason was precisely that the change had a small ripple effect. Specifically, I was trying to change what we valued – placing greater emphasis on conceptual understanding – without changing what we assessed. As a result, our efforts to implement rich math tasks that develop students’ conceptual understanding worked against our other efforts to backwards-plan curriculum and instruction from assessment. Adjusting curriculum and instruction without adjusting assessment was like changing the lengths of only two of a stool’s three legs. It brought to mind a corollary to the Grant Wiggins quote above: if you don’t assess it, you don’t value it.

Before going further, I should outline our summative assessment and grading system at the time. Every course, regardless of grade level or discipline, used the same overall grading system, built on two components: cognitive skills and content. Both require some explanation.

Our network identified 36 interdisciplinary cognitive skills based on the CCSS for Math and ELA, the NGSS, and the C3 social studies standards. Attached to each cognitive skill was a longitudinal 8-level rubric. Every project served as an assessment of one or more of these cognitive skills: students’ final products were scored by placing each student at one of the 8 levels on the applicable skills. The same rubric level translated to different grades depending on the student’s grade level; for example, a 6th grader needed a level 3 to earn an A, while a 10th grader needed a level 6.

Each course also had its own content, broken into roughly 8 to 15 “focus areas.” In math courses, these consisted of what most teachers would call rote, procedural skills. Students had to pass a 10-question, computer-scored test for each focus area; because the questions were pulled from a pool, students could take a test multiple times. Students learned this content mainly during their “personalized learning time,” largely through playlists: curated lists of online resources.

Recall that the Common Core describes rigor in terms of three aspects: procedural fluency, conceptual understanding, and application. Our content system was useful for assessing procedural fluency, and our projects were useful for assessing application; but we weren’t explicitly assessing conceptual understanding. And at our schools, where teachers plan their lessons, formative assessments, and feedback with a deliberate focus on the summative assessment criteria, efforts to weave PrBL for conceptual understanding into projects created confusing incentives for both teachers and students.

The most logical solution was to add a third component to our assessment system: concepts. Changing the assessment would encourage instruction and curriculum aimed at conceptual understanding to take hold. This change also gave rise to a new structure for teaching concepts: problem-based concept units. No longer would PrBL need to be embedded within a project, with all the shortcomings that entailed. Here are some visuals of how our program changed:

In my first post in this series, I described some of the problems I set out to address: disjointed treatment of math concepts, a lack of mathematical coherence in the curriculum, inefficient use of instructional minutes, math topics that didn’t have natural real-world applications, to name a few. I believe the changes described in this post set in place the conditions necessary to address those challenges.

Next up, the implementation.