It’s in the Numbers… or is it? #sol22
The last two and a half weeks of school for me have been taken over by our online benchmark assessments in literacy and math. From kindergarten through fourth grade, we logged students into a platform-based test in these curriculum areas and also measured their progress with discrete tests in letter sounds, phoneme segmentation, oral reading, and nonsense word recognition. Some of these students had spent the week before taking our state tests, also online. While the administration of the test, and most likely the taking of it by our students, is a grueling gauntlet filled with technology potholes and attention-draining, language-challenging obstacles, it's looking at the numbers that gives me the most pause.
After the assessments, we dig into the numbers. The latest 'magic' number is SGP, student growth percentile, the latest 'accountability' measure. The short version: can we say that a student has made a year's growth (or even more) in a year? Additionally, can we evaluate the program and the teacher with these measures? Those are big shoes to fill. This assumes that the assessment measures what we want the student to learn, that the student understands what they are being asked, and that the conditions in which they take the assessment give us the optimal picture of these results. Those are big asks.
All things considered, we arrive at these objective numbers: the student's score, the grade level equivalent, the percentile rank, and this student growth percentile. Earlier this year we also began to look at these numbers across populations of students. So, what do we think about all of these numbers?
After years of looking at various assessments and the data from them, I expect that many of the students whose assessments indicate concern will, in fact, by the middle of the year, begin to look as though it's coming together. Generally, we hope that the trajectory after winter break will be steep and that most students at the end of the year will meet expectations for grade level, growth, or hopefully both.
In the last few years, our measurements have lacked consistency as we fluctuated between in-person and remote learning. Finding the numbers to determine the growth students are achieving has been a challenge. But we arrive here, today, after a full year of in-person school. As we look at the numbers, can we trust them? Are the students flagged by the measurement the ones who need the most attention? What about the outliers? The surprises? What about when measures conflict?
In the end, the numbers are indicators, not crystal balls. We talk together about what we notice day to day, what might be getting in the way, and what direction we might take next. That's how things have changed recently. We talk more often about progress. We talk more about what we might try next at school. Our toolkit of what to consider and try is getting deeper all the time.
The truth is that many students are struggling to catch up to the expectations we had just a few years ago. The truth also is that I have to help people see that if students don't meet that artificial pre-pandemic goal, that's to be expected. That doesn't make the effort wrong, or less, or anything. When we look at this newfangled student growth percentile, we shouldn't judge the student or the teacher. We should ask, 'What next?'
So over the next few days, I'll be talking about what's next. What might we teach a little deeper? What techniques really were successful this year? What felt like success? What does it feel like students need right now?
I have a few ideas. Let's consider joy and count success. Let's look at surprises and hope that it leads to knowing students better. Let's dream and affirm.
Then… let’s rest.