Wednesday, September 7, 2011

Information about the California Standards Test Part 2

Part 1

(You have my permission to skip this post. It's not all that interesting unless you're curious about the lengths we have to go to in order to extract any useful information out of the CSTs)

In the previous post I wrote about the most common ways I see our state test scores interpreted. You can't make a straight comparison of API year to year and you can't use it to track student growth from year to year. The quote from page 6 of the technical guide:
When comparing scale score results from the CSTs, the reviewer is limited to comparing results only within the same content area and grade.
The hard part is they're not *really* designed as anything more than a school report card. (Poorly designed, that is)

In this post we're going to see the two pieces of useful information I've been able to extract from the CSTs. Prepare to be underwhelmed!

You can't compare across grade or content, but you can compare districts/schools/teachers to each other as long as it's the same grade/content/year. Basically, you're down to comparing teachers who teach the same content at your own school and comparing departments in different schools.

We don't get item analysis or even standard analysis. The smallest grain we get is by strand. For science, there are six. If I'm reading the Algebra test right, you get a whopping four strands. You'd think you could compare how your students did year to year by strand, but alas, California doesn't norm the strands either, so you have no way of knowing if your students improved or if the strand just got easier. You can compare the percentage of students who were proficient in that year's strand to the mean California percentage. I've found it moderately useful to look at the difference between the two and look for big gaps. The problem, of course, is that California won't release enough detailed strand information for you to really tell what's going on. So my kids perform poorly in "Number Properties" or "World History and Geography." Lots of help there. You might as well tell me to "teach better" and expect results. Oh wait. That's exactly what happens.
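The gap check described above is simple enough to script. Here's a minimal sketch; the strand names roughly follow the 8th grade science test, but every percentage below is invented for illustration:

```python
# Per strand: % of my students proficient vs. the state mean %.
# All numbers are made-up placeholders, not real CST results.
school_strand = {"Motion": 62, "Reactions": 55, "Structure of Matter": 48,
                 "Earth in the Solar System": 71, "Periodic Table": 50,
                 "Density and Buoyancy": 58}
state_strand = {"Motion": 60, "Reactions": 61, "Structure of Matter": 52,
                "Earth in the Solar System": 59, "Periodic Table": 49,
                "Density and Buoyancy": 57}

# Difference from the state mean; big gaps in either direction are worth a look.
gaps = {s: school_strand[s] - state_strand[s] for s in school_strand}
for strand, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{strand:26s} {gap:+d}")
```

With numbers like these, "Reactions" at -6 is the strand I'd dig into first, remembering that an un-normed strand gap could just mean the questions were harder that year.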

I have, however, gotten good results from comparing strand information between teachers at the same school (or across the district if you're lucky). The key to this one is acknowledging that every teacher's class makeup is different. At my school, we send certain types of kids to certain teachers. Even if your school does scheduling randomly, chances are in any one year those classes aren't going to be balanced between teachers. If you acknowledge that upfront, you're more likely to get honest conversation rather than people being defensive about their test scores.
[Chart: strand proficiency by teacher, from cst2010 (version 1).xls]

Here's what ours looked like. We mostly tracked each other. However, the blue teacher seemed to teach the Solar System section better. I'd probably also say the blue teacher performed a bit worse on the Chemical Reactions section. So that's a good start to a conversation. "Hey Blue Teacher, take us through your Solar System unit." This is pretty fast and worth it.

The other useful information I've gotten is comparing schools. Why compare schools? I visit a couple of schools each year to observe. An easy place to start is with the schools with excellent science scores. This is a tricky one because demographics are so huge here. I've got three years of test scores for the other middle schools in San Jose, and the percentage of students on free and reduced lunch and the percentage of students who are English learners both correlate at about -0.9 with every 8th grade test score. A correlation of -1.0 would be a perfect (inverse) line and 0 would be random scatter, so -0.9 is really, really close to a perfect line.

I've gotten some good information out of graphing those relationships (science score vs. % free and reduced lunch, and vs. % ELL) and looking at who sticks out. I visited 2 of those schools and exchanged a few emails with another. Unfortunately, it takes a while to gather that information and it's not all in the same place. You can cheat a bit by going to the CDE website and looking at what schools are listed as the similar 100 schools to yours. If nothing else, it's pretty interesting what the CDE considers "similar." Spoiler alert: They don't seem very similar to me at all.
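One way to formalize "who sticks out" is to fit a line and look at the residuals, since a school far above the line is beating its demographic prediction. A sketch, with every school name and score invented:

```python
# Placeholder data: % free/reduced lunch vs. science score for six schools.
# School "F" is constructed to beat the demographic trend.
import numpy as np

pct_frl = np.array([15, 30, 45, 60, 75, 90])
science = np.array([74, 68, 60, 57, 44, 55])
names = ["A", "B", "C", "D", "E", "F"]

# Fit score = slope * %FRL + intercept, then see who beats the prediction.
slope, intercept = np.polyfit(pct_frl, science, 1)
residuals = science - (slope * pct_frl + intercept)

# Schools well above the fitted line may be worth a visit.
for name, r in sorted(zip(names, residuals), key=lambda t: -t[1]):
    print(f"School {name}: {r:+.1f}")
```

In this made-up data, school F lands about 7 points above the fitted line, which is exactly the kind of gap that would put it on my visit list.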

A quicker way is to just gather a few of the other subject area tests and compare them. Again, you're just looking for a break in a trend. In 8th grade, you can take the History, Science, and ELA scores. Math is trickier because some kids take Alg, some take Gen Math, and some take Geometry (and a very few take Alg II).1


[Chart: 8th grade History, Science, and ELA scores by school, from cst2010 (version 1).xls]

The red line shows the state averages. The yellow line shows the county averages. You can see they track each other almost perfectly. Light blue is my school. Nearly every school I've looked at is reasonably close to that nice straight line. Yeah, we're a little better in science than expected, but before I start bragging, check out the purple line. They kick butt in science (also shown when I graph vs. % free and reduced). The light green school? If I'm a social studies teacher, I'm going to go visit that school. As a side note, that light green school actually has a population that's 100% on free and reduced lunch and close to that for % ELL. Every other score across the entire school lines up along expected values. If I'm the principal of that school, I'm going into the history classes, figuring out what the heck they're doing, and doing whatever I can to spread that. If I were a history teacher at a different school, I'd spend some serious time observing those classes.


(wait! This got too long. I'm going to split this up and talk about student placement in post 2b)


1: If you wanted to, you could do what California assumes and bump the Gen Math kids down a level (only count Advanced as Proficient) and then maybe take the Geo kids and bump them up a level. You might get a good approximation. 
