In an earlier post I wrote about the dangerous, counterproductive, if casual, racism of the Math Task Force Report. In another I wrote about the almost complete lack of assessment of MMSD Middle School Math teachers' knowledge and skills to back up the assertion that "the adequacy of teacher preparation is a significant problem" and the linked recommendation that there be a "substantial investment in mathematics content-based professional development and a change in hiring priorities at the district level."
On Monday, April 20, the Board of Education will likely act on this recommendation, to the tune of $392,500 through 2011-12 and a continuing cost of $159,000 annually thereafter. Before that happens, I think it is appropriate to revisit and expand on the earlier post concerning the Math Task Force's treatment of middle school teacher preparation as "the problem."
Before doing that I want to say a few things about the "Data ____ Decision Making" in the title. I've always believed in "data informed decision making" or "data guided decision making," and have rejected "data driven decision making" because it gives too much power to the limited data we have to work with; pretends to a false objectivity when the data itself, and how it is presented, are the products of biases and choices; ignores what cannot be quantified; and marginalizes human feelings and judgment as somehow illegitimate. [For previous related posts, see here, here and here; for similar ideas see Deborah Meier, "'Data Informed,' Not 'Data Driven.'"] Some in MMSD administration and governance (and at least rhetorically the Math Task Force) espouse the "data driven" ideal. In the case of this recommendation, the Emperor has no (or few) clothes: where there should be data to inform or guide, there is a blank, a hole.
I mapped part of that hole in the previous post and I'm not going to go over the details again. Here is the short version. The Task Force was asked to provide (among other things) "A discussion of how to improve MMSD student achievement." As explored in the earlier post, nationally there has been a lot of heat and some light around the idea that inadequately trained Middle School math teachers are a major problem. The Task Force then lazily applied this national analysis to Madison.
Finding out if this is a problem in Madison would have been a good thing for the Task Force to do. Instead, they did a very cursory survey of how many middle school Math teachers had a credential that the report itself says needs to be changed and then jumped on the “middle school teachers are the problem” bandwagon. Now the MMSD administration has joined them and the Board is poised to board the same wagon, bringing our tax dollars with them.
As I recommended in the previous post, the first (now the next) step should be assessing the knowledge of our teachers via "the materials being developed by the Learning Mathematics for Teaching Project at the University of Michigan or a similar inventory." Certainly before we commit $392,500 to fixing a problem we should have some idea whether there is a problem.
There is another way to see if Middle School Mathematics instruction is the problem and that is by looking at student achievement. Although the report devotes 83 pages to WKCE data analysis, the question of whether the test scores in any way reveal a problem with Middle School instruction is never addressed. From what I can tell, they don’t.
I don’t have access to the scale scores, or the grant money, or the expertise and time the Task Force had, but I did take a look, and there were no red flags identifying the Middle School years as the locus of achievement problems. The chart at the top (and below for your convenience) is the product of my exploration.
What I did was try to identify trends by looking at cohorts, starting with the class that was in 4th grade in 2002 (the WKCE changed that year and use of prior years for comparisons is not recommended). This chart shows the percentage of students who scored in the advanced or proficient range as they moved from 4th grade to 8th grade. The scores aren’t there for each grade for each year, but this is what we have to work with (informed, guided, not driven). Call it a poor man’s “value added.”
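The cohort-tracking idea above can be sketched in a few lines of Python. The percentages here are made up for illustration only (the real WKCE figures are in the charts); the point is simply how a test year and a grade level are paired to follow one class over time:

```python
# Sketch of the cohort-tracking ("poor man's value added") approach.
# Keys are (test year, grade); values are hypothetical percent
# advanced/proficient, NOT the actual MMSD WKCE numbers.
scores = {
    (2002, 4): 73.0, (2003, 5): 74.1, (2004, 6): 72.8,
    (2005, 7): 75.6, (2006, 8): 74.9,
}

def cohort_series(start_year, start_grade=4, end_grade=8):
    """Percent advanced/proficient for one cohort, 4th through 8th grade."""
    series = []
    for grade in range(start_grade, end_grade + 1):
        year = start_year + (grade - start_grade)
        # Not every grade was tested every year, so gaps are simply skipped.
        if (year, grade) in scores:
            series.append((grade, scores[(year, grade)]))
    return series

print(cohort_series(2002))
```

The same function, called with a different starting year, would trace the next cohort, which is all the charts below do.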
The most striking thing is that things don’t change much from year to year, within a cohort, or between cohorts. The range of variation is small: from 71.5% to 75.6%.
When we look at the cohorts, the 4th, 5th and 6th grade scores mostly serve as a baseline; any failures in Middle School instruction would show up in the 7th and 8th grade scores. The data is minimal, but all three 7th grade data points show improvement over the previous data point for that cohort, and one 8th grade data point shows continued improvement while the other shows a good-sized drop off.
What does this tell us? Not much. What might that mean? To me it means we should know more before we commit to investing in improving Middle School teacher preparation.
One more chart, because I think we always need to keep inequalities in mind.
Same idea as the other chart, but this time looking only at “Economically Disadvantaged” students. The first thing I notice is how low the percentages are: yet more evidence that we have a long way to go to eliminate gaps in achievement based on wealth. That this may never happen is no reason to stop trying. Next, again a narrow range, this time between 48% and 55.9%.
Some up and down but I don’t see anything compelling pointing to the Middle School years as the problem.
I did find the 2003 cohort’s ups and downs, here and for all students, intriguing, so I looked at state trends for those years. What I found was a similar if less extreme pattern, which may indicate that it is a product of the test and not a measure of the students. Statewide, the 6th grade advanced/proficient percentage for all students (AS) in that cohort was 73%; for Economically Disadvantaged students (ED) it was 53.9%. In 7th grade the AS figure rose to 79.1% and the ED figure to 61.5%. In 8th grade they dropped to 75.3% AS and 56.8% ED. [I’d do a full chart, but I don’t have time; maybe in an update later.] Data guided or informed, not driven or blank.
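The statewide figures quoted above imply the following grade-to-grade changes; a few lines of Python make the rise-then-fall pattern explicit (the numbers are the ones stated in the paragraph, nothing more):

```python
# Statewide percent advanced/proficient for the 2003 cohort, as quoted above.
state = {
    "AS": {6: 73.0, 7: 79.1, 8: 75.3},   # all students
    "ED": {6: 53.9, 7: 61.5, 8: 56.8},   # economically disadvantaged
}

# Change from the previous grade, in percentage points.
for group, by_grade in state.items():
    changes = {g: round(by_grade[g] - by_grade[g - 1], 1) for g in (7, 8)}
    print(group, changes)
```

Both groups gain going into 7th grade (+6.1 AS, +7.6 ED) and give some of it back in 8th (-3.8 AS, -4.7 ED), which is the "similar if less extreme pattern" referred to above.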
One last related note. A good thing about the new governance system in MMSD is that action items are supposed to come before committees a week before the Board acts. No more release of recommendations late Friday for action on Monday. This one didn’t follow that process. Last Monday, the Board of Education agenda included a 10-page, 13-point “Summary Response” that gave an outline but few details (Recommendation 3, here). I think a commitment of $392,500 deserves to go through the process.
As the above should make clear, I have serious doubts about whether the commitment of these resources is a good idea, and no doubt that the case made so far for this commitment is less than weak. I ask that the Board and the administration step back and begin by clearly demonstrating there is a problem that needs to be addressed, and I again recommend that part of this be a real assessment of the knowledge and skills of our Mathematics teachers. What will be before the Board on Monday is an expensive solution in search of a problem.
Thomas J. Mertz