Thomas J. Mertz
Author Archives: Thomas J. Mertz
In December I announced my candidacy for the Madison Metropolitan School District Board of Education, Seat #5 (announcement below). There will be a primary election February 19, 2013, and the general is April 2, 2013. I hope I have earned the support of the readers of this blog. You can find out more about my campaign, endorse, volunteer and donate at MertzforMadison.com. I am not sure if I will be doing any blogging during the campaign, but if I do, anything directly related to the Madison schools will be posted at MertzforMadison.com, and anything posted here at AMPS will be more about state and national matters.
Prepared, Progressive, Passionate
I am excited to announce my candidacy for the Madison Metropolitan School District Board of Education, Seat #5.
Our public schools are the backbone of our community, the wellspring of our democracy, and the best means we have of providing a better future to all our children. As a parent, scholar, advocate, activist and organizer, I have worked with parents, professors, students, school boards, administrators, legislators, educators, and their unions to better understand and strengthen public schools. I don’t think there has ever been a time when the challenges to our schools have been greater. I want to help Madison meet these challenges by serving on the Board of Education.
I have stood against the pressures of privatization, worked against the expansion and misuse of standardized testing, and have fought for adequate and equitable funding based on the idea that all of our students deserve broad and rich opportunities.
These struggles will continue and expand. As Madison prepares to welcome a new Superintendent, I see opportunities to do more than react. Madison is a community and district where we have the means and the will to show that diverse public education can live up to its promises. To do this we must honestly assess those failings illustrated by the achievement gaps, but also listen to voices of our classrooms and community to understand what is working and build from our strengths.
None of this will be quick and none of this will be easy. I ask for your help and support. Visit www.mertzformadison.com to endorse, donate, or volunteer; and “like” the TJ Mertz, Madison School Board, Seat #5 Facebook page to keep updated.
Thomas J. Mertz
Update: Still haven’t seen the details, but according to the Press Release, the answer to question number three is a partial yes, with calls for “full funding” of SAGE (it isn’t really “full funding”; see here on the complexities of SAGE funding), increased sparsity aid, increased Bilingual/BiCultural aid in the second year, increased special education aid in the second year, and new grant programs around STEM and vocational education, “educator effectiveness,” and more. No poverty aid. For overall state funding, the combined “categorical and general school aid” Evers calls for would be “a 2.4 percent increase in the first year of the budget, the same as the Consumer Price Index, and 5.5 percent in 2014-15.” I don’t see anything on Revenue Limits. More later.
Update #2: From a second Press Release, on Revenue Limits: “The plan restores revenue limit authority to all districts. It calls for an increase in the per pupil revenue limit to $225 per student in the first year of the budget and $230 per student in 2014-15.” More details on Fair Funding and other matters in this Press Release also. A district by district tally may be found here.
Wisconsin State Superintendent of Public Instruction Tony Evers will reveal the remainder of his 2013-15 budget proposal on Monday (the first portion was released in September, but it lacks full school finance information; WisconsinEye will be covering the event). Evers has also announced he is seeking re-election next April (campaign website here; see here for thoughts on elections and holding Evers and others accountable for their actions and inaction).
We know that Evers’ budget will be based on the Fair Funding For Our Future framework. We know that, in outsourcing how our state defines what it means to be educated to American College Testing (the ACT), it will call for an increase in spending of time and money on standardized testing and on the processing of standardized-test-based data (for a horror story about outsourcing testing-related work in Florida, see “The outsourcing of almost everything in state departments of education,” from Sherman Dorn). We know that it will in most ways be better than what Governor Scott Walker proposes, especially if the rumors that the Walker proposal will include Tim Sullivan’s “Performance Based Funding” are true (by design this would direct resources away from those students and schools that are struggling and toward those that are thriving, an incredibly bad idea and the essence of the Republican philosophy). But there are some essential things we don’t know. Here are three things I’ll be keeping an eye on.
1. How much of an increase in State Aid will Evers call for?
Wisconsin schools have endured huge cuts in state aid in both of the last two budgets. Depending on how you count, the combined dollar total is close to $2 billion. According to the Center on Budget and Policy Priorities, the per pupil cuts in Wisconsin have been the fourth largest in the nation. Here is their chart:
The vast majority of districts have experienced cuts in state aid (the most recent figures from the Legislative Fiscal Bureau, here). How much of this lost ground will Evers try to make up?
2. What increases in Revenue Limits will Evers call for?
The FitzWalker gang essentially froze (in effect, cut) Revenue Limits for 2011-12 and provided a $50/student increase for most districts for 2012-13. Revenue Limits matter. Higher Revenue Limits give local districts the power to make up for lost state aid and more. To what extent will the DPI budget restore this local control? As the bar for expected achievement keeps being raised, we need, through a combination of state and local resources, to give the schools what they need to meet their challenges.
3. Will the DPI budget direct resources to those students and schools with higher needs?
In particular, will it call for increases in aid for English Language Learners, for Special Education, for SAGE reimbursements, and for Sparsity (see this column from Kathleen Vinehout on school budgets in general and sparsity in particular)? Will it direct real aid to those schools identified as needing improvement by the new “Accountability” system (see here for a discussion of that system, including this issue)?
One thing the new State Report Cards confirmed is that poverty is a great predictor of which students and schools are struggling. Will the Evers budget address this in a real way by providing additional resources, instead of the property tax cuts based on student poverty that have been in every other iteration of the Fair Funding plan? Property tax cuts don’t help students; students need help. For more on school funding “fairness,” see this report from the Education Law Center (Wisconsin doesn’t rank very well).
Those are the big three. I’ll also be looking at the size of the guaranteed state funding per pupil (which in essence replaces the levy credits in Fair Funding), what kind of “hold harmless” provisions Evers includes, and like all of us I’ll be looking at the impact of the package on my school district (along with a variety of other districts I’ve been informally tracking for years).
This is step one; the next steps involve key players like WEAC and WMC, advocates in general, the Governor and the Legislature. Much of what will happen with these is predictable. I can say with great confidence that I will consider whatever Tony Evers proposes to be better than what comes out of the Republican controlled budget process.
One thing I don’t know is how advocates and Democratic Legislators will react. If past actions and the recent press release from Senator Chris Larson are indications, they will follow Tony Evers’ lead and take up Fair Funding as their own. Depending on the answers to the questions offered here, I hope that people who care about our students, inside and outside the Legislature, keep an open mind to advocating for something better than Fair Funding: something that does make up the ground lost over four years of cuts, something that does give real local control, and most of all something that does a better job directing resources to the schools and students who most need the opportunities of quality public education. Penny for Kids would be a start, perhaps in conjunction with Fair Funding.
How one person’s abilities compare in quantity with those of another is none of the teacher’s business. It is irrelevant to his work. What is required is that every individual shall have opportunities to employ his own powers in activities that have meaning.
Democracy and Education, 1916
The current “accountability” madness is almost all based on misusing metrics of questionable value to make comparisons among students, among teachers, among schools, among districts, among nations (see here and here for two recent manifestations). If we are going to be “holding people accountable,” I’d prefer the metric be whether they are providing all students with the “opportunities to employ his [or her] own powers in activities that have meaning.”
Thomas J. Mertz
Check the Wisconsin League of Women Voters for all your voting information, including your rights, your polling place, the candidate answers, referendum information, and same day registration.
VOTE TO BE HEARD!
Thomas J. Mertz
A recent post — Who cuts the barber’s hair? or Whither “accountability”? — centered on some of the failings of the new Wisconsin “Accountability” system, designed by a team led by Scott Walker and Tony Evers and adopted in order to gain an NCLB waiver from Arne Duncan, and on what we as citizens can do to hold them accountable for the bad choices they have made. With the second iteration of the “Accountability Requirements for Achievement Gap Plan” (the online version has been updated on pages 58-9; see here) on the agenda of the Madison Metropolitan School District Board of Education Student Achievement and Performance Monitoring Committee this Monday (11/5, 5:30 PM, rm 103, Doyle Bldg), I thought it would be a good idea to do something similar on the “accountability” work being done by MMSD. This time via an extended, and at times strained, baseball metaphor.
Who’s at bat?
Or who should be held accountable for the accountability design work being done by MMSD? These aren’t easy questions. Accountability is confusing, maybe not as confusing as the Abbott and Costello routine, but confusing (who should or should not be held accountable for the results of accountability measures is even more confusing; add teachers, families, the economy, inequality, and more to the list below). The chain of accountability goes from the voters who elect Board Members, to the Superintendent whom the Board hires, fires and evaluates, to the administrators the Superintendent hires (with the consent of the Board, though for better or worse this has been a rubber-stamp consent), supervises and evaluates. It also loops back to the Board, because they are responsible for making sure administrators have the resources they need to do good work, but this chain continues back to the Superintendent and the administrators who prepare draft budgets and should communicate their needs and capacities to the Board. The Superintendent is the bottleneck in this chain each time it loops around, because the MMSD Board has almost entirely limited their action in evaluation, hiring and firing to the Superintendent. Right now MMSD has an Interim Superintendent, so evaluation, hiring and firing are moot and the key link in the chain is broken. Like I said, confusing.
What is clear is that the only lever of accountability community members hold is their vote in school elections. Three seats are up in April (Board President James Howard has announced his intent to run for re-election; Maya Cole and Beth Moss have not publicly stated their plans).
The impetus for creating the “Accountability Requirements” was a budget amendment from Board Member Mary Burke. I believe it passed unanimously. For the purposes here, I’m saying “The Administration” is at bat and the Board of Education is the manager sending signals from the dugout. I didn’t count, but there are at least a half dozen administrator names listed on the “Accountability Requirements for Achievement Gap Plan”; if you want to get more personal about who should be accountable, feel free.
Swinging for the Fence or “Small Ball”?
The public loves power hitters; the long ball is a crowd pleaser. Baseball insiders and aficionados understand that swinging for the fence increases the likelihood of striking out, and that often the situation calls for “small ball,” like trying to draw a walk, attempting a sacrifice bunt, hitting behind the runner, or lining a single into the gap. The key to small ball is that you do many little things and they combine to produce runs.
With educational “accountability,” I would argue that setting “goals” (any goals at all, but especially unrealistic ones like the NCLB 100% proficiency, or the “goals” listed in the draft MMSD “Accountability Requirements”; more on the latter below) is the equivalent of swinging for the fence. This is part of the “data driven” mentality. I think the situation calls for an educational version of small ball, something not as crowd-pleasing, demanding a higher level of engagement by all involved, and more likely to produce a productive understanding. What I have in mind is monitoring multiple measures, or “data guided” decision making.
Although the reporting has not been good, MMSD tried something like this with the Strategic Plan “Core Performance Measures.” Unfortunately there seemed to be collective agreement among Board Members and administrators at a recent meeting that these measures would be set aside in favor of the “Accountability Requirements” now under consideration and by implication that all the Strategic Plan work would be left to gather dust. There were targets associated with “Core Measures” but the main idea was that the Board and the Administration pay regular attention to multiple measures and their movement, individually and collectively. This is far different than stating as a goal that 90% of students will score in the proficient range by year 3. The first thing policy makers need to know is whether things are getting better or worse and at what pace. The use of standardized test score goals (and goals for many other measures) in “accountability” doesn’t help with that and creates difficulties.
What is the “accountable” action if some measures go up and some go down? What if demographics or the tests themselves change along the way? And then there are the uncomfortable questions of who will be held accountable and how if none of the goals are met. We should have learned from NCLB that this approach is not what the situation calls for, but apparently MMSD administrators did not.
At a previous meeting on the “Accountability Requirements” Board Member Ed Hughes moved closer to the small ball position by suggesting that instead of absolute goals, the goals be presented in terms of change or growth. Better, but the problems identified remain. The whole goal oriented approach could be called “Strike One,” but I’m not going to do that.
The first draft of the “Accountability Requirements” was presented to the Student Achievement and Performance Monitoring Committee on September 30th and appeared essentially unchanged on the full Board’s October 29th agenda as part of a Committee Report. In baseball parlance, it was an unbalanced, badly mistimed swing for the fences at a ball well outside the strike zone. It isn’t pretty. Strike one.
Some managers would have been tempted to pull the batter and send up a pinch hitter, but instead Board Members sent some signals from the dugout, pointing out some of the mistakes and offering tips for improvement.
Mary Burke noted that the left hand and the right hand didn’t appear to be coordinating. To be more specific, she pointed out that on page 15 (of the pdf) there is a chart with the stated goal “95% of all 11th graders will take the ACT in 2012-13,” but the chart itself shows annual incremental increases, culminating at 95% for all groups in 2016-17. It was long ago decided that all students would take the ACT in 2012-13; whoever prepared the left part of the chart knew this, but whoever did the increments on the right did not (and apparently didn’t read the left part). Here it is:
Other problems with the swing are more subtle. There is also another section where ACT goals are expressed in terms of average scale scores. This appears to be another case of lack of coordination between the two hands. As discussed below, the sections related to students reaching the ACT “College Readiness” benchmarks are left mostly blank, in recognition of the fact that increased participation due to the test-taking mandate will almost certainly lower the starting point. The people doing the average scale score section don’t seem to have understood that. Their chart shows steady and unrealistic growth (except a 0.1 drop for white students in the final year), with all groups reaching 24 after five years. Here it is:
This is absurd. Consider Hersey High School in Arlington Heights, IL, a much less diverse school with much lower poverty (14% low income) than MMSD, which since 2001 has become the Mecca for those who worship at the altar of the ACT/EXPLORE/PLAN system of placing ACT prep at the center of school activities. Starting, no doubt, above the MMSD full-participation baseline, it took Hersey six years to get the composite average to 24.0 (the current is 25.2). Closer to home, the temple for ACT worshipers is (much less diverse, much less impoverished, at 15.8% Free/Reduced Lunch) Monona Grove. They joined the ACT religion in 2008-9. That year their ACT composite was 21.7; it is now 22.3. Nationally, only 26% of (mostly self-selected) test-takers achieve a 24 composite. Absurd and incompetent.
You may think this is nitpicking, but these are highly paid professionals who didn’t do their homework to arrive at realistic goals and have made the kind of stupid errors that would cost students serious points on the standardized tests that these same highly paid professionals are employing in the name of “accountability.” Shouldn’t they be accountable?
Despite some coaching from the Board that resulted in fixing the above issues, related problems remain in the second version. Those are covered in the “Strike Two” section.
The second swing — the version of the “Accountability Requirements for Achievement Gap Plan” on the 11/5/12 agenda — is much expanded (61 pages in comparison to 31), but not much improved. Another wild, unbalanced and mistimed lunge at an almost unhittable pitch. Like the first (and so many of the things produced by MMSD administration), much space is devoted to documenting that staff are very busy (of course, repeatedly documenting this helps keep people busy) and very little to what is going on with students (I’m not sure why this is “accountability”). Like the first, the actual “accountability” focus is on “goals.” Like the first, many of these goals (and many of the benchmark starting points) are left blank or labeled “TBD.” Like the first, where there are numbers attached to the goals, they are wildly unrealistic.
As the play-by-play announcer, I’m going to limit detailing how this swing misses to two places where numbers are attached to standardized test based goals. The first involves the ACT; the second the state achievement tests (now WKCE, soon to be “SMARTER Balanced Assessments”).
As explained above, I don’t like “goals” in standardized test based “accountability systems” (I’m not very fond of standardized test based “accountability systems” in general, but no room for all that here), but if you are going to have goals, they should be realistic, they should be based on in-depth knowledge of the tests, the performance of comparable students on these tests, and the improvements achieved elsewhere using similar programs. As one Board Member pointed out at a recent meeting, this is exactly the kind of expertise that the Board expects from their highly paid professional administrators. They ain’t getting what they paid for (in baseball terms, we are approaching Alex Rodriguez in the last post-season).
The error Mary Burke pointed out with ACT participation has been corrected. At the previous meeting there was a discussion of how expanded ACT participation will yield new baseline starting scores, and this was (in the first version) and is (in the second version) reflected by leaving blank most of those portions covering the percents of students scoring at or above the college-ready benchmarks set by the ACT. For the same reasons, the ACT “Average Composite Score” section discussed above is now blank. All this is good, but in the left-hand column of the benchmark charts in both versions, for each subject area there is a 40% goal (page 32). I’m going to leave aside important criticisms of the ACT Benchmarks to address why the 40% goal is problematic. Nationally last year, only 25% of the mostly self-selected test-takers met the benchmark in all four subjects. The percents varied greatly, from 67% in English to 31% in science. At Hersey High (with their test-friendly demographics and over ten years of emphasizing the ACT), only 39.2% of test-takers made all four benchmarks. The goals for MMSD should reflect this reality (and similar evidence on subgroups). It should be noted that you can reach the 40% goal in each individual subject and still not have 40% meeting all four benchmarks; my point is that the data we have show that 40% is easier or harder to reach in different subjects, and that it may be out of reach in some.
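To make the all-four-benchmarks point concrete, here is a small sketch with made-up numbers (a hypothetical cohort, not actual MMSD or ACT data): every subject can sit at exactly 40% while far fewer students clear all four at once.

```python
# Hypothetical cohort of 10 students; True means the student met that
# subject's benchmark. Each subject's pass rate is exactly 40%, yet
# only 1 student in 10 meets all four benchmarks.
students = [
    {"eng": True,  "math": True,  "read": True,  "sci": True},
    {"eng": True,  "math": True,  "read": False, "sci": False},
    {"eng": True,  "math": False, "read": True,  "sci": False},
    {"eng": True,  "math": False, "read": False, "sci": True},
    {"eng": False, "math": True,  "read": True,  "sci": False},
    {"eng": False, "math": True,  "read": False, "sci": True},
    {"eng": False, "math": False, "read": True,  "sci": True},
    {"eng": False, "math": False, "read": False, "sci": False},
    {"eng": False, "math": False, "read": False, "sci": False},
    {"eng": False, "math": False, "read": False, "sci": False},
]

subjects = ["eng", "math", "read", "sci"]
per_subject = {s: sum(st[s] for st in students) / len(students) for s in subjects}
all_four = sum(all(st[s] for s in subjects) for st in students) / len(students)

print(per_subject)  # each subject: 0.4
print(all_four)     # 0.1
```

The pattern matters more than the numbers: a per-subject goal and an all-subjects goal are different targets, and hitting the first does not imply hitting the second.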
There are similar, but more pronounced and complex problems with the section that sets goals of 90% “proficiency” on state tests in Mathematics and Literacy at the end of five years (page 17). Here is the chart for literacy (sorry for the bad reproduction):
Although the WKCE is referred to, the numbers in the far right column reflect the very problematic “WKCE as mapped to NAEP cut scores” (see “The news from Lake Gonetowoe” for some of the problems with these cut scores), and the WKCE is on the way out, to be replaced by “SMARTER Balanced Assessments.” There is some confusion here that I’m going to avoid by simply saying “state tests.” Since the NAEP-derived cut scores are the order of the day, I guess MMSD has to use them, but they have a choice about which levels to concentrate on, and “Proficient” is the wrong level.
My preference would be to do the multiple measures, small ball thing and track movement among scale scores, or failing that movement among the various cut score defined levels (which is what the “Growth” calculation in the new Report Cards does). If you are only going to use one level and are going to set goals, “Basic” is the level you want. It is where you will see the most movement and get the most useful information.
Eventually I hope to do a few more posts about the meaning of NAEP cut score levels, how they compare to the old WKCE levels, and many related things. For now I’m just going to repost my new favorite quote from the National Academy of Sciences publication, “Grading the Nation’s Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress,”
and add the general NAEP level descriptions (there are more detailed ones for the grades NAEP tests, 4, 8, and 12). Here is the quote (again):
Although standards-based reporting offers much of potential value, there are also possible negative consequences as well. The public may be misled if they infer a different meaning from the achievement-level descriptions than is intended. (For example, for performance at the advanced level, the public and policy makers could infer a meaning based on other uses of the label “advanced,” such as advanced placement, that implies a different standard. That is, reporting that 10 percent of grade 12 students are performing at an “advanced” level on NAEP does not bear any relation to the percentage of students performing successfully in advanced placement courses, although we have noted instances in which this inference has been drawn.) In addition, the public may misread the degree of consensus that actually exists about the performance standards and thus have undue confidence in the meaning of the results. Similarly, audiences for NAEP reports may not understand the judgmental basis underlying the standards. All of these false impressions could lead the public and policy makers to erroneous conclusions about the status and progress of education in this country. (Emphasis added)
Here are the descriptions:
|Achievement Level|Policy Definition|
|Basic|This level denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade.|
|Proficient|This level represents solid academic performance for each grade assessed. Students reaching this level have demonstrated competency over challenging subject matter, including subject-matter knowledge, application of such knowledge to real world situations, and analytical skills appropriate to the subject matter.|
|Advanced|This level signifies superior performance.|
I think that at this time the “Achievement Gaps” work in MMSD should concentrate on getting students to the “Basic” level, as defined by NAEP.
This belief is reinforced by national data on student NAEP performance. This first chart shows the 8th grade NAEP level distribution for all students (NAEP tests a sample of students and adjusts reporting to reflect the entire population, charts from here):
In 2011 42% were in the “Basic” level. This is where the median and mean are. If we are most concerned with the students who aren’t reading and can’t do simple math, that means moving them from “Below Basic” to “Basic.” I have no problem with also monitoring “Proficient” and “Advanced,” but the heart of this is in the basic category.
Two more graphs to show a little more of this and transition to the goals being set. This one shows the distribution of scores for those students not eligible for Free and Reduced Lunch:
The next is for Free Lunch students (NAEP reporting here does not combine Free and Reduced):
I won’t deny that the 28-point “proficiency” gap between these two groups is worthy of attention, but I will argue that the 26-point gap in “Basic” or above, and the 22-point gap in those reaching “Basic,” are more important and more likely to be narrowed by the programs in the Achievement Gaps Plan. This is where the action should be and what we should be watching (if we are only going to watch one level).
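The arithmetic behind these gaps is just cumulative percentages. Here is a sketch using illustrative level distributions (round numbers chosen to reproduce the 28- and 26-point gaps discussed here, not the actual NAEP tables):

```python
# Illustrative level distributions (percent of students) for two groups.
# These are hypothetical round numbers, not the actual NAEP figures.
levels = ["below_basic", "basic", "proficient", "advanced"]
not_frl = {"below_basic": 17, "basic": 39, "proficient": 33, "advanced": 11}
free_lunch = {"below_basic": 43, "basic": 41, "proficient": 15, "advanced": 1}

def at_or_above(dist, level):
    """Percent of students scoring at the given level or higher."""
    return sum(dist[l] for l in levels[levels.index(level):])

prof_gap = at_or_above(not_frl, "proficient") - at_or_above(free_lunch, "proficient")
basic_gap = at_or_above(not_frl, "basic") - at_or_above(free_lunch, "basic")
print(prof_gap, basic_gap)  # 28 26
```

The point of computing gaps at more than one cut score is that each tells a different story about where students are concentrated, which is why watching only “Proficient” misleads.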
If it isn’t already obvious from these charts, the 90% “Proficiency” in five years set as a goal in versions 1 and 2 is a pipe dream, like the Chicago Cubs winning the World Series. No competent education professional familiar with NAEP cut scores and performance levels and MMSD would put this before the Board of Education for consideration, yet some combination of MMSD administrators signed off on it, twice. Strike two.
The Next Pitch
I wanted to get this finished and posted before the 11/5 meeting, but I didn’t. I also wanted to attend the meeting, but it is/was my son’s birthday. I hope that some of these issues and some others were raised at the meeting (I’ll watch the video and find out).
There are many other issues: the AVID section doesn’t appear to recognize that if the other “goals” are reached, the comparison group will be an upwardly moving target; “Stakeholders” is most often defined as district staff, not students, parents or community members; the Cultural Responsiveness work has no academic results attached to it; in Madison — a Union Town — the Career Academy section has no role for organized labor in planning or implementation, while business interests have the best seats at the table (and some will be paid for being there; this is what you expect from Scott Walker, not MMSD); and, to repeat what was said above, much of this is documenting staff being busy, and in many key places where measurement of one sort or another is called for, the lines are blank or say “TBD.” On this last point (with the exception of the ACT, where the mandated participation warrants holding off), the idea of attaching a requirement to have an accountability plan was to have a plan, not a promise to come up with one at some future date. I could go on (and on), but I think I’ve made the point that the quality of thought and work that has gone into this by the administration thus far has been lacking in many areas.
It looks like another draft (the third pitch) will be coming back to the Board on November 26th. I very much hope that draft is much better than the work we have seen to this point. I hope it isn’t strike three. The administrators have demonstrated that they can make corrections when problems are pointed out to them (like the inexcusable errors with ACT participation in the first draft), when they get good coaching from the Board. That is a good thing, but expectations should be higher. It isn’t the Board’s job to know the distribution of NAEP scores, and it certainly isn’t their job to educate the administration on this (it goes without saying that there is something very wrong when it falls to me — an interested community member — to point out their apparent ignorance in the very areas they are being paid to be experts in). There needs to be some accountability here; the Board and the community have a right to expect better work. If we aren’t getting it from those now responsible, we need to find people who can provide it. The Board is not going to make good decisions without good information. The improvements our students and community need and deserve are not going to happen without competent people at the top. The Board needs to hold the administration accountable, and we need to hold the Board accountable for doing that.
Three Board seats on the ballot in April 2013. Could be a whole new ballgame.
Thomas J. Mertz
The WKCE testing and related assessments are scheduled for next week in the Madison Metropolitan School District schools (full schedule of MMSD assessments here), but your child doesn’t have to be part of it. You can opt out. Families with students in grades 4, 8, and 10 have a state statutory right to opt out of the WKCE; I have been told that it is district practice to allow families to opt out of any and all other, discretionary, tests. We opted out this year. In order to opt out, you must contact your school’s Principal, and do it ASAP (contact info here).
The WKCE does your child no good. Just about everyone agrees that even in comparison to other standardized tests, it is not a good assessment. Because results are received so late in the year, it isn’t of much use to target student weaknesses or guide instruction. There are no benefits for students.
There are also no benefits for schools and the district, and some potential for harm. The WKCE is central to the new Wisconsin “Accountability” system (discussed here) and will be part of the new “Educator Effectiveness” system now being implemented. Both of these are built on the — likely false (see “Snookered by Bill Gates and the U.S. Department of Education”) — promise of “SMARTER Balanced Assessments,” but because the Report Cards include a “growth measure” and the educator evaluations include a Value Added component, the WKCE will be part of the calculations for at least two more years (this will be accomplished by pretending that the WKCE is essentially the same as the new test, which in fact it likely is, in that it will no doubt measure socio-economic status better than it measures anything else).
A large-scale, summative assessment such as the WKCE is not designed to provide diagnostic information about individual students. Those assessments are best done at the local level, where immediate results can be obtained. Schools should not rely on only WKCE data to gauge progress of individual students or to determine effectiveness of programs or curriculum (emphasis added).
But in the mania to compare and rate and evaluate that is the new Status Quo, this is almost exactly how the WKCE is being used. Not the WKCE alone, but in the Report Cards the WKCE dominates and in the Educator Evaluation the WKCE test scores may be decisive (test scores only account for a small part of the evaluations, but if the other portions show little variance, the test score portion will be determinative). No good can come from this and the mis-impressions created — about districts, schools, educators and students — are harmful, if only because they create confusion and make it more difficult to have productive policy deliberations.
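The variance point can be illustrated with a small sketch (the weights and scores below are hypothetical, not the actual evaluation formula): even when test scores carry only a small nominal weight, if the other components barely vary, the test portion decides the ranking.

```python
# Hypothetical composite: 85% "practice" component, 15% test-score component.
# Practice scores cluster tightly, so the small test weight drives the ranking.
teachers = {
    "A": {"practice": 3.1, "test": 1.0},
    "B": {"practice": 3.0, "test": 4.0},
    "C": {"practice": 3.0, "test": 2.5},
}

def composite(scores, w_test=0.15):
    return (1 - w_test) * scores["practice"] + w_test * scores["test"]

ranked = sorted(teachers, key=lambda t: composite(teachers[t]), reverse=True)
print(ranked)  # ['B', 'C', 'A'] -- A has the best practice score but ranks last
```

A component’s nominal weight and its actual influence are different things: whichever component varies most ends up determining the ordering.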
I would be remiss if I didn’t acknowledge that opting out can have consequences for schools and districts. The new Wisconsin system takes away points based on low participation, so there will be an impact there. If your child is likely to score in the higher ranges, their absence will lower the scores used to produce the “accountability” measures. If the school consequently falls into one of the two lower tiers, extended day programs and school improvement plans are required. If it is in the lowest tier, then the plans must include out-sourcing to an approved “turnaround” vendor. As I noted before, this is privatization of public services, and turnaround specialists do not have records of success that inspire confidence. A school or district that fails to “turnaround” is subject to further intervention by the State Superintendent. A school or district that does not cooperate with these directives “will close.”
Although school administrators have criticized the system, I doubt districts will choose the noncooperation option. Too bad; that would be a fight that would shine a bright light on this conception of “accountability.”
Opting out is a smaller version of noncooperation that is available to every family. You don’t have to be part of the madness.
It can also become something larger. Without all of the children’s test scores, the machine grinds to a halt. There is a national Opt Out movement. Here are some places to find out more (including opt-out rights and procedures in other states and districts):
In closing, I want to point to an alternative to the over-use and abuse of standardized testing. There are many; this one — the New York Performance Standards Consortium’s performance-based assessments — was featured today in a Washington Post piece, “An alternative to standardized testing for student assessment,” by Monty Neill. Check it out. We can do better.
Thomas J. Mertz