Sunday, October 30, 2011

Smoke and Mirrors

Guest Post by Allan Alach

The October 25th issue of the New Zealand Education Gazette features an article from the Ministry of Education which starts as follows:
“Supporting professional judgments in reading, writing and mathematics.
A progress and consistency tool is being developed to support teachers’ professional judgments in relation to the National Standards, and to improve the measurement of student progress over time.”
The article presents this as a positive development in education, although everyone knows that the introduction of national standards was a purely political move by the National Party.  It follows then that any development of national standards also has political origins.
We know that New Zealand education is highly regarded internationally, even by the OECD, which develops and implements the PISA testing programme:
“New Zealand is a top-performing OECD country in reading literacy, with the average student scoring 521 out of 600. This score is higher than the OECD average of 493, making New Zealand the 4th strongest OECD country in reading skills. Additionally, students performed well in mathematics and science, with more than 17% reaching the two highest levels of proficiency.” Source: OECD Better Life Initiative
So why destroy our highly regarded primary school education system?
The article then goes on to outline the rationale behind the development of this ‘progress and consistency tool.’
“This approach has been taken to avoid problems that have beset the implementation of standards in countries that have used national tests. In particular, national tests tend to result in a narrowing of the curriculum to the aspects of learning that teachers believe are likely to appear in the test.”
Sounds plausible? 
Wait a moment: if the Ministry of Education are taking note of overseas research and evidence here, then why do they completely disregard all the similar evidence about the damaging effects of any kind of standards-based system?
After all, this information is very easy to find, as is illustrated by this article from Alfie Kohn: “Beware of the Standards, Not Just the Tests (2001).”
This article is essential reading, to give readers the necessary strength to be able to read the Gazette article without damage to their mental health.
Kohn says:

“But the more comprehensive and detailed a list of standards, the more students (and even teachers) are excluded from this process, the more alienated they tend to become, and the more teaching becomes a race to cover a huge amount of material. Thus, meeting these kinds of standards may actually have the effect of dumbing down classrooms.”
That is pretty clear, even to ideologically driven bureaucrats implementing a political agenda. 
The Gazette article then discusses the broad range of knowledge and skills that teachers need to integrate into one overall teacher judgement rating of a child’s achievement against the relevant standard. 
There is an immediate illogicality here - how can a broad range of knowledge and skills possibly be reduced to one ranking against one standard and yet retain any validity and value?
The Ministry’s solution is to develop their ‘progress and consistency tool.' Excuse me, what does this jargon actually mean?
This is a well-known language trick: using a meaningless phrase to hide the true meaning.
The Ministry then attempts to muddy the waters even further:
“The progress and consistency tool is intended to support teachers’ overall judgments and assist with increasing the consistency of judgments across the country and over time. The tool will also enhance the measurement of students’ progress in relation to the National Standards.”
A key phrase slips through this web of deceit: “measurement of students’ progress.”
Here, then, is the official confirmation of testing. “Measurement of progress” cannot mean anything else.
If something is to be measured, then some kind of measurement system or criteria must follow. We cannot measure length, for example, in a way that is consistent and meaningful to others, without using an agreed reference tool, e.g. a metre ruler. Measurement is a precise operation. One does not measure by using an overall judgement.
The statement about this progress and consistency tool being used to support teacher judgements is nonsense.
The article concludes with some saccharine reassurance:
“The tool is being developed with the sector, it will be released in iterations, and consultation will be sought throughout its development. Three advisory groups, including one comprising teachers and principals, have been established and schools will be asked to give feedback on the tool from early next year.”
Consultation? Advisory groups? One comprising teachers and principals? Who is represented in these three advisory groups? Who are the principals and teachers who are participating in one of these groups? Do they have the right to speak on behalf of the whole sector? 
Since we have no idea who is in any of these advisory groups, the statement is meaningless and is yet another red herring to distract our attention away from the implicit danger signs.
What danger signs? The glaringly obvious one is the lack of any information about the “progress and consistency tool.” 
What is being developed? How will it be implemented? 
This lack of detail means that anything is possible. Think about it - what is the point of developing this “progress and consistency tool” as an optional extra to assist overall teacher judgements? 
One can be reasonably confident that there will be a degree of compulsion buried in this.
Unfortunately for the Ministry’s spin doctors, Kelvin Smythe spoiled their schemes when he published this article, followed by a second.
In these two articles, Kelvin provided the information that the Ministry tried to keep secret.
This ‘tool’ will consist of rubrics.
“As a key part of this tool, a pedagogical framework will be developed that will be psychometrically aligned to enable a measure of progress.”
The significant word here, that gives this all away, is “psychometrically.”
From Wikipedia:
“Psychometrics is the field of study concerned with the theory and technique of psychological measurement, which includes the measurement of knowledge, abilities, attitudes, personality traits, and educational measurement. The field is primarily concerned with the construction and validation of measurement instruments such as questionnaires, tests, and personality assessments.”
Regardless of the Ministry spin here, this means some form of testing. It is obvious that schools will be compelled to use this tool, given the investment in it.
As for claims that this will not narrow the curriculum, that is arrant nonsense as well. We know that this tool will use rubrics, which will reduce the range of possible learning outcomes in literacy and numeracy to those specified in the rubrics.
How can it be otherwise? How can this not be a narrowing of the literacy and numeracy curriculum? 
Narrowing of the remainder of the curriculum is inevitable for schools that are under the threat of league tables. The government’s protestations that these are not on the agenda fall over immediately when we have a close look at the relevant clause requiring achievement data to be submitted to the Ministry of Education:
“the numbers and proportions of students at, above, below or well below the standards, including by Māori, Pasifika and by gender (where this does not breach an individual’s privacy);”
Why insert the qualifying phrase about individual privacy? Surely this would only be an issue if this information was available to the public? The answer is very obvious.
There is ample overseas evidence (ignored by the government and the Ministry) about the problems resulting from league tables, and especially about the resultant narrowing of the curriculum as schools strive to get higher places on the league tables in order to retain their roll numbers and status as a ‘good’ school. Again, how can it be otherwise?
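To make the point concrete, here is a minimal sketch in Python (with entirely invented numbers, not real data) of how easily centrally held proportions of students at, above, below or well below the standards could be turned into a ranked league table once someone chooses to publish them:

# Entirely invented numbers, for illustration only.
schools = {
    "School A": {"well below": 5, "below": 20, "at": 50, "above": 25},
    "School B": {"well below": 15, "below": 30, "at": 40, "above": 15},
    "School C": {"well below": 2, "below": 10, "at": 55, "above": 33},
}

def proportion_at_or_above(counts):
    # Share of students judged 'at' or 'above' the standard.
    return (counts["at"] + counts["above"]) / sum(counts.values())

# Once the proportions are held centrally, ranking schools is a one-line operation.
league_table = sorted(schools, key=lambda s: proportion_at_or_above(schools[s]), reverse=True)
for rank, school in enumerate(league_table, start=1):
    print(rank, school, round(proportion_at_or_above(schools[school]), 2))

Nothing more sophisticated than this is needed to produce a league table from the data the clause requires.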
A further sign of the Ministry’s arrogant disdain is that they did not mention the other part of this testing agenda - the advertising of a Project Manager position, responsible for the development of an empirically-calibrated psychometric scale to ‘assist teacher judgments in relation to National Standards’. This will use a software tool (possibly internet-based), with the data being processed using Rasch Analysis.
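For readers unfamiliar with the term, Rasch Analysis fits a one-parameter logistic model in which every student is summarised by a single ‘ability’ number and every task by a single ‘difficulty’ number, both on the same scale. The sketch below (in Python, with hypothetical values chosen purely for illustration - nothing here comes from the Ministry’s project) shows the core formula, and why such an analysis necessarily reduces each child to a figure that can be ranked:

import math

def rasch_probability(ability, difficulty):
    # Rasch (one-parameter logistic) model: the probability that a student
    # with the given ability succeeds on an item of the given difficulty.
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical values only: abilities and difficulties sit on the same
# "logit" scale, so every student ends up as a single number that can be
# compared and ranked against every other student.
students = {"student_a": -0.5, "student_b": 0.8}
items = {"easy_item": -1.0, "medium_item": 0.0, "hard_item": 1.5}

for name, ability in students.items():
    predictions = {item: round(rasch_probability(ability, d), 2) for item, d in items.items()}
    print(name, predictions)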
I’ve outlined this in detail in this posting: National Testing - here we go.
This project is for the development of computer based testing that will result in numerical rankings. There is no other outcome. 
This is to be ready to go by January 2014, and we can expect it to be mandatory.
Trying to define ‘achievement’ through focussed standards and rubrics that pre-define the end results, through a narrowing down of what is deemed to be of value, is nonsensical. 
This is especially so in written language, which includes an artistic and expressive component. Here is a draft written by a ten-year-old boy, the introduction to a speech he is developing for a competition. No adults have been involved in this, although I have re-formatted it as a poem for this article.
I wonder what the future holds? 
Does it hold what I wish, like peace and harmony, 
Or does it hold what I do not wish like violence and war? 
I wonder what the future holds?  
Will the world be at peace 
Or will the world be at war? 
I wonder what the future holds? 
Who will rule the world? 
Will hunger still be a problem?
I wonder what the future holds?
How could any rubric be developed for this piece of writing? 

1 comment:

melulater said...

The idea of someone developing rubrics for all teachers to use with all children must be at odds with the original idea.

To me, rubrics are more helpful when developed jointly by teachers and students to target the learning that group is involved with, to enable the students to see what they need to achieve.

No national testing can ever truly reflect what I have taught in my class, nor the learning my students have made in spite of me.

No national testing can ever truly compare what has happened in my class with what has happened for the same age grouping of students in a school down the road.