The long-awaited National Standards data was recently released to the public by ‘Fairfax Media’ and the Government followed with its own set of school performance data.
Already comparisons are being made between schools.
While more information is generally a good thing, in this particular case, caution needs to be liberally applied.
Publication of this information is one of the first phases in a five-year plan by the Government, aiming to see improvements in student learning and achievement through the open dissemination of data.
The plan is to eventually have schools report their students’ performance against National Standards in a uniform manner.
This information will then be compiled into tables, allowing students to measure and chart their own performance; parents to assess their children’s school; educators to monitor the quality of the education provided by their school; and the general public to compare progress and achievement across the sector.
The goal of offering more information about the educational sector in general, and individual schools in particular, is praiseworthy.
The interest in how girls perform at school compared to boys, whether or not class size makes a difference and what impact poverty has on education shows that there is an appetite to know more about what impact different schools have on their students’ performance.
Moreover, information offers the chance for students who are struggling to be identified and helped, and it facilitates parents in making choices about what school will best fit their children’s educational needs.
But the data just released risks misinforming us.
This is because ‘Fairfax’ has not published a uniform set of data that allows progress and achievement to be compared across the sector, nor will the Government. Instead, whatever data individual schools submitted has been published in whatever way those schools chose.
For this year, schools have been given the latitude to report their data in accordance with their own judgement about whether their students have achieved the standards and about what they consider representative of those achievements.
Data from individual schools, then, will not be helpful to students who want to compare their achievements against National Standards, because their school’s interpretation of achievement will differ from that of other schools.
Parents will not be able to make informed choices about which school will best fit their children’s educational needs either.
In fact, no one will be able to fairly compare the difference that different schools have made to students’ learning.
With data published in such a format, we are more likely to see borne out the concerns voiced by the New Zealand Educational Institute (NZEI), academics, and several school principals: that some schools will be unfairly judged as losers.
Already some schools are being labelled as ‘below’ or ‘well below’ the standard because relatively few of their students reach it.
Yet these schools may be located in areas with greater social problems to deal with than other schools, and the improvement in their students over the course of the year may be great, even if those students did not quite meet the standards—but the data would not show these details.
There are better ways of gathering and publishing data on student achievement against National Standards.
Firstly, the data on student achievement and progress should be assessed and reported in a uniform way, for example through a single assessment such as the Assessment Tools for Teaching and Learning (‘asTTle’), which is benchmarked against the New Zealand Curriculum.
Secondly, assessment data is not meant to stand on its own.
School evaluations should take into account the improvement in a child’s learning between at least two points in time, where students end up after leaving school, a school’s financial performance, its ethos and the quality of its staff, among other things.
The Government is right to introduce standards and to begin collecting more information about student achievement.
But the difference between helpful data and misleading data is subtle and unfortunately the recent release is at the wrong end of the spectrum.
Dr Jane Silloway Smith and Steve Thomas are respectively Research Manager and Senior Researcher at Maxim Institute, Auckland, the source of the above article.