WHY RELIANCE ON NAPLAN AND PISA DATA PROVIDES A MISLEADING VIEW OF OUR EDUCATION SYSTEM’S SUCCESS

Education is our tool to shape the future. But what is the future we are preparing our young people for? What skills, what values, what capabilities and knowledge do we want young people to have, and how will this help them become citizens who will forge a stronger Australia? 

These are important questions – ones that help us ensure all young Australians become successful learners, confident and creative individuals, and active and informed citizens. 

Unfortunately, however, reliance on simplistic tools to assess our education system’s effectiveness has led some commentators to suggest that we don’t have a clear idea of what it is we want our kids to know, be like, and do.

However, this could not be further from the truth. 

The problem that education systems around the country face today is not that they lack clarity of purpose.

The problem is almost the opposite: we have a very clear idea of what the goals of education are, but we fail to measure ourselves holistically against them. 

Instead, we rely almost entirely on tools such as PISA (the OECD’s Programme for International Student Assessment) and NAPLAN (the National Assessment Program – Literacy and Numeracy, conducted for the Australian Government), which provide a narrow and misleading view of our education system’s progress and success.

We place inordinate emphasis on statistics gleaned from tests that seek to reduce a student’s educational journey to a number, and a school system to a line in a league table.

PISA borrows from neoliberal economics in seeking to quantify the value of a country’s education system. A reliance on quantitative measures renders PISA tests susceptible to manipulation and facilitates comparisons between countries that are fundamentally incomparable.

Take Finland. The NSW education system is regularly, and unfavourably, compared to the Finnish one – with Finland extolled as a global exemplar of scholastic excellence based on its impressive PISA results.

Yet, in Finland, more than 50% of high school students are deemed ‘ineligible’ to sit the test. Students from non-Finnish-speaking backgrounds, students studying trade-based subjects and those from Indigenous backgrounds are all restricted from sitting the exam, resulting in an artificial inflation of Finland’s PISA ranking.

It is interesting to note that South Korea and Japan – countries that have similar PISA results to Finland – have diametrically opposite systems of education (long school days, extensive homework regimes, intensive subject-specific teaching methods). What they do share are homogeneous cultures, conspicuously low immigration rates and very few remote schools.

The fact that countries with opposite systems of education perform similarly well in PISA tells us that these rankings are less a reflection of which specific educational policy settings work – and more a commentary on demographics and socio-economics. 

The idea behind PISA – that we can quantify our success in education via a test that doesn’t fight fair – says more about an obsession with statistics than it does about the relative merits of different education systems.

And then there is NAPLAN. When NAPLAN was introduced in 2008, it was designed as a straightforward diagnostic instrument to identify students who may be struggling so that help could be provided accordingly.

When used for this purpose, NAPLAN is useful. Indeed, the current review of NAPLAN – launched this year on the back of leadership from NSW – will hopefully ensure that NAPLAN transitions to an online, on-demand, formative assessment, bringing it back to its diagnostic roots.

But presently, NAPLAN is too often misused as a tool to quantify the performance of entire education systems. We now use NAPLAN to measure the performance of teachers, the performance of schools, and the performance of different school sectors.

This is not only problematic because it transforms NAPLAN into an instrument that it was never designed to be. It is also problematic because policy makers look at unfavourable NAPLAN data and make snap decisions about the overall health of our education system.

They assume that a decline in NAPLAN results means that our overarching vision as to the purpose of education is no longer contemporary.

The purpose of our education system is timeless.

What is not timeless – or useful – are measurement tools that provide an inaccurate view of our uniquely Australian education system. 

We need a national approach to measure progress across already agreed educational goals using sophisticated quantitative and qualitative approaches. This is the best and only way to ensure we are equipping our students with the capabilities we know they need.
