The need for robust measures of student progress
David de Carvalho
Executive Dean, Faculty of Education, Philosophy and Theology, University of Notre Dame Australia
Marist Schools Australia Biennial Conference
Mparntwe (Alice Springs) 29 July 2024
Many of you will have heard the saying “Only what gets measured gets managed.” It is often attributed to the famous management guru Peter Drucker, although its provenance is disputed.
But you have likely also heard another saying that serves, to some extent, as a counterpoint, namely: “Not everything that matters can be measured, and not everything that can be measured matters.”
In Australia, and around the world, the creative and dynamic tension between the ideas represented by these two sayings has been at the heart of many debates about education over the last quarter of a century, especially since PISA, NAPLAN and the ATAR have become more and more prominent as sources of data that inform those debates.
With PISA, for example, we have all become used to being told that the performance of Australian students in Maths, Science and Reading has declined over time. And with NAPLAN, the reporting of results at the school level on the MySchool website has arguably been the bane of many a principal and system leader, but also a boon to parents.
And then there is the issue of the central role played by the Australian Tertiary Admission Rank, or ATAR, and how that metric drives – for better or for worse – the educational choices that schools and students make as they head into their senior secondary years.
So today I want to use my time to offer some insights from my years as CEO of the NSW Education Standards Authority and of ACARA, in the hope that it will stimulate conversation about how Catholic educational institutions should approach the issue of educational measurement and reporting. And it is worth having in the back of our minds, as we think and talk about these issues, the question: “What would Marcellin Champagnat say?” In other words, how should Marist schools apply the principles of Marist education to the field of educational measurement and reporting?
The Quantification of Education
But to begin, I want to go back to those two opening quotes: “What gets measured gets managed” and “Not everything that matters can be measured, and not everything that can be measured matters.”
We all know that we live in a secular age, but it is also the age of quantification, which attempts to capture all aspects of reality and being in numbers, equations and algorithms. This is NOT a coincidence. Quantification is, to a certain extent, both a cause and an effect of secularisation.
Sir Francis Bacon, arguably the most articulate spokesperson for and representative of the so-called Enlightenment, the so-called Age of Reason, and the so-called Scientific Revolution, described his aspiration as that of torturing nature, putting her on the rack until she reveals her secrets. The instruments of that torture – in relation to the physical sciences – are the telescope, the microscope, the thermometer, the barometer, the spectrometer, the compass, the stop-watch, the ruler, and the scale. The tools of measurement and quantification.
In the social and human sciences, we have developed the survey, the Likert scale and the IQ test. We have incredibly sophisticated mechanisms for collecting all kinds of economic data and for subjecting that data to statistical analysis, such as randomised controlled trials and multiple linear regression, Rasch analysis, R-scores and Z-scores.
The very optimistic notion of inexorable progress – economic, social, intellectual, cultural – which was seen by the founding fathers of the Enlightenment as the inevitable result of separating questions of faith from questions of science, relies on quantification and measurement.
All these technological developments have enabled an incredible and measurable expansion in the stock of knowledge and an increased intensity of specialist expertise in every domain. But they have not guaranteed any advance in human wisdom. This is because the new knowledge has been cut off from the source of its meaning and purpose: to serve us on our journey towards the ultimate ground of truth, goodness and beauty that is God.
And as Charles Taylor has pointed out in “A Secular Age”, as secularisation has advanced – and here Taylor means by secularisation the process by which the social and cultural conditions conducive to faith have been eroded over time – the sense of meaning and purpose in life has been increasingly attenuated. We can read the greater focus on quantification as a response to uncertainty: an attempt to buffer our increasingly fragile sense of our place in the cosmos with the comfort provided by hard data and statistics.
Secularisation and quantification have also gone hand in hand with a process of hyper-individualisation and marketisation. Classical liberalism had a focus on the freedom of the individual within the context of social norms of morality, but its neo-liberal post-modern distortion has jettisoned concern about social cohesion and made a graven image of the utility-maximising individual who can and must create their own identity through what they choose to consume in the market-place and how they present themselves as a commodity in that market-place.
In the service of profit, over the last fifteen years, a small number of global big-tech firms have offered us a Faustian bargain, putting into our hands the so-called smart-phone that gives us the illusion of an expansion of individual power; in return, we have surrendered to its convenience and delivered into the hands of those same corporations vast amounts of data about ourselves as individuals. We are now all being digitised, we are all being measured, and we are all being managed.
At the same time, education has become increasingly dominated by human capital theory, whereby the primary purpose of education is seen as ensuring that the economy is well-served by competent workers.
The reason I have highlighted the interrelationship between quantification and secularisation, individualisation and marketisation is not to suggest that educational measurement is a bad thing – which would be contrary to my purpose – but simply to point out that, as people of faith, we need to be aware that the cultural milieu within which the quantification of education is taking place is the outcome of historical processes: the separation of faith and reason, and the emergence of a view of the human person as defined by their individual preferences and their usefulness to the economy, rather than by their membership of communities. This in turn means we ought to adopt a “hermeneutic of suspicion” towards ever-increasing efforts to quantify the outcomes of education in its various aspects: intellectual, physical, social, cultural, spiritual and moral.
So we need to be wary of drinking the neo-liberal Kool-Aid, and be very considered about how we engage with the whole agenda to quantify the process of education, always asking: does this or that initiative better serve our educational purpose?
We need to avoid the temptation represented by the allure of a spurious precision in relation to things that are inherently imprecise, matters best evaluated by judgment rather than measurement.
For example, I note the proliferation of conferences that include “The Science of Learning” in their title. We have the science of reading, the science of maths, and no doubt soon enough we will have the science of science, and if we are not careful, the science of history, the science of art, and the science of dance.
As educators we must make every effort to apply scientific knowledge about brain function, for example, to the art of teaching. But there is a risk that we mistake the science of how the brain works for the solution to all the pedagogical challenges presented to us by the actual human beings in our classrooms. In the quest for certain solutions, this is the temptation to which all the human sciences are vulnerable: that of wanting to predict how people will react under certain circumstances and in response to certain stimuli. Insofar as they desire this, the economists, sociologists and psychologists are tempted to reduce human beings to atoms and urges, to find the forces that move them, and to predict what they will do whether they choose to or not. Let teachers not be among them.
I am particularly wary of well-meaning efforts to expand the range of aspects of human development that are subjected to quantification and measurement, in projects such as Melbourne University’s New Metrics project. The motives cannot be faulted. The New Metrics team and many others have been concerned about the excessive focus, via the ATAR, on academic achievement at school, which tends to advantage those students who already have a range of social, economic and educational advantages. I think Learner Profiles, particularly those that use portfolio approaches to present information about the different aspects of students’ development, are very worthwhile, but I am very wary of Learner Profiles that purport to reductively quantify aspects of our lives that are inherently messy and mysterious.
Consistent with the view that what gets measured gets managed, the New Metrics project aims to put greater emphasis on domains of non-academic achievement so that schools and systems put more effort into them. Doing so requires a way to measure things like creativity and acting ethically. But not all things that matter can be measured, and efforts to quantify and measure them run a serious risk of reductively distorting what those things actually mean and how they should be developed.
The Catholic School’s commitment to excellence
So, having opened with a warning about the over-quantification of education, let me turn now to why some quantification is important.
An excellent Catholic school must first be an excellent school, held to account by the same academic standards as other schools.
Church documents, history and practices, supported by Canon Law, establish that a Catholic school is first and foremost characterised by excellence. Consistent with this defining characteristic, Catholic schools should implement ongoing processes and structures, and gather evidence, to ensure excellence in every aspect of their programs, life and activities.
So we need tools to capture how well we are doing in terms of academic standards. We have one such tool to hand in NAPLAN, and I’d like to focus there because of the importance to students – especially those from socially disadvantaged communities – of achieving mastery in the foundational domains of literacy and numeracy.
Why do we focus on the two general capabilities of literacy and numeracy when it comes to nationwide standardised testing?
Literacy and numeracy as the gateway to learning and social inclusion
Very simply, because they are fundamental to further learning and to being able to participate effectively in society and, in the words of the Nobel prize-winning economist Amartya Sen, to living lives we have reason to value. Sen pioneered the capability approach to development economics, and defined a capability as a “practical choice”: the ability to exercise agency in the pursuit of one’s goals. If we are not literate and numerate, the range of practical choices open to us is severely constrained.
What is concerning is that in the ongoing conversation about how education should evolve to take account of its changing context, some advocates for reform downplay the importance of literacy and numeracy. For example, in an article entitled “Towards Education 3.0: The Changing Goalposts for Education” Chris Goldspink & Robert Kay write about the brave new world of artificial intelligence and ask:
“What does this mean for education? Are the traditional goalposts of the 3Rs still appropriate and if not, what should they be? How can schooling prepare children to thrive in the non-routine cognitive roles it appears will make up the bulk of the future workforce? Of more concern is that if we need to change the education system to properly address these issues, we don’t have much time.”
While the football field analogy is useful, describing literacy and numeracy as the goalposts is wrong. They are not the goalposts, rather they are the gateway onto the field. Without literacy and numeracy you don’t get to play at all. You don’t get to be included in the game. You don’t get to make the practical choices you’d like to make about what to learn next, what career to follow. You are excluded. That is why we need to ensure all our children, particularly those who are most likely to be excluded from the life of society due to other circumstances, are empowered to make practical choices about their lives through education.
If students do not master literacy and numeracy, they will not be able to access those key learning areas that expose them to the richness of God’s creation in both its human and non-human manifestations. If you can’t read, not only will the exhilaration of reading novels by Austen, Dickens, Dostoyevsky, McKeon, Barnes or Garner, or the poems of Hopkins, Eliot, Donne, Plath or Oodgeroo Noonuccal, be denied you, but you may not be able to read potentially life-saving instructions on a forklift. If you can’t do basic arithmetic, you will not be able to progress to algebra, trigonometry or calculus, physics, chemistry or statistics, or to think mathematically, which is necessary when trying to solve so many problems, large and small.
So it is absolutely necessary that teachers have at their disposal a means of measuring the extent to which students have mastered the basic skills of spelling, grammar, reading, writing, addition, subtraction, multiplication and division.
The existence of such measures is a pre-condition for making data-informed decisions at the level of the family, the classroom, the school, the system, the state and the nation.
The existence of such measures is, therefore, a precondition for social justice and educational equity.
One of the objections to NAPLAN I got used to hearing at NESA and ACARA was that it is too stressful for students. But the extent of student stress about NAPLAN is mainly driven by the adults around them, and a certain minimal level of stress is conducive to performing well. And any stress students experience momentarily in the lead-up to the tests will pale into insignificance when compared to the lifelong stress of being socially and economically excluded due, in part, to poor literacy and numeracy skills.
Another objection heard repeatedly sounds clever but is in fact the opposite: “You don’t make the pig fatter by measuring it.” Well of course not, but if you don’t measure the pigs, you won’t know whether the pigs are getting fatter, and which ones need more nutrition.
A similar critique goes along these lines: “We’ve had NAPLAN for 15 years now and results haven’t improved, so we should get rid of it.” Which is essentially the same as saying: “We’ve had thermometers for hundreds of years, and global temperatures are still rising. Let’s get rid of thermometers.”
There is more than a little irony in the fact that arguments like these are often mounted by people who are advocates for improving the critical thinking skills of students.
NAPLAN: changes to measurement and reporting
So what is the purpose of NAPLAN? In 2019, Education Ministers agreed to the following set of words to clarify that issue:
As students progress through their school years, it is important to check how well they are learning the essential skills of reading, writing and numeracy.
NAPLAN assesses the literacy and numeracy skills that students are learning through the school curriculum and allows parents/carers to see how their child is progressing against national standards.
NAPLAN is just one aspect of a school’s assessment and reporting process. It does not replace ongoing assessments made by teachers about student performance, but it can provide teachers with additional information about students’ educational progress.
NAPLAN also provides schools, education authorities and governments with information about how education programs are working and whether young Australians are achieving important educational outcomes in literacy and numeracy.
Now, please note that nothing in that set of words describes NAPLAN as a “diagnostic” test. One of the more frustrating things ACARA had to put up with was certain ministers referring to NAPLAN as a diagnostic test, which gave the impression that it provided information about what students could and could not do in terms of their literacy and numeracy skills in sufficient detail to enable teachers to adjust the way they taught individual students.
But NAPLAN is only 40 multiple choice questions in four out of five domains, and a single written piece in the fifth domain. It does not have a sufficient number of questions to allow that kind of individual diagnosis of areas to be worked on at the student level. And this is why, when students get their Individual Student Reports, ACARA doesn’t provide a single score, but rather a dot that covers a range. It’s just not that precise. What NAPLAN does provide is a reliable picture of how a student is progressing in general terms in these core capabilities.
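The idea of a dot covering a range rather than a single point can be sketched in a few lines. This is purely illustrative: the scale score, standard error and the `score_band` helper below are invented for the sketch, not real NAPLAN parameters or ACARA methods.

```python
# Illustrative only: a score estimated from a short test carries
# measurement error, so a sensible report shows a band rather than
# a single number. All values here are hypothetical.

def score_band(estimate, standard_error, z=1.96):
    """Return a (low, high) range around a scale-score estimate."""
    margin = z * standard_error
    return (estimate - margin, estimate + margin)

low, high = score_band(estimate=512.0, standard_error=25.0)
print(f"Reported range: {low:.0f} to {high:.0f}")
```

The wider the band, the less precise the measurement; a short test simply cannot justify reporting a single number.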
When it comes to measuring the rate of learning growth, ACARA uses a scale for each NAPLAN domain that enables it to measure progress between Year 3 and Year 9 for each student. Prior to 2023, the measurement scale was based on the non-adaptive paper assessments and ran from zero to 1000. Parents received reports that showed where their child was on the scale. The scale was divided into ten arbitrarily determined bands, and the ISRs showed which band your child was in.
For 2023, the combination of the new scale – necessary to take account of adaptive testing – and the fact that the tests were moved from May to March led Ministers to authorise a new way of reporting to parents. Instead of showing which arbitrary band their child was in, the reports placed each child into one of four proficiency categories. The top category is “Exceeding” and the second is “Strong”; these two categories are above the proficient standard, while the third and fourth categories, “Developing” and “Needs Additional Support”, are below it. The proficient standard itself, instead of being set by a purely technical statistical process, was determined by panels of expert teachers who were asked, in a series of workshops, which questions could or could not be answered by students at the level expected at the time the test was to be taken.
The response to the change was mixed. On the one hand, parents valued labels that went some way to describing, in plain English, how their child was performing relative to the rest of their cohort. ACARA’s Aboriginal and Torres Strait Islander Advisory Group was particularly influential in the decision to call the bottom category “Needs Additional Support”. The original proposal was “Developing”, but they objected to that as hiding the truth of the situation in which many First Nations students find themselves: that they need additional support. I remember the discussion well: “We want truth-telling in history, and we want it in reporting as well. Don’t sugar-coat it.”
On the other hand, researchers and policy analysts mourned the loss of the bands because, they argued, it meant that even if students were progressing in their literacy and numeracy skills, they might be reported in the same category in successive reports. For example, if a student was in the Developing category in Year 3 and achieved two years of expected progress, they would be reported as Developing again in Year 5, whereas under the former system they could be seen to move from, say, Band 4 to Band 6.
Teachers were also divided, with many pointing out that the label “Strong”, chosen by ministers, could be misleading for students and parents where a student was only marginally above the proficient standard. I agree; “Sound” would have been better. ACARA had initially recommended much more straightforward labels – Well Above Proficient, Proficient, Below Proficient, Well Below Proficient – but in the end Ministers decided otherwise.
My hope would be that ACARA will move over time to electronic delivery of ISRs, whereby parents will be able to see where their child was on the overall scale on previous assessments, thus getting a visual representation of their child’s progress.
And this will confirm what we already know is a problem: different children progress at different rates, and unfortunately you are more likely to progress slowly if you come from a socio-educationally disadvantaged community.
In fact, the report of the expert panel reviewing the national school reform agreement shows that, between Year 3 and Year 9, the achievement gap between the most disadvantaged and the least disadvantaged students widens considerably. Educational inequity gets worse.
A strange idea: the untimed syllabus
So the key question becomes: what do we do with this information, when we identify students who are travelling at different rates towards common learning goals? Sure, the system is not performing as it should, but will proposed solutions improve things, or make them worse?
Here I want to go sideways for a few minutes to discuss an idea that has been promoted in some circles, namely the introduction of an “untimed syllabus”.
This was a recommendation of the 2021 Review of the NSW Curriculum, which argued that NSW should do away with a curriculum that is based on year-level achievement standards and replace it with one whereby students move at different paces through the curriculum depending on the extent of their mastery of the content.
Sounds perfectly sensible, right? So why did the NSW government, on the advice of the NSW Education Standards Authority, decide to exercise caution? Instead of adopting the recommendation, the NSW government decided to initiate trials of this approach; however, the arrival of COVID lowered this as a priority.
Shortly after the NSW Government announced its decision, Professor Masters published a piece entitled, “The Equity Myth” in which he argued strongly that untimed syllabuses would have more equitable outcomes than the current year-level, age-based syllabuses.? The core of the argument is set out in the following sentences from the article:
An equitable curriculum would recognise that students are at widely different points in their learning and have very different learning needs. Rather than being based on a ‘conveyor belt’ design that expects all students to progress at the same pace and teachers to deliver the same curriculum to everybody, the curriculum would be designed as a frame of reference for establishing the points individuals had reached in their learning and then ensuring every student was taught and challenged at their current level. Instead of being anchored to years of school, the levels of such a curriculum would define a common sequence and course of learning for all students while allowing for differences in individuals’ starting points and rates of progress…
I have great respect for Geoff Masters, but would like to suggest that the last sentence, while it sounds like a wonderful formulation, is in fact a recipe for disaster if it is to be given concrete expression in untimed syllabuses, and that far from closing learning gaps, it will widen them even further.
Masters continues:
Students who are not on track may require special support and intervention to get them on track as quickly as possible. This is especially important in the early years; some children currently begin school on trajectories that do not see them reach minimally acceptable standards by the time they leave school. And for some children there may be a need for ongoing support to keep them on track.
This suggests that students who start behind will need help to speed up and catch up. In other words, in theory they will have to go at a faster pace than the students who start ahead of them if they are to have any chance of finishing the race in a reasonable position. I absolutely agree, and if that is what happens, that would be great. But how is it to be squared with the two operating principles that
1. no student should be required to progress to the next syllabus until they have adequately mastered the content of the prior syllabus (as judged by their teacher); and
2. a student who has mastered the content of a syllabus (as judged by their teacher) should be able to progress to the next syllabus when ready.
I may well be lacking in imagination, but what I suspect is likely to happen in practice is that students who start behind will move at an even slower pace while the students who start in front move at a faster pace. The most likely outcome is that the achievement gap will widen, and equity, instead of being enhanced, will be even further undermined.
This is not to say that teachers shouldn’t be differentiating their instruction to some degree to take account of the different abilities in their classroom. But the untimed syllabus is not the answer. Other solutions, such as small-group tutoring, are likely to have more success at closing the equity gap by accelerating the learning of those who are behind.
Structural v Relational approaches to reform
In addressing the achievement gap, it is important to distinguish between two categories of reform: structural and relational. The proposed “untimed syllabus” reform falls into a category of initiative that can be described as a “structural” solution.
The more pervasive and popular solutions in education fall in the structural category. Structural solutions involve changes in school procedures, regulatory practices, and structural factors around which schooling is organized.
Examples of these reforms include extending the school day or the school year, reducing class size, alterations in the sequencing of the curriculum, adopting a new textbook series, constructing new buildings, changing from a junior high to a middle school format, or implementing block scheduling.
Structural solutions for school improvement are more likely to be enacted among those who pursue school reform, perhaps because with such solutions it is simply easier for the public to see that something has been done, that some changes have been made to the educational system. Something had to be done, and look, we’ve done something! But as we know, or at least I hope we do, change is not the same as improvement.
On the other hand, there are “relational” solutions, which appear to be more effective than structural ones. Relational solutions are concerned with implementing practices that directly transform the student-student and teacher-student relationships that support teaching and learning activities inside the classroom.
These interactions form the fundamental basis of everyday teaching and learning, and are manifested in cognitive, cultural, social and motivational processes within classrooms. Mounting evidence suggests that such relational factors maximise student learning and explain a sizable portion of the variation in student achievement. Such classroom dynamics engage students’ academic efforts, prior experiences, attitudes, knowledge bases, values, goals and agendas.
In other words, it is in the far less sexy, less glossy, less eye-catching work of improving day-to-day teaching that the real benefits are to be realised in terms of closing educational gaps. It’s the pedagogy that makes the difference. Which is the educational version of Peter Drucker’s famous dictum: “culture eats strategy for breakfast”.
Now, this critique of the idea of the untimed syllabus is not intended to suggest that there should be no room in our education system for different structural approaches – for organising schools differently, for organising the curriculum differently. We need the system to encourage diverse approaches that meet diverse educational needs. But at the end of the day, you need a common tool to measure the academic progress of students in those alternative settings; otherwise there is no way to establish whether the alternative approaches are, in fact, any better than traditional approaches to school organisation.
Change versus improvement
When I was a university student in the 1980s, every year elections were held for the Student Representative Council. One year, one student group had as its election slogan “Vote for Change!”. Another student group in the election had a different slogan: “Don’t vote for change - vote for Improvement!” So, I put this question: if you had to choose, what would you vote for? “Change” or “Improvement”? Surely you would vote for improvement, because change might be change for the worse.
The point of the story is very simple and straightforward, namely that when it comes to asking how to introduce effective change, one first has to ask: “How do we know the change will be an improvement on what we currently have?” To answer this question, you have to have some objective standard of measurement that allows you to say, on the basis of evidence, that a new model of schooling is better than the current model, since just because something is new does not mean it is better. So how does one measure the quality of schooling?
The dilemma for many advocates of a new model of schooling is that in many such imagined alternatives there is no place for the kind of standardised testing that would actually provide evidence that the new model is better, in terms of achieving educational equity, than the status quo. NAPLAN and HSC exam results are rejected as valid measures.
In the absence of such measures, reformers have two options. Either they attempt to persuade people by the emotional force of their arguments, describing the current system as “an outdated, industrial model” or as “not meeting the needs of students in the 21st century”, without offering any alternative objective standard by which the new school model can be assessed as better than the current one; or they embark on the challenging task of coming up with new educational metrics, which entails the various risks I spoke about earlier.
A far better approach would be to accept the existing metrics, seek to improve them where necessary, and have the confidence in one’s reform agenda to be able to produce academic results that provide evidence that the reformist alternative is superior to the existing model. So ideally, system authorities should encourage both technocratic and pedagogical innovation at the school level, and insist that the evidence for the success of alternative models of schooling continue to come through assessments such as NAPLAN, which allow fair comparison.
School level
So now I want to turn to how NAPLAN can be used or misused at the school level. One practice that I and many others find odious, and which has contributed to an anti-NAPLAN animus among many educators, is the habit of some schools – including, I am sad to say, some Catholic schools – of publicising their NAPLAN results as evidence of the quality of the school, without any reference to the socio-educational status of the school community.
I have noticed schools promoting their NAPLAN results on LinkedIn, purporting to be “outstanding” or “excellent”, according to NewsCorp league tables that use mean scores without providing any contextual information about the socio-educational profile of the school community.
One of the schools crowing about its results had lots of green and dark green when compared against all students. The school has an ICSEA score of well over 1000. Notice also that its NAPLAN participation rate is only 92%. It doesn’t take a rocket scientist to work out why 8% of students might be excused or discouraged from participation.
The real question is whether this level of achievement is, in the terms used on the My School website, well above, above, close to, below or well below what you would expect given the socio-educational profile of the student body.
I suggest that these schools might do well either to provide full disclosure by pointing out that their results are in fact nothing special once this context is taken into account, or, even better, to remove those posts, as they do more harm than good to our education system as a whole.
While taking the socio-educational status of the community into account is important, it is only the first step in a two-step analysis of results if we are to get close to what is really going on at the school level. The second step is to look at the progress students make between assessments. In primary schools, we need to compare where Year 5 students are with where they were in Year 3; in secondary schools, we need to compare where Year 9 students are with where they were in Year 7. In K–12 schools, it is also possible to monitor progress between Year 5 and Year 7.
In 2021, ACARA introduced changes to the MySchool website that allowed parents to see whether the progress being made was above or below what might be expected, given the school community’s level of socio-educational advantage. If more than 60% of the students in your school are making progress between assessments above what would be expected for students across the country from a similar socio-educational background, then your school is probably doing something intentional that is really making a difference, particularly if it is doing this consistently, year after year. (As an aside, one consequence of the resetting of the NAPLAN scale in 2023 is that this table won’t be available again until next year, 2025, when we will be able to see progress between 2023 and 2025 on the new scale.)
But while the new-look MySchool enables this information to be provided, the key question from a school’s perspective is what it tells you about what needs to be done.
Now earlier in my address I was critical of Geoff Masters’ advocacy of untimed syllabuses, but I am fully on board with him in relation to a 2017 article he wrote in which he lamented the lack of systematic analysis of NAPLAN data with a view to understanding what might be driving improvements.
NAPLAN makes it possible to explore the reading and numeracy skills of Australian students in some detail. Because NAPLAN assesses all students rather than samples of students, it is possible to study changes in performance at the level of individual schools as well as school systems – including schools and systems in which results have improved since 2008. This, in turn, introduces the possibility of identifying policies and practices that may have led to these improvements…
So in 2020, using the old scale, ACARA undertook an analysis of the previous six years’ data to identify schools around the country that had consistently demonstrated above-expected progress, taking socio-educational background into account. ACARA wrote to 30 schools: 15 primary and 15 secondary, being the top five schools with consistently high progress in any of the domains of reading, writing and numeracy. We asked them to identify what they thought they were doing intentionally, in terms of pedagogical practice, that might be producing these results. ACARA published their responses and summarised the common themes. The media release ACARA issued stated that the information provided by the schools indicates that some of them do use similar methods. Some of these include:
•	explicit teaching, including the use of clear learning intentions and success criteria for lessons
•	use of formative assessment to generate data on student progress
•	analysis of that data to inform teaching strategies, which can include differentiated teaching depending on the level of support students need
•	strong focus on sustained professional development, with more skilled teachers acting as instructional leaders and mentors
•	collaborative approaches to planning and teaching, which build collective efficacy among teaching staff.
The following are representative excerpts from three of the schools for numeracy, reading and writing respectively:
? "Learning activities are directed towards a specific learning intention and are structured around big ideas and essential questions."
? "Classroom reading instruction is heavily supported via a structured instructional leadership program. Teachers work shoulder to shoulder with instructional leaders focused on classroom pedagogy."
? "We analyse big and small data to inform strategic direction ... to see what students have done well and where learning gaps remain, and we align this with our teachers’ assessments ... This knowledge informs our professional learning agenda which is targeted to meet the gaps."
This highlights that measurement is only the start of the process. The data delivered by the measurement tool needs to be attended to carefully; intelligent questions need to be asked about what it might mean; and the possible interpretations need to be critically weighed in light of other information and evidence, including the formative assessments and observations that teachers conduct, in order to arrive at the most likely and credible interpretation. And once that knowledge is achieved, the question arises: how do we responsibly act upon what we have come to know? What changes to pedagogy or curriculum design do we need to consider?
School Improvement
So the question for everyone here at this conference is: what processes do you have in place to systematically analyse your assessment data, including NAPLAN data and your senior secondary results, so that you can consider whether you need to do things differently?
A good place to start is the National School Improvement Tool. To what extent are the following statements true of your school?
•	the school has made an effort to understand current student achievement levels, and how achievement levels have changed over time, including for students in social inclusion priority groups, students at risk of disengaging or who have disengaged from schooling, and students facing disadvantage…;
•	explicit targets for improvement in student achievement levels have been set and communicated to parents, staff and the wider school community;
•	school staff are united in their commitment to improve the quality of teaching and learning throughout the school and to address obstacles to schoolwide improvement;
•	the school communicates clearly that it expects all students to learn successfully and has high expectations for student attendance, engagement and outcomes;
•	the school has clearly articulated strategies for improving levels of student achievement and wellbeing; and
•	progress towards targets is monitored and initiatives and programs are systematically evaluated for their effectiveness in producing desired improvements in student learning and performance.
Conclusion: what would Marcellin say?
In conclusion, it is important that we fully acknowledge the importance of gathering suitable empirical data on the various metrics that are part of national and international discussions about school effectiveness. Catholic schools are not immune from the need to operate in this atmosphere of accountability and openness to public critique. As civic institutions, often with a religiously diverse intake, they should be subject to the same level of scrutiny as other forms of schooling, while playing a full part in wider, and often necessarily critical, discussions about the level and types of scrutiny to which schools should legitimately be subject.
Catholic schools are the visible expressions of the Catholic Church’s commitment to educate, form and inform. And a core expression of our Catholic commitment to education – the process of empowering students for their journey towards the God who is the source of all truth, beauty and goodness – is a commitment to excellence; and when it comes to academic learning, that means having recourse to valid and reliable methods of assessing student learning progress.
Such a commitment to academic excellence, and to collecting and reporting data on the success of our efforts, needs to be seen as integral to the mission of Catholic schools, not just as an external burden imposed by a secular system.
Nevertheless, this focus on academic excellence needs to be kept in perspective: the intellectual development and learning progress of students, while at the heart of the Catholic school’s mission, must be at the service of their personal growth in faith, hope and love.
To quote St Paul in his letter to the Corinthians:
If I have all the eloquence of men or of angels, but speak without love, I am simply a gong booming or a cymbal clashing. If I have the gift of prophecy, understanding all the mysteries there are, and if I have faith in all its fulness, to move mountains, but am without love, I am nothing at all.
And that is what I think St Marcellin would say.