The Implementation Gap - Assessment
(Header image produced with ChatGPT-4 + Adobe Express)


(For a summary, read the topic sentences of each paragraph - a 1-minute read.)

Preramble: Assessment is your most vital tool...a statement of absolutes that I shouldn't make. But I have your attention! This article will elucidate the gap that a lot of institutions encounter, as well as my take on this form of data collection. Oh, and I would like to cite my preramble from a previous article on assessment: 'knowledge parties' are entirely the intellectual property of Dennis Allen, a Maths teacher I had the absolute pleasure of learning from in my NQT year.

Speaking of previous articles, I want to acknowledge Gilbert Halcrow for further enlightening me on these topics through his comments. If you want to read considered expansions upon these topics, I would suggest diving into his work. In his consultancy, he focuses not only on the initiative (GenAI, LISC, or formative assessment) but also on its implementation in the school.

"...but that is not the problem - it is always implementation" - Gilbert Halcrow

The implementation gap in assessment refers to the discrepancy between the intended outcomes of educational assessments (such as accurately measuring student learning, informing instruction, and supporting educational decision-making) and the actual outcomes realised in practice. It is often perceived from the point of view of whole-school summative assessments, mainly due to the logistical difficulties of pulling these off. However, I have made this error as a teacher when applying formative assessment, even after self-proclaiming my authority on the topic. This gap can be a pernicious one: we can underestimate its impact right up until we have collected the data over the long term, which at times can be too late.

Implementation gaps in assessment can arise from misalignment with curricula, ineffective design, inadequate teacher training, resource constraints, poor use of data, stakeholder misunderstandings, technological challenges, and an overemphasis on high-stakes testing. The combination of these factors commonly results in a quagmire. More egregious errors include punitive measures built on bad data, stemming from a misunderstanding of the data itself: for example, using the mean to represent a data set that includes anomalies (see the sketch below). I've seen this lead to disproportionate responses to misbehaviour through policy changes that have affected entire cohorts of students, who ultimately lose liberties unjustly.
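To make that danger concrete, here is a minimal sketch (all numbers invented for illustration) of how one anomalous class can inflate a cohort mean while the median still describes the typical class:

```python
# Minimal sketch with invented numbers: one anomalous class inflates
# the mean, while the median still reflects the typical class.
from statistics import mean, median

# Hypothetical behaviour incidents per class across a cohort of eight
# classes; the final class had an anomalous spike.
incidents_per_class = [2, 3, 2, 4, 3, 2, 3, 25]

print(f"mean:   {mean(incidents_per_class):.1f}")    # 5.5, inflated by the outlier
print(f"median: {median(incidents_per_class):.1f}")  # 3.0, the typical class

# A blanket policy triggered by "average incidents above 5" would
# penalise the whole cohort for a pattern confined to one class.
```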

The implementation gap in assessment typically occurs in at least one of the following stages: design, administration, analysis, or follow-up. Depending on what data you gather, your implementation priority for each of these steps will vary. If you intend to report on a student's conceptual understanding of content, then your question design must be meticulous. However, if you want to choose the most appropriate task for a student to learn from, you can accept a larger error margin in question design to favour agility and expediency.

The gap usually resides where there is too much emphasis on the results of assessment as a one-dimensional measure. Not all assessments are true representations of understanding...actually, no assessments are. If we're honest, it's impossible even for standardised examinations to reveal the objective reality of a learner's ability. Since assessments and examinations typically get conflated, this can result in disdain for any measurement at all. But without measuring in this way, we remove a way of knowing from our perspective. Let's move from feet to metres, shall we?

So how do we ensure that both the assessments we administer and the way we utilise the data can actualise our intended outcomes? Below I address each factor that widens the gap.

  • Align Assessments to the action you want to take - Start at the end. If you wish to inform parents about the progress of their children, the ideal vehicle is curricula- and standards-aligned comparative data: a mother lode of work. If you're looking to decide how to differentiate your next lesson based on students' prior knowledge, then you can afford to trade accuracy for speed. In an ideal world, all assessments would be both fast and 100% accurate. They aren't. That's fine. Specialise your assessment to the action and try not to do everything. If your assessment is meant for snapshot data, then it doesn't have to be a feedback opportunity; you can run another assessment for that down the road.
  • Enhance Professional Development - Offer comprehensive training for educators on developing, administering, and interpreting assessments to ensure they effectively inform instruction. Furthermore, project planning! The mistake most of us make in education is underestimating the amount of time these projects need to be effective. Most of us know how to create and administer assessments; what we don't do is dedicate enough time to them, because we're trying to do everything else. If schools won't let up on workload, then you need to be brutal: if you're running an assessment that demands time in any of these domains, you must give something else up to gain that time, e.g. books don't get marked for a week or two. Ask yourself which action will have the largest impact, and accept that you aren't letting the students down by not doing it all. Otherwise, you're just burning out.
  • Leverage Technology Appropriately - Utilise technology to support assessment practices, ensuring access to and training on technological tools for both educators and students. This is where I feel most comfortable! Our assessment practices are a thousand times more impactful given modern technology, and the effect will only compound as educators get involved in AI development. Always look at how you adopt EdTech for assessment through the lens of assessment for learning, of learning, and as learning. Pen-and-paper assessment remains relevant!
  • Use Data Effectively - Establish systems and practices for the effective analysis and use of assessment data to inform instruction, support student learning, and make informed educational decisions. I'm going to take a risk and engage in a controversy I've avoided for a while: many educators don't understand statistics well enough to interpret data within their context. Their career journey does not require the skill until they reach a leadership role, and that is problematic. Since most people believe themselves capable of data analysis, they commit mistakes unknowingly. A prime example: not knowing the difference between the mean, median, and mode, or, even if you do, not knowing when each is appropriate for the context (see the sketch after this list). Data training is vital. In this day and age, the stigma around the inability to read data is almost equivalent to the inability to read. We need to bust this stigma and allow people to be comfortable admitting their gaps in statistical knowledge.


  • Engage All Stakeholders - Foster clear communication and collaboration among educators, administrators, students, and parents about the purposes and uses of assessments, to build a shared understanding of and support for assessment practices. Historical context has bred a logical fear of assessment. This needs to be addressed explicitly so that the mindset around data collection reduces anxiety responses in the collection itself: protecting the well-being of all stakeholders is the main motivation, and preventing anxiety from skewing the results (and therefore your decision-making) is another. We all know the student in class we'd hire in a heartbeat but who underperforms in examinations. That scenario breaks my heart every time!
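As promised above, here is a minimal sketch (all data invented for illustration) of why the choice of "average" matters: the mean suits roughly symmetric data, the median resists anomalies, and the mode is the only sensible summary for categorical responses:

```python
# A minimal sketch with invented data: choosing the right "average"
# for the context, using only Python's standard library.
from statistics import mean, median, mode

# Test scores: one anomalous result (say, a student who sat the paper
# while unwell) drags the mean down.
scores = [61, 64, 66, 68, 70, 71, 12]

# Categorical data: a "mean revision method" is meaningless; use the mode.
revision_methods = ["flashcards", "past papers", "flashcards",
                    "video", "flashcards", "past papers"]

print(f"mean score:   {mean(scores):.1f}")        # ~58.9, skewed by the 12
print(f"median score: {median(scores)}")          # 66, a fairer 'typical' score
print(f"modal method: {mode(revision_methods)}")  # 'flashcards'
```

None of this requires advanced training; knowing which of these three lines to trust in a given context is exactly the data literacy the bullet above argues for.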

Assessments are crucial but can be tricky. What's your main hurdle? Share your experience!

If you've enjoyed what you've read here, feel free to repost this article. It appears that's a favourite morsel of the algorithm. Let's feed it fat =)

References/Further Reading:

Gilbert Halcrow - GDH Learning

Dylan Wiliam - 1994-2014 - Dylan_Wiliams_website

Philip Jury - 2023 - Rethinking Assessment: A Teacher's Guide to Understanding Learning Through Data Collection - LinkedIn

John Garfield & Christine Franklin - 2011 - Assessment of Learning, for Learning, and as Learning in Statistics Education - Springer [Paywall]

