The Forgetting curve - Implications for assessment of vocational competency: Credit Transfer, Program duration and the definition of "Competency"
Hermann Ebbinghaus (see https://en.wikipedia.org/wiki/Hermann_Ebbinghaus) hypothesized the "Forgetting curve" in the 1890s. The forgetting curve describes the rate at which memory declines over time: it shows how information is lost when there is no attempt to retain it. A related concept is the strength of a memory, which refers to the durability of memory traces in the brain. The stronger the memory, the longer a person is able to recall it (https://en.wikipedia.org/wiki/Forgetting_curve).
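The forgetting curve is commonly approximated as exponential decay, R = e^(-t/S), where S is memory strength. A minimal sketch of that model (the strength values below are illustrative assumptions, not Ebbinghaus's data):

```python
import math

def retention(t_days: float, strength: float) -> float:
    """Ebbinghaus-style retention estimate: R = e^(-t/S).

    t_days: time elapsed since learning, in days.
    strength: memory strength S; larger values mean slower forgetting.
    """
    return math.exp(-t_days / strength)

# With no retrieval practice, recall of weakly encoded material collapses
# fast, while strongly encoded material decays slowly:
for day in (0, 1, 7, 30):
    print(f"day {day:2d}: weak S=2 -> {retention(day, 2):.2f}, "
          f"strong S=20 -> {retention(day, 20):.2f}")
```

The point of the model for what follows is the shape, not the exact numbers: without practice, retention falls steeply at first and never recovers on its own.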
"So what?" you say. The current significance of this finding has recently been championed by eminent scholars such as Robert Bjork, distinguished Professor of Psychology at the University of California, Los Angeles. His research focuses on human learning and memory and on the implications of the science of learning for instruction and training.
"So what?" you (still) say. Well the upshot of this scholarly research can be summarised as follows:
- We forget stuff we've learned if we don't use it
- When we learn things we learn them better if we "practice" them - This takes time (including time to "Forget")
- Performance (in particular, evidence of performance) is not evidence of learning. And, two of my own:
- Retention declines more quickly in the presence of competing demands on memory
- We don't know what we've forgotten and we don't know that we've forgotten it
Still saying "So What"? Well, in the information-age relevant vocational skills are increasingly reliant on information (not strength or physical technique). This isn't just the ability to recall but the efficient ability to identify what information needs to be sourced and the distinct ability to source that information "just in time".
This impacts Australia's VET system in three important ways: (1) Approach to Credit Transfer, (2) Learning Program duration and (3) "Competence" as it is judged in the field. I discuss these below.
(1) Approach to Credit Transfer.
We all know Units of Competence are "mapped"; thanks to TGA, older units can be readily compared to new units to identify differences. Sometimes a superseded unit is deemed "equivalent" to a current unit. In this case, and subject to adequate evidence of continuing practice and professional development, the holder of a superseded unit can be (and indeed is required to be) deemed competent for the equivalent, current unit.
What's wrong with this picture? Well (1) above for starters. We forget stuff, particularly if we don't use it. While two units might be equivalent to each other this says nothing about an individual. For that we have to investigate the features of the evidence for their continuing practice - Does it provide quality evidence of currency and sufficiency? Probably not. We get certified, we get a job, we use parts of our learning in the job (i.e. we practice it). We don't need to recall other parts of our training for the job, so we forget those bits.
The phrase "I've forgotten more than you know" is an epithet used to denote expertise. Today however, it can mean an experienced VET practitioner has forgotten important information through misuse.
If VET certification through CT is to be effective it needs to commit assessors to identifying knowledge gaps that arise through "forgetting". Evidence of PD and "staying up to date with advances" is not enough. And yet right now, ASQA is directing RTOs to award units through CT even if the candidate has demonstrably forgotten the bulk of the knowledge associated with the award of a prior (equivalent) unit. ASQA is committing RTOs (many of which know better) to ignoring manifest incompetence.
(2) Learning Program Duration
The "Forgetting curve" has implications for program duration.
With "User choice" rational candidates choose the lowest "cost" RTO. Cost can be rendered in terms of time and money and effort. Understandably, and to attract students, enterprising RTOs offer the shortest course with the least cost and the least effort.
In response, ASQA has mandated minimum durations for learning programs - essentially, and quite rightly, to dissuade dodgy operators. At least for the TAE, BSB, CUA, FSK and FNS training packages that I am engaged with, this has led to a spate of what I think of as "front-loaded", short-duration courses: such courses engage the learner in intensive face-to-face workshops and subsequently require the learner to complete self-study activities over the remaining duration of the program. They aren't all bad - the best compel a learner through a self-reflective journey that genuinely delivers learning. Others are just a way to get around the system.
In my experience so far, such programs uniformly disregard the Forgetting curve. Learners need time to forget before they can meaningfully engage in retrieval practice (through low-stakes testing). This relates to the second point above: when we learn things we learn them better if we "practice" them, and this takes time (including time to "forget"). If a program awards certification too soon, or fails to compel retrieval practice, then it cannot validly claim learning, let alone competence. Candidates may have produced an impressive portfolio of evidence, but without "forgetting time" it is not, and cannot be, evidence of learning.
(3) "Competence" as it is judged in the field
The third area of impact for the Forgetting curve is our very notion of Competence as it is judged by assessors every day. "Competence", defined roughly as the ability to apply skills and knowledge to perform consistently to the standard required in the workplace, including the ability to transfer this ability to other contexts, doesn't explicitly demand persistence. It doesn't actually demand learning.
This would just be a bit of semantic reverie if it didn't have so much impact. The issue is this: how competent is someone if you observe their performance today but have no confidence in their ability to perform an equivalent task tomorrow? In other words, are we, as assessors, being asked to judge a person's future competence?
This is a "can of worms". Of course we can't judge future competence; anything could happen. And yet, if a certificate is to have anything more than historical significance it is saying something about the likely competence of the individual at some point in the future.
As an aside, these days my beloved Mother passes her time in an aged-care facility. She doesn't have Alzheimer's, but the years are taking their toll on her cognition. When presented with a novel piece of information she understands it. An hour later, unless it's important to her, it is gone for good. We don't know how long someone will retain information, but we can make educated guesses.
Two months ago I exited a CBD Melbourne campus (where I was teaching) on a lunch break only to be confronted by an aggressive vagrant who was menacing my students and demanding money with the threat of violence. I was perplexed; I'm a teacher and an appalling but enthusiastic violinist - I have no self (or other) defence skills. Still, I adopted my best David Carradine pose (recalled from the Kung Fu television series of the early 1970s), stared him down and saw him off the property. If he had seen through my bluff I have no idea what might have happened next.
I mention the above because it is a signal lesson: a performance can convey the "appearance" of competence while providing no information about persistence. Suppose there were three of us confronting the aggressor. One had no experience, but had just completed a one-hour session in how to adopt the Kung Fu defensive position. The second is me: no training, but a long memory of watching too many Kung Fu shows in the seventies. The third is an actual Kung Fu master with years of disciplined training who just happened to be walking by.
To an assessor without knowledge of the three individuals, the performance is identical, and so the assessment must be identical. With knowledge, the assessor can conclude that the Kung Fu master will repeat the behaviour (because he is an expert) and that I will also repeat the behaviour (having demonstrated recall of a 40-year-old memory). But the assessor can't ascribe persistence to the freshly trained learner, who may or may not repeat the behaviour depending on their recall.
To judge the value of performance as evidence of competency, assessors must make an informed judgement of persistence. Persistence cannot be judged without judging recall, and therefore without verifying sufficient "forgetting time".
Concluding remarks
If Australian VET is to be effective it needs to take account of the "Forgetting curve". The knowledge has been around for over 100 years and has recently been confirmed by reputable scholars using modern scientific methods. Credit Transfer protocols need to account for how much a candidate has forgotten before automatically deeming competency. Learning program duration needs to reflect the need for "forgetting time" and the retesting of knowledge prior to deeming competency. And, finally, our concept of competency must address the notion of persistence.
Sean Kelly July 2018
Adult Vocational Education and Training Professional
2 years ago
This is where RCC and VOC come into play; is this still taught in the TAE Cert IV? RCC: Recognition of Current Competency. VOC: Verification of Competency. RCC and VOC only apply if a learner has previously completed the requirements for a unit of competency or module and is now required to be reassessed to ensure that competence is being maintained...
TAE Trainer/Curriculum Writer/Tutor
6 years ago
Kavin's comment also referred to his RTO and students resisting his suggestion for holistic assessment at 8-10 week intervals. Periodic holistic assessment using automated algorithms (e.g. Leitner) for follow-up adaptive teaching is standard practice for people facing high-stakes "capstone" assessments. Training organisations and students alike are less resistant to periodic re-testing. Indeed, in preparation for independent, professional aptitude testing for executive recruitment firms, and in their desire to win the job/promotion, students self-direct into practice testing. Yes, they are "training for the test", but if it is a valid test then that can't be a problem. Despite the bumbling of SSOs and ASQA, high-stakes testing hasn't been eliminated. It has just relocated: executive recruiters use it as a filtering measure to discriminate between AQF-certified applicants.
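The Leitner scheduling mentioned above can be sketched as follows. This is a minimal illustration of the idea, not any particular RTO's system; the number of boxes and the review intervals are assumed values:

```python
# Leitner-box spaced retrieval (illustrative sketch). A correct answer
# promotes an item to a box that is reviewed less often; a wrong answer
# demotes it back to box 0, the most frequently reviewed box.
REVIEW_INTERVALS = [1, 3, 7, 14, 30]  # days between reviews for boxes 0..4

def update_box(box: int, answered_correctly: bool) -> int:
    """Return the item's new box after one retrieval attempt."""
    if answered_correctly:
        return min(box + 1, len(REVIEW_INTERVALS) - 1)
    return 0  # forgotten: back to the most frequent box

def next_review_in_days(box: int) -> int:
    return REVIEW_INTERVALS[box]

# A learner who keeps answering correctly is retested at growing intervals,
# which is exactly the "forgetting time" the article argues for:
box = 0
schedule = []
for _ in range(4):
    box = update_box(box, answered_correctly=True)
    schedule.append(next_review_in_days(box))
print(schedule)  # -> [3, 7, 14, 30]
```

The design point is that the schedule adapts per item: material a learner has genuinely retained drifts toward long intervals, while forgotten material snaps back to frequent review.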
TAE Trainer/Curriculum Writer/Tutor
6 years ago
Good grief! This insightful observation by Kavin Windsor (below) reflects the parlous ignorance of some RTO Compliance managers who value evidence over learning. Incorporating scheduled, adaptive re-testing is not "over-assessing"; it is well proven as a critical component of learning. Sure, in vocational learning it can be substituted with workplace practice (verified through well-formed competency logs), but either way the regular requirement to recall information without priming/prompting is what causes a person to learn something. Referred to in the scholarly literature as "retrieval practice" or "recall practice", it has repeatedly been shown that learning programs without such practice produce manifestly (and predictably) inferior results - not just in the quality of learning outcomes, but in the resource efficiency of delivering the program: it costs more to train a person without re-testing than with it. Automated re-testing is probably one of the cheapest and most efficient means of transferring knowledge from short-term to long-term memory.
A VET professional looking to assist student Trainer / Assessors navigate their TAE path.
6 years ago
The forgetting is exacerbated when the training is compartmentalised, each UoC being taught and assessed separately. When I was teaching basic electronics it was not uncommon for students to have absolutely no recollection of something that I had taught them 6 to 8 weeks prior. In that situation, rather than an assessment regime that tests each UoC individually, what is needed is a regime that clusters UoCs as well as retests performance and knowledge criteria from previous/related UoCs. When I attempted to have a 'Holistic Assessment' (covering some 8-10 weeks of learning) implemented, the hue and cry was prolific, from both students and some parts of management: "How can you justify RETESTING if they have already passed the UoC?" My response: because the underlying knowledge and skills are fundamental to the students' future trade and we need to make sure that they ARE competent and have retained the knowledge and skills. The students were not retested on the entire UoC, merely on important parts that were effectively prerequisites for following UoCs. Once the students got over their initial shock, the results and the retention of information and skills improved, until the management of the RTO decided that the test was taking too long and cut it to the point of uselessness.