Re-thinking the university curriculum
Simon Eassom
Chief Executive Officer (CEO) at Food Frontier - Chief Futurist at the Australian Council of Professions
Stephen Parker's recent post (referring to his article in the Campus Morning Mail) reprises Joseph Aoun's 2017 book, 'Robot-Proof: Higher Education in the Age of Artificial Intelligence'. It's prompted me to re-visit Aoun's suggestions about the university curriculum in the light of where we are now, five years on, and just how much universities should be preparing graduates for the skills economy.
The challenges, of course, centre around an already over-crowded curriculum, the double-edged sword of accreditation and its demands on the technical components of professional courses, the historical and political antecedents of the idea of the university and its relation to industry, and (dare I say it) the ability of professorial discipline-grounded academics to construct and teach a curriculum that develops such skills in the context of the degree subject. However, in an era where universities have, almost exclusively, marketed themselves as employability factories, the question remains, as Parker poses it, of whether it's time for a radical overhaul of the university curriculum. I outline here what I believe those essential skills are before prefacing an adjunct to Aoun's book with a further series of articles, "Future Proof: Higher Education in the Age of the Skills Economy".
The first thing that struck me in hindsight was just how long it has taken the university sector to begin to formally recognise competencies (via micro-credentials, for example) related to non-discipline-based skills that require both a "soft skill" (such as presenting to an audience) and a "hard skill" (such as using an office application like MS PowerPoint to facilitate that presentation), even though such competencies became essential graduate and employability skills twenty years ago. The backdrop to this, let us not forget, is the constant gripe from industry that universities are turning out graduates who are not "job-ready".
I began questioning just how serious academics were about "teaching" these skills in their courses when I became a 'Teacher Fellow' in the UK in the late 90s and early 00s. Course tutors were begrudgingly persuaded to assess these skills but were happy to let them be acquired by a process more akin to osmosis than any overt attempt to educate students in their development. After all, most academics as "presenters" were wedded to "death by OHP" and were struggling to up-skill themselves in laptop use and to do anything more with PowerPoint than creating text-dense slides and endless bullet lists. Swipes and transitions were the hallmark of a real "power user". Embedding video was just too much: you were clearly a techno-nerd and not a real academic if you had all that time on your hands to learn such skills. None of this denies that all universities have always contained some brilliant academics who embrace technology in their teaching delivery and represent the cutting edge of teaching and learning leadership. But technical skills aside, how can you expect the most un-creative lecturer to teach "creativity"? Half-hearted attempts in the new modularised degrees of the 90s to embed such skills in separate, out-of-context courses were truly woeful and did little more than allow sleep-deprived socially active students the chance to catch a few Zzzzzz in class.
So where are we now, twenty years on? Moreover, where should we be, and why is the subject of skill acquisition back on the agenda with greater urgency than previously? But first: the skills. What are these employability skills constantly being talked about? They are often alluded to with parenthetical examples: "blah blah blah, need for soft skills (such as creativity . . . etc)" without ever being fully listed or actually reasoned or justified.
Although I don't like the binary of "soft" and "hard", I will apologetically use it here in keeping with most representations. The "soft skills" are fairly easy to list as they are the most talked about as (somehow) being the most important, the most neglected, the most "human" in an era of AI, and the most "worthy" of a graduate. I will compartmentalise these as dispositional, relational, and developmental, knowing that neuro-physiologists and cognitive psychologists will wince at such stipulative distinctions. The distinctions are not mutually exclusive but, at the very least, provide me with a heuristic device for further investigation.
What "soft skills" should we try to teach and to whom?
First, my typology. I think of dispositional skills as those that reflect personal dispositions and capabilities that, whilst capable of being developed to a greater or lesser degree, are more often exemplified by an individual's preferences and innate capabilities. The obvious big three, the skills deemed to be somehow lacking and in demand, relate to creativity, reasoning and argumentation, and problem solving. I call them dispositional because they are somewhat dependent on not only the individual's natural talents but also a certain attitude, disposition, or "way of being" in the world. I am a very keen photographer and I am prepared to say I'm very competent technically. But I struggle with composition. My brilliant photography friends interrogate my depth of technical knowledge whilst I look at them flabbergasted that they take better pictures than me without worrying about the relationship of focal length to perspective. Some people just see the world differently and "create" accordingly! We desperately need people with these dispositional skills, but should we set out to teach them to people who don't already have them in abundance and, if not to everybody, then to whom? The late Sir Ken Robinson had a great deal to say on the failure of our school systems to "breed" creativity. I'll return to this after I've outlined the rest of my typology.
The relational skills are those that are largely latent and reflect certain dispositions but emerge in context more strongly in some than in others. However, they can be "taught". I include in this batch skills such as: collaboration, autonomy, resilience, and adaptability. I'll steal (and misuse) an Aristotelian metaphor here: we enter the palace of reason through the courtyard of habit and tradition. In this context, what I mean is that these relational skills are underpinned by numerous other dispositional skills and can only be nurtured by putting people into situations where they have to collaborate or adapt to succeed. We have to promote the habits and develop the traditions. There are numerous emotional intelligences and dispositions that lead us to more readily collaborate or to adapt to circumstances, and some of us have them already and some don't. What every student knows is that being given an assessment task with the dreaded requirement of a "group project" with a shared grade does not engender collaboration and is often counterproductive. How should we teach students to recognise the value of collaboration and how to build knowledge networks that complement their skills and knowledge and further the achievement of their goals? On the other hand, creating situations where a task cannot be completed without working together is the bread-and-butter of outdoor education management training courses. But, even here, are we teaching people how to collaborate or giving them understanding of the need for collaboration? Such distinctions need careful consideration if we aim to "teach" these skills.
Finally, the developmental skills are those that only fully flourish with experience and exposure. They can be taught but require corresponding developments of EQ and other dispositions and relational skills: leadership is a skill that some people will never master and it's questionable that we should assume it to be a consequence of graduation from a university course, even a post-graduate course aimed at developing leadership. The philosopher Gilbert Ryle would talk about "propositional knowledge" and "procedural knowledge". Universities have traditionally eschewed procedural knowledge (outside of laboratory skills needed for research and professional accreditation) and elevated "propositional knowledge" as more degree-worthy. The others in this developmental batch include: time management, entrepreneurship, and a strategic mindset. They are all essential and perhaps more easily put through the course development sausage machine with more tangible aims and objectives and measurable learning outcomes but just how much we expect everybody to develop them is another matter.
The reason for this somewhat drawn-out exposition is to align the need for development of these "soft skills" with tangible skillsets aligned to job roles that not only create greater employability for the graduate but also shape the nature of work in the era of AI and help organisations understand where their capability requirements lie. Perhaps it's more important to identify the learner's dispositions and attitudes and, when combined with their academic interests, curate a programme of study that prepares them in the best way, not just for employment but also for fulfilling and motivating work. I compartmentalise these dispositions and "soft skill" capabilities into job role clusters that represent the Big 5.
Rather than thinking about professions and jobs, we should focus on helping learners to understand where they best fit in such a typology. Somebody who fits into the consultant-communicator paradigm might well suit becoming a teacher, yet many enter the teaching profession for all sorts of reasons and lack capabilities as consultant-communicators. Similarly, guardian-governors are inclined towards policy and process management. They might struggle with adaptability and collaborative skills but they work well as auditors and lawyers. Yet so many thinkers and problem solvers dive into careers in law, beguiled by the plethora of law-based drama series on television and the cut and thrust of prosecution and defence, and are driven to despair by the banality of a great deal of the guardian-governor requirements of everyday legal work.
All of this leads to the possibility of a curriculum for the 21st century university that genuinely offers fully personalised learning, skills development, and career navigation where the programme content is potentially multi-disciplinary and completely cuts across department and faculty boundaries. Those boundaries might still exist for research and expertise development but they are fast becoming inhibitors in the transformation of the university as a genuine employability developer.
The "hard skills" future workers need
The "hard skills" are the more elusive as "skills" embedded in the university curriculum: they are more technical and prone to a shorter shelf-life, hence why universities have such difficulty with validating them and placing them. Nevertheless, they are now as essential as using word processing, spreadsheets, and presentation graphics have been for the past twenty years. Their requirement is driven by the exponential growth in digital technologies that is changing the nature of business and the future of work. In fact, despite the emphasis on "soft skills", I want to suggest that these "hard skills" are becoming even more important for employability.
What does this future of work look like? It can be represented in many ways but generally, for the purposes here, it can be categorised around four levers that are shaping the transition of organisations inexorably from "people businesses" enabled by technology to "technology businesses" enabled by people. What I mean by that, more specifically, is that there has been a shift in process engineering: from people-centric ways of doing things, where technology (that is, hardware and, more importantly, software) enables more efficient delivery of those processes, to technology-derived processes that humans augment with their thinking and management to create new products and services. As an example of the former, think of recruitment for a job: the employer creates a job profile; posts an advert on a jobs board with an agency; waits for applications in the form of CVs and letters; the HR department sifts through applications; a short-list is drawn up; first-stage interviews are held; two candidates withdraw and one isn't suitable; the department re-writes the specifications and off we go again. That's before getting to the processes of reference checking and credential verification. Within that human-centric process, technology has been increasingly utilised to make the process more efficient and less demanding of human time: the job role is automatically distributed via multiple channels; applicants submit applications in digital format; software does an initial sift of candidates by searching for keywords and expressions; emails are automatically generated; and so forth.
A technology-derived process is one where the function of the software is used in the design of the solution in the first place and the processes flow from that design, minimising human engagement in the "supply chain" and utilising it where it's most effective: in the analysis and assessment of short-listed candidates. Such a system might, for example, trawl e-learning and e-skill records (all securely locked and verifiable in a blockchain skills vault); identify potential candidates with the requisite skills and experience; verify credentials hosted on a badging platform; filter for other relevant variables (such as current location or years in post); send out a notification; and engage a human only upon the receipt of "expressions of interest". The question, then, in all the doom-laden discussions of how the robots will take all the jobs, is where humans will add value in this process chain. The question isn't a difficult one to answer and the opportunities aren't limited. For the moment, let's just stick with the question of how this changes the work-readiness of graduates and their skill requirements. Traditionally, employers hired graduates because of an assumption of soft skills representing the ability to learn, the ability to communicate effectively, the ability to work autonomously, the ability to think independently, and so forth. You learnt the human processes "on the job" and you moved through a career framework based on performing the tasks, managing people who performed the tasks, and managing the business that had responsibility for the tasks, in a hierarchy tightly aligned to job role specificity. CEOs started life at the coal face and knew the business from the ground up.
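To make the contrast concrete, here is a minimal sketch of what such a technology-derived screening pipeline might look like. It is illustrative only: the data structures, field names, and functions are hypothetical, and a real system would sit on top of verifiable-credential platforms, badging APIs, and messaging services rather than in-memory lists.

```python
from dataclasses import dataclass

@dataclass
class CandidateRecord:
    name: str
    skills: set[str]            # drawn from e-learning / e-skill records
    credentials_verified: bool  # e.g. checked against a badging platform
    location: str
    years_in_post: int

def screen(candidates, required_skills, location, min_years):
    """Machine-only sift: no human sees the pool until this stage is complete."""
    return [
        c for c in candidates
        if required_skills <= c.skills      # has every requisite skill
        and c.credentials_verified          # credentials check out
        and c.location == location          # other relevant variables
        and c.years_in_post >= min_years
    ]

def notify(candidate):
    """Stand-in for an automated notification (email, platform message, etc.)."""
    print(f"Notifying {candidate.name} of a matching role")

if __name__ == "__main__":
    pool = [
        CandidateRecord("Alice", {"data visualisation", "predictive analytics"}, True, "Melbourne", 3),
        CandidateRecord("Bob", {"data visualisation"}, False, "Sydney", 5),
    ]
    for candidate in screen(pool, {"data visualisation", "predictive analytics"}, "Melbourne", 2):
        notify(candidate)
    # Only candidates who subsequently register an expression of interest
    # would be passed on to a human assessor.
```

The point is not the code but where the human sits: in this pipeline, human judgement enters only after expressions of interest arrive, in the analysis and assessment of short-listed candidates, rather than in the clerical sifting that dominates the people-centric version.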
As we move towards the proliferation of jobs in those new-era businesses, the four levers that pull on the talent pool and lead to a wealth of jobs but no suitable applicants (despite their university degrees) channel the skills requirements around: the digital workplace; the requirement for lifelong learning (owing to the rapidly reducing half-life of technical skills and the constant need for up-skilling and re-skilling); the demand for hybrid skills in addition to other knowledge and expertise in a "skills economy"; and the increasing demand for accountability in business, whether in terms of financial transparency, green credentials, ethical conduct, or quality of service. Job role specificity is no longer the guarantee of lifelong employability. What now matters is capability development and the worker's flexibility in applying that capability to multiple scenarios and requirements in a never-ending work environment of short and medium-length projects and "gigs".
The list of in-demand hybrid skills that support the new ways of working could be almost limitless but it's worth highlighting a number that are becoming as essential now as basic office software skills have become over the last twenty years. The first set I'll highlight are "production skills" related to communications; not because they are the most important or incontrovertibly essential but because they illustrate the point I'm trying to make and provoke consideration of how much universities should be responsible for developing them.
It will become increasingly important for employees to be competent in skills of audio production, video production, multi-channel marketing, user-experience design, animation, and web development, to name just a few. These skills are not limited to the marketing function or product development. They will increasingly become part of the toolset of skills needed across all parts of an organisation. Just as letters gave way to emails, to instant messaging, to Zoom and Teams meetings, and to productivity applications that manage workflow and activities, employees will use video and audio more fluently with the emergence and growth of avatars and holograms: my hologram will pop up on your desktop taking you through my communication to you with pull-down charts, animations, images, and so on. In the not-so-distant future my avatar will automatically develop and orchestrate my communications as the ultimate form of personal assistant. Meanwhile, in the here and now, our consumers (including students) learn, communicate, and engage via Instagram and TikTok, generating their own reels and messaging.
So, you might argue, my last example illustrates how these "skills" are part of the learning ecosystem that young people are already immersed in: it doesn't need to be formalised in the university curriculum. That's not the university's job. However, being skilled in web development doesn't mean being a programmer fluent in HTML (although such specialist skills are needed as well); it means having a core understanding of the way web design has evolved to become central to the predominant means of communication, whether by a sales organisation, a media outlet, a blogger, a government department, or a university. No longer is it the case that a line-of-business employee prepares a content brief to hand over to marketing to communicate to the organisation's customers. The whole process is now a collaborative one whereby the subject matter experts understand as much about multi-channel communications as the marketing professional does about the SME's line of business. The SME constructs the information knowing how the hierarchy of web pages needs to work and how pages need to link to repositories. The technology shapes the process and employees need to be fluent in the technology to work constructively in such an organisation as humans augmenting technology-derived processes. Producing an Instagram reel isn't the challenge; it's understanding the application of that form of social media within a business, internally and externally.
Less contentious are those skills related to technology platforms and the rise of the era of cognitive computing. All employees need to understand cloud computing, the basics of AI, data-related skills such as data visualisation and interpretation, API (Application Programming Interface) use and management, machine learning, predictive analytics, and natural language processing. Further "hard" skills revolve around people management, business analysis, sales management, competitive strategy development, project management, agile development, and industrial design; all of which are "teachable" and inextricably intertwined with the underlying technologies supporting them, built on a hierarchy of pre-requisite skills such as adept applications of business intelligence and analytical reasoning, for example.
How does this impact universities?
The question still remains as to where and how these skill requirements are addressed. Universities can't claim to be the best channel to professional employment yet abdicate responsibility for such bric-a-brac as skills: "It's not our business" won't wash. Few actually say such a thing, although the research-intensive universities might like to, even as their researchers incorporate these skills more and more into their own processes. Nevertheless, even where universities are dipping their toes into the world of micro-credentials, stackable or composable digital certificates, and recognition of skill development within courses that justifies further credentialing as an add-on to the testamur, it's still peripheral and carefully siloed to ring-fence the core product: the full degree programme.
Furthermore, it's no use punting responsibility around between schools, TAFEs, universities, and corporations with in-house professional development in an attempt to demarcate delivery by aligning these skills to competency frameworks and AQF levels. It's everybody's responsibility, everywhere. The real challenge is around working in partnership to develop a credentialing ecosystem that simplifies the learner's journey without increasing demands on their time, their pockets, and their learning preferences. What that ecosystem needs to contain, at the very least, is a credit recognition and transfer system robust and credible enough to enable academic institutions to work with other providers and accrediting authorities to develop employable graduates, where the awarding institution allows credit towards the qualification and doesn't see skill development as an adjunct or an extra-curricular private good that the student opts into. Only then will vibrant and productive learning-integrated-work become meaningful and part of a genuine lifelong learning journey.
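As a way of picturing what "credit recognition and transfer" might mean in practice, here is a minimal sketch of a learner record that stacks micro-credentials from multiple providers and converts them into credit towards a degree. Everything in it (the class names, the credit-point arithmetic, the AQF threshold) is a hypothetical illustration, not a proposal for an actual framework.

```python
from dataclasses import dataclass, field

@dataclass
class MicroCredential:
    title: str
    issuer: str          # university, TAFE, professional body, or employer
    credit_points: int   # expressed against a shared credit framework
    aqf_level: int       # qualification level the credential is pegged to

@dataclass
class LearnerRecord:
    name: str
    credentials: list[MicroCredential] = field(default_factory=list)

    def credit_towards(self, degree_requirement: int, min_aqf_level: int) -> int:
        """Recognised credit this learner can transfer towards a degree."""
        earned = sum(
            c.credit_points for c in self.credentials
            if c.aqf_level >= min_aqf_level
        )
        return min(earned, degree_requirement)

# A learner stacks credentials from several providers; the awarding
# institution counts them as credit rather than an extra-curricular add-on.
learner = LearnerRecord("Sam", [
    MicroCredential("Data visualisation", "TAFE provider", 10, 5),
    MicroCredential("Cloud fundamentals", "Industry academy", 5, 5),
])
print(learner.credit_towards(degree_requirement=240, min_aqf_level=5))  # prints 15
```

The hard part, of course, is not the arithmetic but the trust: agreeing whose credit points count, at what level, and on what evidence, which is precisely the partnership work described above.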
Enabling this is not a delivery problem; it's not a design problem; it's not an epistemic problem. It's a business problem. It goes to the heart of how universities operate and manage their cost structures. It might ultimately mean giving up on procuring the entirety of a student's four years of fees. It might mean creating meaningful exit points that actually allow students to exit before completion of the whole degree and re-enter again in the future without penalty. Moreover, it might mean enabling students to enter with no intention of completing a full four-year degree. How this is managed without eviscerating the university's core business is the guts of the challenge (pun intended). Solving the problem requires serious consideration of three pillars of the university: its architecture, the curriculum, and the kinds of people it employs and how it manages and rewards them. I will address these in my series of articles on "Future-proofing Higher Education".