The technology learning curve has never been steeper - or more important to climb
Picture credit: Hu Chen on Unsplash

The first time I tried to learn about computing was difficult, because I didn’t actually have access to a computer. I just knew that this was a field I was interested in. My school had a Commodore PET which I wasn’t allowed to touch, but there was something eerie and fascinating about the glowing letters on its screen. I wanted to be able to make those letters do what I wanted them to do.

So, I got a book from the library and tried to read it. Fortunately, it was a book on BASIC, which most microcomputers ran at that time. I wasn’t able to run programs, but I was able to learn the commands and the syntax, and to write some simple programs on paper. When I finally got my hands on my own computer, it helped me make a quick start.

And, years later, when I got my first paid programming job, writing COBOL on an ICL mainframe, I was able to transfer many of the skills I had learnt on my home computer into the work environment. Even though I learnt quickly that the GOTO I had relied on for many of my amateur programs was frowned upon, most of the other logic constructs worked and were useful. And that has been my experience throughout my programming life: these days I mostly write Python, but many of the basic constructs in Python do the same job as, well, BASIC.
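As an illustration of how directly those constructs transfer (the function name and the numbers here are mine, invented for the example, not taken from any program mentioned above), here is a BASIC-style conditional and counted loop written in modern Python:

```python
# The same logic, roughly four decades apart.
# BASIC (circa 1980):
#   10 IF N > 10 THEN PRINT "BIG" ELSE PRINT "SMALL"
#   20 FOR I = 1 TO 3: PRINT I: NEXT I
# Python today:

def classify(n: int) -> str:
    """Label n using the same IF/THEN/ELSE structure a BASIC program would."""
    if n > 10:
        return "BIG"
    else:
        return "SMALL"

# BASIC's FOR I = 1 TO 3 ... NEXT I maps onto Python's for loop over a range.
for i in range(1, 4):
    print(i, classify(i * 5))
```

The syntax has changed, and GOTO is gone, but the underlying control flow is the same one that BASIC textbook taught.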

However, even though some of these fundamental constructs have been remarkably persistent, that does not mean that I have not had to learn new things. One of the exciting things about the field of computing is that new fundamentals keep turning up. Just as I can remember that first book on programming, I can remember the first time that I encountered a GUI and realised that I was going to have to learn new things about user experience. I can remember the first time I encountered things called classes and objects. I can remember the first time I worked on a network more complicated than a terminal with a hard-wired connection to the mainframe. And I can remember the first time that network extended beyond the boundaries of the company to an emerging public network called the Internet.

I believe three things about the nature of these fundamental computing concepts and their impact in our work.

First, I believe that, if you work in computing, and you want to stay engaged with the technology you are building, you will have to keep learning. Of course, you could ignore this advice. It is possible to specialise deeply in a narrow set of skills which don’t change much. It’s also possible to focus on managing teams and resources rather than technology. However, I think that both of those choices make life less interesting, and that it’s increasingly hard to manage and lead well, or to make sense of a deep technical specialism, without paying attention to the way the world is changing around you.

Second, I believe that, if you work in computing in 2022, the fundamentals of computing are changing with greater speed, depth and significance than ever before. I was reminded of this when I conducted two recent experiments in learning in public, about quantum computing and generative AI. In both cases, while I found that lots of coverage of these topics was either overly cynical or overly credulous, both of these developments have profound substance and impact, are difficult to get to grips with, and are currently the province of deep specialists. Grasping either of them takes rather more attention than the ‘IF…THEN…ELSE’ of my BASIC textbook.

Third, I believe that these changes in the fundamentals of computing are additive rather than substitutional. While we are trying to figure out how we might, for example, integrate generative AI into our call centres, we are still writing code with the same logical constructs that I learnt from that BASIC textbook. We may even be maintaining code of the same age as that textbook. We sometimes talk about ‘full stack engineers’: to be a true full stack technologist you need to have grasped and internalised concepts accumulated over decades. I doubt that it is possible to fit all of those concepts in one head.

This prospect may seem too daunting to contemplate. It makes computing seem hard to approach for newcomers, and hard even for experienced people to stay current in. Yet, if we don’t keep learning, then not only do we miss the benefits and advantages of new technologies, we become increasingly distant from the world we are building out of those technologies. If we want our technology to be safe, reliable, secure and sustainable as well as fast, responsive and valuable, we need to take the trouble to understand it.

I believe that the best approach we can take to this learning curve is to recognise it, embrace it, and configure ourselves to help technology teams climb it. In my experience, technology functions often allocate small amounts of their budgets to training, and underspend even that which they do allocate: it sometimes seems too hard to find the time amongst the pressures of project work, backlogs, schedules and production incidents. Yet underinvesting in training can be as fatal as underinvesting in technical currency: they are both forms of debt, and they are both hard to pay down, especially if we let them build up.

It’s time to build the continuous refresh of skills into our technology budgets, just as we build in the continuous refresh of infrastructure, and to recognise that, just as our technology sometimes needs to undergo major upgrades, so does our understanding. If you are a technologist reading this, you may be thinking about your own learning plan. If you are a manager or leader reading this, you should be thinking about whether you are creating the environment for your teams to keep on learning.

(Views in this article are my own.)

Nicola Hobson-Langley

Transformative data and analytics leader with an obsession to create personalised experiences for our customers using data, analytics, AI and ML, and to nurture talent

2y

Well said - I do think that learning every day has to be a fundamental part of a working day in the data and analytics world but also beyond as the pace of change and new information continues to accelerate. It’s a useful habit to foster! Great article

To be a Full Stack Technologist in years ahead, will we still need decades of experience or might this be accelerated / supplemented by generative AI? How much more quickly could you have learned how to code many years ago if you had an AI Chatbot buddy to mark your homework and show you how things could be better like debugging a line of troublesome code?

Oliver Cronk

Technology Director | Sustainable Innovation & Architecture | Speaker, Podcaster and Facilitator | MBCS CITP

2y

The growth mindset (and the right environment to support that mindset) is critical in tech. Nice article David.

Mike Samra

Chief Technology Officer

2y

Couldn’t agree more or say it better.
