The landscape of Artificial Intelligence has moved at breakneck speed in the last three years. My first exposure to practical Artificial Intelligence was deeply sobering and kept me away from the space for a number of years, which is a shame because I’d always grown up fascinated by “Thinking Machines”. My family moved around a lot and I was a geeky, awkward kid, never really able to connect with a strong social group, so instead I became an introvert, spending my evenings learning 68000 assembly and C so I could code games and maybe, just maybe, even get a girlfriend.

That last sentence sounded odd, right? Yes, definitely. I was constantly made aware of my awkwardness and reminded by my peers that my chances of finding a girlfriend were slim to impossible. So what did being able to code computer games have to do with getting a girlfriend? Well… absolutely nothing… kind of. Being a kid growing up in the 80s, I remember one night watching a movie called Weird Science. I probably wouldn't recommend it these days, but in context, at the time it was pretty awesome. I mean, being a geeky child who had by that point totally accepted he’d never be one of the cool kids, it gave me hope that one day I’d have a companion. Not a human companion, I admit, but a companion nonetheless, and if it had a likeness to Kelly Le Brock and was totally OK with taking showers with me while I was fully clothed, then I was absolutely down for an AI future.

Skip forward a decade and I’m sitting in a virtually empty auditorium at the university I’m attending, listening to the introductory talk for the first module in ML and Artificial Intelligence for my Comp Sci degree. There’s only a handful of students in this class, most of them looking bored beyond belief… except me, of course. I’m here to build artificially intelligent machines and, yes, a bionic Kelly Le Brock... After the first ten minutes of the talk, however, I was feeling sad and deflated. Why? Because the talk was really a declaration of defeat, as the lecturer proceeded to explain how slow the pace of progress had been in this space and how, at best, by the end of the module I'd only be able to produce statistical party tricks. Machine Learning, and by association Artificial Intelligence, was on a snail's-pace trajectory and would remain that way for another decade, until the ideas of Deep Learning gained traction. Those two decades growing up watching sci-fi shows were just that: science fiction.

It’s now 2024, just a short time since GPT-3.5 was released. Generative AI has captured the imaginations of industry leaders, and anyone involved in thought work is most probably using a set of AI tools, or risks being outclassed by those who are augmented. The world cannot keep relying on the previously slow progress towards AGI as a safety net. We no longer have decades between dramatic paradigm shifts, and this shift will be the most dramatic and painful we’ve seen, accelerating the convergence of roles and jobs at first. Thought workers augmented with AI tools and assistants will be expected to undertake more end-to-end tasks and will no longer find safety in their specialised silos. For example, within the next five years there will no longer be a distinction between backend and frontend software engineers. Heck, even what we currently call “full stack” engineers won’t look anything like they do now.

Historically, the high cost of hiring specialised talent and their teams made it challenging for startups to scale without substantial financial backing. However, recent shifts in the technology job market have altered this landscape. Widespread layoffs, as companies strive to balance project resources with profitability in the wake of diminished venture capital funding, have led to an influx of skilled professionals into the job market at lower cost, at least for now. But I predict that in the next five years it will become increasingly difficult to find work in these roles unless you're in a large organisation where the pace of innovation is slower, in which case you may have eight years at best in your current career.

In the short term, we'll see AI tools enhance workers and their outputs. They've already become common in many people's workflows. They're a tool for now, requiring human intervention to manage the output. However, as the gap between human and AI ability at the same job shrinks, the job market will be in for a huge shock. Unlike previous innovations in history, which took years or decades to unfold, giving people time to transition to other ways of supporting themselves, this time will be different. One day you'll go to sleep, and in the morning you'll read in the news that AI can not only do your job but do it better. It doesn't need to sleep, it doesn't get tired, it doesn't have relationship issues. You'll try to retrain in a related field, but six months later you'll be in the same position, and this time you'll be in a state of panic as the realisation hits that there's nowhere to hide.

If you're sitting there complacent, thinking that this is all science fiction and will never happen, you only need to compare the deltas of AI's progress in the last 4 years with the last 40. You're probably also thinking that because AI currently can't write an award-winning novel or understand the intricacies of human psychology, people are safe. But ask yourself for a moment: are you able to write an award-winning novel? Do you fully understand human psychology yourself? How many people really are capable of that? How many places have you worked where you've seen someone who is just good enough remain gainfully employed? AI only needs to be as good as the average person doing the same job. It just needs to be good enough to take over yours.

As someone who still considers himself a software engineer, I'm both excited and scared. I'm in engineering leadership, but I can already see the writing on the wall for my job in the next few years. The few roles that will be protected for at least the next 10 to 20 years will be those in healthcare that require physical work, such as examinations, tending to patients, surgery and so on. In fact, anything that requires work in meatspace at scale will see a slower pace of replacement until robots become viable and cost-effective. When that happens, though, as a civilisation we'll need to start asking ourselves some hard and scary questions about how we support a large population who are unable to find employment and thus support themselves.