Are you moving fast in the AI-First race?
Nitin Gaur
AI~Digital Engineering & Solutions Lead | GenAI Consulting, Architecture, Thought Leadership
Context Setting
AI adoption moved beyond chatbots in 2024, proving it is not just hype. With advances in computing power and large-scale data storage, foundation models (FMs) emerged and democratized AI for everyone. FMs have completely changed the way data scientists traditionally approached machine learning. Rather than developing models from scratch, you now use FMs as a starting point to build ML-powered applications via APIs.
Today’s FMs, such as the large language models (LLMs) behind Anthropic’s Claude and Meta’s Llama, and the text-to-image model Stable Diffusion from Stability AI, can perform a range of tasks out of the box across multiple domains: writing blog posts, generating images, solving math problems, writing software code, engaging in dialogue, and answering questions based on a document.
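To make the "build on an FM via an API" idea concrete, here is a minimal sketch of the last task in that list, answering a question grounded in a document. It assumes the Anthropic Python SDK (`anthropic` package) and an `ANTHROPIC_API_KEY` in the environment; the model id and prompt are illustrative, not a recommendation.

```python
# Minimal sketch: use a hosted foundation model via its API instead of
# training a model from scratch. Assumes `pip install anthropic` and an
# ANTHROPIC_API_KEY environment variable; the model id is illustrative.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def answer_question(document: str, question: str) -> str:
    """Answer a question using only the supplied document as context."""
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model id
        max_tokens=512,
        messages=[{
            "role": "user",
            "content": f"Using only this document:\n\n{document}\n\nAnswer: {question}",
        }],
    )
    return response.content[0].text

print(answer_question("Our refund window is 30 days.", "How long is the refund window?"))
```

The point is less the specific vendor than the pattern: the heavy lifting sits behind an API call, and the application work shifts to prompting, grounding, and integration.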
This FM/LLM revolution has already triggered a massive industry-level pivot. Tech investors the world over are chasing promising AI startups, existing software vendors and cloud platforms are adding AI capabilities, and IT services companies are AI-enabling their service offerings and upskilling their workforce. Among these three categories, the shift is most painful for IT services and consulting companies, simply because increased use of AI-led automation is likely to reduce the human workforce over the next few years. So they not only need to protect their existing business but also need to find new areas of revenue growth by leveraging AI capabilities such as generating new content and creating autonomous AI agents.
Instead of listing GenAI use cases (there are tons available already), this article focuses on the roles and opportunities for IT services companies. Please note, it is not a strategy model, as a company's strategy differs based on various factors such as size. A large service provider may decide to invest in building industry-specific small language models (SLMs) or fine-tuning LLMs, whereas a small company may partner with a startup for its go-to-market offering.
Breaking down the IT roles
Regardless of size, all IT services companies are in a race to build and provide “AI solutions” to their existing customers and to win new ones. Without going into the definition of what an “AI solution” is, it is imperative for a company to envision the evolution of the roles that exist today. These roles can easily be classified into three broad categories:
Users (who re-imagine their work with AI) – Their work is likely to depend on how good they are at using AI assistants and tools.
Builders (who build AI solutions and apps) – They are expected not just to build but also to keep their skills updated on a weekly basis.
Operators (who operationalize AI solutions) – They are going to be super busy as more AI models get embedded into systems.
In the diagram below, I have listed key roles within each of these broad categories along with their area of opportunity and required skills. To re-emphasize, the table is limited to IT roles (not counting the broader spectrum such as Sales and HR) and does not include every possible IT role in a company.
You will notice the roles stay more or less the same; what is changing is the AI-related function they are aligned to. For example, the data engineer role has been established for several years, but the demand will now be more about managing unstructured data (text, images, video) and extracting structured data from it. Software engineers will still be doing system design and creating applications and workflows, but these will now be LLM-driven apps and agentic workflows that integrate with multiple LLMs for various purposes.
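As one hedged illustration of the data engineer shift described above, here is a minimal sketch of turning free text into validated structured fields by prompting an LLM for strict JSON. It reuses the Anthropic SDK from the earlier sketch; the field names and model id are illustrative assumptions.

```python
# Sketch: unstructured text -> structured record, one common "new" data
# engineering task. Field names and model id are illustrative only.
import json
import anthropic

client = anthropic.Anthropic()

EXTRACTION_PROMPT = (
    "Extract vendor, invoice_date (YYYY-MM-DD) and total_amount from the text "
    "below. Reply with a single JSON object and nothing else.\n\n{text}"
)

def extract_invoice_fields(raw_text: str) -> dict:
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model id
        max_tokens=256,
        messages=[{"role": "user", "content": EXTRACTION_PROMPT.format(text=raw_text)}],
    )
    record = json.loads(response.content[0].text)  # fails loudly if the model strays from JSON
    # Validate before the record ever reaches a downstream table.
    missing = {"vendor", "invoice_date", "total_amount"} - record.keys()
    if missing:
        raise ValueError(f"LLM output is missing fields: {missing}")
    return record
```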
Getting your hands dirty
In the AI-First race, simply knowing is not enough. Without clarity on the next step or a call to action, some companies will become laggards and always find themselves playing catch-up.
In the diagram below, I have listed key roles along with resources to explore and the target output. For example, a product/program manager should experiment with AI-driven low-code/no-code tools such as V0, Bolt.new, Replit, and Builder.ai to find the best fit and quickly demonstrate an idea prototype to stakeholders. Similarly, a solution architect should stitch the AI components into a platform for other builders to experiment with, while keeping model inference costs down.
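On the inference-cost point, here is a minimal sketch of two techniques an architect might combine: caching repeated prompts and routing short, simple requests to a cheaper model. This is my own illustrative example, not a prescribed pattern; the model names and the routing heuristic are placeholder assumptions.

```python
# Sketch: keep inference cost down with prompt caching plus naive model routing.
# Model ids are illustrative placeholders, not recommendations.
from functools import lru_cache
import anthropic

client = anthropic.Anthropic()

CHEAP_MODEL = "claude-3-5-haiku-latest"    # illustrative small/cheap model
STRONG_MODEL = "claude-3-5-sonnet-latest"  # illustrative larger model

def pick_model(prompt: str) -> str:
    # Naive heuristic: long or code-heavy prompts go to the stronger model.
    return STRONG_MODEL if len(prompt) > 800 or "```" in prompt else CHEAP_MODEL

@lru_cache(maxsize=1024)  # identical prompts are answered once, then served from cache
def complete(prompt: str) -> str:
    response = client.messages.create(
        model=pick_model(prompt),
        max_tokens=512,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[0].text
```

In a real platform the routing rule would likely be driven by task type or evaluation results rather than prompt length, but even this crude version shows where the cost levers sit.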
Here I have covered only half of the roles. These are prioritized over the others because, in my findings, they are the top candidates to produce tangible assets to showcase, and I did not want to hold up sharing this article. I intend to complete the table for the remaining roles early next year (expect it to be one of the most exciting years ever in tech).
Conclusion
Let's face reality: the technology in the (Gen)AI world won't be the real differentiator. Given the level and speed of innovation, whatever technology you develop will become obsolete within a few months, and you will have to keep rebuilding it. The differentiation will lie in how the entire company embraces an AI mindset, rather than in having another cool tool that is better than the competitor's. The key is to push employees, guide them toward AI advancements in their roles, and help them get better at leveraging AI.
Disclaimer: The views presented are mine and not of my employer or any other org.
Thank you for reading. I hope you liked it. Please share your comments and questions.
Wishing you a year filled with achievements and new opportunities. HNY 2025!