It stacks up: How FSI is thinking about Generative AI

Over the past 18 months I’ve often been asked about the potential for generative AI and large language models in financial services. More specifically: where do I see the most promising areas for adoption and a genuine path to quantifiable business value; and how are organisations approaching the challenge of building AI into their businesses as an enterprise capability?

Many companies across Australia and New Zealand have made significant investments in AI, whether by investing in technology, building AI policies, implementing new systems, or hiring and developing the right skills. As you would expect, banks and financial institutions, motivated by a clear path to economic value, are at the front of this trend.

Whilst customer-facing use cases like chatbots and personalised digital interactions may draw the most attention, I believe the biggest, and most immediate, opportunities lie in fundamentally changing how the business operates. Whether it’s automating mundane tasks, enhancing human decision-making, or removing process steps, it is increasingly about putting useful AI tools in the hands of employees to accelerate and improve process outcomes.

This approach also reflects a broader shift in how financial organisations are thinking about developing AI. Instead of relying solely on pre-built solutions, financial services institutions are looking for new ways to innovate with these technologies, such as creating platform ecosystems where employees are empowered to experiment with AI tools that address their unique day-to-day needs. While guardrails are a given, this model prioritises agility, context, and ownership, whilst managing risk.

Building AI Foundations

Top of mind for many financial services organisations today is the modernisation of core systems and data platforms. With the rise of AI, a modern core and data platform is crucial: it is the engine that enables organisations to drive reinvention and reduce the time to market for new digital products. The financial services industry has evolved upon decades-old platforms that are monolithic, complex, and highly interdependent, while data collection has grown exponentially across multiple siloed platforms with poor compatibility, interoperability and usability.

The good news is that modernising these core systems and data platforms is becoming much easier. In an ironic flywheel, it’s the technology that is making modernisation a critical dependency that is also enabling much faster and easier modernisation through tools like Amazon Q Developer Transform. Companies leveraging these tools to accelerate their modernisation efforts are already seeing benefits like improved agility and the ability to innovate faster, with a lower cost of experimentation and of failure, resulting in the faster launch of new and better capabilities and products.

Building upon their core systems and data platforms, financial services institutions have also started to make significant progress establishing AI foundations. While much of the attention has been on model selection, true AI foundations are much broader than which Large Language Model (LLM) you will apply. Establishing clear AI policies and guardrails, with consideration of risk appetite and ethical use, is crucial. Increasingly, institutions are developing an AI Constitution: a set of guidelines that establishes sustainable principles for how AI will be built and deployed across an organisation.

With these guardrails in place, organisations can empower their builders to develop new capabilities with more freedom, enabling greater speed and innovation without introducing levels of risk that go beyond the organisation’s tolerance. Other foundational elements include a consistent approach to use-case prioritisation, as well as clear metrics to learn from experiments, measure business value and, most importantly, track how savings in time and efficiency are being reinvested, for example in new products, features and clearing backlogs.

Simplifying Generative AI Platforms

As enterprises grapple with the complexities of leveraging generative AI, many are gravitating towards platforms that help them apply their own policies and principles and simplify access to a broad range of tools. This shift reflects a desire to simplify AI infrastructure and streamline operations so that more employees have access, taking AI well beyond the domain of data scientists and developers. To break down the challenge of imagining integrated generative AI across a potentially vast enterprise ecosystem, we think about it as a stack with three levels.

As the graphic below shows, the Bottom layer provides the compute platform, along with the tools for building and training LLMs and other Foundation Models (FMs). The Middle layer provides the tools you need to build generative AI applications using these models in a safe, secure and scalable way. The Top layer includes pre-built applications that use LLMs and other FMs to deliver value such as writing and debugging code, generating content, deriving insights and taking action. This becomes easier to understand when we look at each layer in turn with a real-world example of how we have helped major Australian financial institutions deliver value at each level.

AI Infrastructure – CommBank

At the Bottom layer is the foundational infrastructure from AWS to support the development and scaling of AI applications. Developing and scaling (training and inference) AI models is incredibly compute-intensive: with model queries running across billions of parameters, the compute required can quickly escalate. At this level of the stack, AWS is committed to providing the most efficient technology to build, train, run and host AI applications.

A great example is CommBank, which recently activated its state-of-the-art AI Factory in collaboration with AWS. The AI Factory will use cutting-edge capabilities from AWS, including Amazon SageMaker, a managed Machine Learning (ML) service, and Amazon EC2 P5 instances for deep learning and high-performance computing applications. It will enable employees to safely test and develop AI solutions by providing the compute power required to fine-tune and train LLMs at increased speed and with greater efficiency, delivering both cost and carbon savings.

Applications and tools – PEXA, nib

As we continue to learn the best ways to build AI capabilities, we are seeing common trends, such as customers’ desire to access and test against a broad range of models to ensure they are selecting the right one, with the right performance and cost base for their use case. We are also seeing the desire to establish clear guardrails and tools that simplify access and use. The middle layer of our generative AI stack is built upon Amazon Bedrock, which provides access to the broadest range of LLMs through a single API, as well as the tools a builder needs, such as automated reasoning checks for hallucinations, guardrails, model distillation and many more. This is the layer at which we are truly democratising access to these powerful models and helping customers build capability. Two good examples at this level are PEXA Group and nib Group.

PEXA, the ASX-listed digital property exchange and property insights business, launched a permission-aware AI platform in September designed to enhance secure data use, empower employees, and drive productivity improvements. AWS was part of the collaboration behind PEXA’s personalised generative AI assistant, which is designed to understand each employee’s access to enterprise data in real time and provide role-specific assistance. PEXA’s solution ensures robust data security at both the model and data levels, retrieving live information from within the organisation. The platform runs on Amazon Bedrock and Claude models, applying a permissioned Retrieval Augmented Generation (RAG) model that controls what data is passed to the AI in small, secure segments. Security is a fundamental requirement for PEXA and many others operating in the highly regulated financial services sector.
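To make the permissioned RAG idea concrete, here is a minimal, purely illustrative sketch: documents carry an access-control tag, and retrieval filters on the requesting user’s roles before any text reaches the model. All names, roles and documents below are hypothetical; PEXA’s actual implementation details are not public.

```python
from dataclasses import dataclass, field


@dataclass
class Document:
    text: str
    allowed_roles: set = field(default_factory=set)  # roles permitted to see this text


def retrieve(query: str, docs: list, user_roles: set, k: int = 3) -> list:
    """Return up to k documents the user may see, ranked by naive keyword overlap."""
    # Permission filter happens BEFORE ranking: invisible documents never compete.
    visible = [d for d in docs if d.allowed_roles & user_roles]
    terms = set(query.lower().split())
    scored = sorted(visible, key=lambda d: -len(terms & set(d.text.lower().split())))
    return scored[:k]


def build_prompt(query: str, context_docs: list) -> str:
    """Only permitted snippets are ever interpolated into the model prompt."""
    context = "\n".join(f"- {d.text}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


docs = [
    Document("Settlement volumes rose 4% in Q2.", {"analyst", "executive"}),
    Document("Draft M&A target list.", {"executive"}),
]
# An analyst's query never sees the executive-only document.
prompt = build_prompt("settlement volumes", retrieve("settlement volumes", docs, {"analyst"}))
```

The key design point is that access control is enforced at retrieval time, outside the model: the LLM can only ever leak what the retriever was permitted to hand it.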

nib Group, an insurance provider, recently completed the migration of 95% of its applications to AWS. As a result, nib has expanded and enhanced its digital offerings, using AI services like Amazon SageMaker, Amazon Lex and, more recently, Amazon Bedrock to develop "nibby", an AI chatbot. Since its introduction, nibby has handled millions of customer inquiries. This again highlights the major modernisation so many financial services organisations are going through. By leveraging AWS's robust cloud platform, nib has achieved significant cost savings and unlocked new possibilities for advanced data analytics and AI-driven initiatives.
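Mechanically, Bedrock’s "single API" means the same request shape works across model providers, so comparing models is largely a matter of swapping the model ID. The sketch below shows this using the Converse API via boto3; the model IDs and region are illustrative examples, not recommendations, and the live call requires AWS credentials with Bedrock access.

```python
def build_converse_request(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble keyword arguments for bedrock-runtime's converse() call.

    The same structure works for any Bedrock-hosted model; only modelId changes.
    """
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }


def ask(model_id: str, prompt: str, region: str = "ap-southeast-2") -> str:
    """Send the prompt to a Bedrock model and return the reply text.

    Requires AWS credentials with Bedrock model access in the chosen region.
    """
    import boto3  # imported here so build_converse_request stays dependency-free

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_converse_request(model_id, prompt))
    return response["output"]["message"]["content"][0]["text"]


# Example usage (needs credentials; model IDs are illustrative):
#   for mid in ("anthropic.claude-3-haiku-20240307-v1:0", "amazon.nova-lite-v1:0"):
#       print(mid, "->", ask(mid, "Summarise KYC in one sentence."))
```

Because the request shape is provider-agnostic, testing a use case against several models for quality and cost becomes a loop over model IDs rather than a per-provider integration.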

Actively applying AI – NAB

At the Top layer we see the actual use of generative AI-enabled applications, such as the Amazon Q family, and this is where financial services businesses can really accelerate value. I am excited by the early signals we are seeing from customers experimenting with Amazon Q in Connect and Amazon Q Business to reduce the “cognitive overload” call centre agents face, and to give other employees access to more simplified and targeted information that enables them to better serve and support customers.

NAB’s use of Amazon Q Developer to improve developer productivity through automated code development and training is another great example. From experiment to pilot to full-scale rollout, Amazon Q Developer gives NAB developers the ability to automate many coding tasks, which in turn gives them more time to focus on higher-value work for customers. More than 1,000 NAB technologists are trained on and using the tool, with plans to further scale out access in the coming weeks, and we are seeing efficacy accelerate, with over 50% of code recommendations accepted by developers; we expect this to only get better.

I have been pleased to see NAB’s strong outcomes from Amazon Q Developer, which rank among the best in the world. A majority of its engineers are seeing improved productivity when using the coding assistant, showcasing the tangible benefits of these tools.

Where to from here?

If you’ve been following re:Invent, it’s clear that we’re living in an era of AI-driven innovation, and the possibilities for financial services institutions are endless. We saw a broad range of new AI services and Amazon Bedrock feature enhancements unveiled at the conference, ranging from Amazon Nova, our new generation of foundation models, to new Amazon EC2 Trn2 instances, which feature AWS’s newest Trainium2 AI chip.

These innovations, combined with automated reasoning checks (minimising hallucinations), model distillation (making a model faster and more cost-effective to run), plus the model marketplace and multi-agent collaboration capabilities in Bedrock, aim to make AWS technology more accessible, efficient and impactful for our customers.

This just scratches the surface of what we announced at re:Invent 2024 and I look forward to providing my thoughts on how these will benefit financial services customers in the coming weeks.


Adrian De Luca

Technologist, Advisor, Investor & Director Cloud Acceleration at Amazon Web Services (AWS)

Thanks Jamie for sharing this insightful perspective on how AI is evolving in FSI. As the leader of our prototyping business, I have been very enthusiastic and, to be honest, somewhat pleasantly surprised by the amount of experimentation financial services organisations are doing with Generative AI. While, as you said, many started with chatbots and augmented search to improve productivity, several have moved to use it in core decision support and even fraud analysis. As the technology is proven for the most constructive use cases, and prototypes move to pilots or production, I’m also seeing organisations requiring more rigour around controllability of its behaviour, integration with existing governance policies, and explainability in evaluating the qualitative outputs. It’s going to be an exciting space in 2025!

Great article Jamie - how do you think AI will impact the business process outsourcers?

Lokesh Bahety

Account Manager at Amazon Web Services (AWS)

This bit was quite poetic - "In an ironic flywheel, it’s the technology that is making modernisation a critical dependency that is also enabling much faster and easier modernisation through tools like Amazon Q Developer Transform"

Hey Jamie, great informative article. Hope you're well mate.

Great concise AWS summary Jamie Simon. What are your predictions on where FSI will focus with AI over the next 18 months?
