The missing P in your AI strategy
Structured enquiry analysis prompt, outputting valid JSON in a Make scenario

Every non-freaked-out business owner and their dog is "leveraging AI" to gain a competitive edge, but they're falling short because of a missing P.

We can all envision how, one day, AI will deliver incredible value in a context of spiralling costs, an increasingly complex competitive environment, the realisation of the enormous latent value locked in our siloed, unstructured information, and people stuck in heavily manual 'qualitative' processes. Most people agree you'd be silly not to be at least dipping your toe in right now.

But this message isn't for the toe-dippers; it's for the gung-ho early adopters who think they've arrived with AI because they've given their employees access to ChatGPT, Claude or MS Copilot. I'm here to tell you that you're missing a P in your AI strategy.

Here's the thing: simply giving your employees access to tools like MS Copilot, Claude, or ChatGPT isn't enough to truly harness the power of AI. It's more than likely a recipe for frustration, errors and embarrassment.

Fred Flintstone in his Model S

Picture Fred Flintstone behind the wheel of a sleek, modern Tesla.

Instead of letting the car's advanced technology do the work, he's got his feet through the floor, frantically pedalling like he's in his old stone-age vehicle. Sure, he might get from point A to point B, but he's not tapping into the true potential of the cutting-edge machine he's driving.

The same principle applies to using AI tools like ChatGPT or Claude without properly integrating them into your business processes. Employees might achieve some results through manual prompts and copy-pasting, but it comes at the cost of increased time, inconsistent results, and a lack of systematic checks and balances.

In short, they're still stuck in another version of the 'working harder, not smarter' trap, and you're none the wiser about what is happening or what effect the introduction of this tech is having on your business outcomes.

This ad hoc approach is not the way to unleash exponential value from your investment in AI for your business.

To truly unleash the power of AI in your business, you need to put the P into your AI. And by P, I mean Programmatic: AI realised through integration via APIs (Application Programming Interfaces).

AI + API = ?

We can't re-use the API acronym, so what shall we do?

I propose we unite our thinking around this approach under the term programmable Artificial Intelligence or pAI.

pAI is achieved by deploying AI capabilities into your business processes systematically: identifying points of AI-based analysis, enrichment, creation and so on within your standard operating procedures, then concreting these into place through API integration in your tech stack, according to defined data flows and, of course, referencing centralised structured data repositories (ERP, CRM, WorkOS, PIM, WMS, helpdesk and the like).

By doing this you can transform the way your company operates, leveraging AI in a stable, predictable and scalable manner with centralised safeguards and levers for optimisation.

Rather than everyone copying their own locally stored prompts, doing a find-and-replace on input variables and then launching into a cycle of increasingly frustrating iteration to optimise the generated output (sometimes abandoning an 80%-complete conversation because the last 20% is excruciating), your people will maintain and perfect a centralised set of prompts and AI agents with defined data variable placeholders. Your process automations fetch and use these, giving you stable, auditable output 24/7/365!

How does pAI compare to ad hoc AI?

Let's do a worked example:

In our business, we conduct discovery calls with prospective clients. Transcript analysis is an excellent use case for generative AI.

Based on my own experience, having conducted perhaps 3-5 such calls per week on average for the last 2+ years, I have crafted a highly effective prompt which elicits pertinent client requirements, key quotations, structured 'next steps' and the like. After much iterative optimisation, the output is consistently brilliant.

This central prompt requires around 15 dynamic data variable placeholders (date, prospect's name, company, original enquiry, call transcript etc).
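
To make that concrete, here's a minimal sketch in Python of what a centrally stored template with dynamic data placeholders can look like. The wording and variable names are illustrative, not our actual production prompt:

```python
# A minimal sketch of a centrally stored prompt template with named
# placeholders. Template wording and variable names are illustrative,
# not the actual production prompt (which has ~15 placeholders).
DISCOVERY_CALL_PROMPT = """\
You are analysing a discovery call with {prospect_name} of {company},
held on {call_date}.

Original enquiry:
{original_enquiry}

Call transcript:
{call_transcript}

Extract the client's key requirements, notable quotations, and a
structured list of next steps.
"""

def fill_prompt(template: str, variables: dict[str, str]) -> str:
    """Insert the dynamic data variables into the template.

    str.format raises KeyError if a placeholder is missing, so a broken
    upstream data flow fails loudly rather than producing a half-filled
    prompt.
    """
    return template.format(**variables)
```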

Context

An enquiry comes in; it is parsed and stored. Some research is conducted to enrich the data set (partially automated).

When the discovery call is booked, the record is enriched with further data from SavvyCal, and we now have a unique ID for this Zoom call.
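
For illustration, the enriched record at this point might look something like the following. Every field name and value here is hypothetical; the important part is the unique call ID that ties each downstream step back to this record:

```python
# Hypothetical shape of an enriched enquiry record once the discovery
# call is booked. The unique call ID links the booking, the transcript
# and every later automation step back to this one record.
enquiry_record = {
    "prospect_name": "Jane Smith",
    "company": "Acme Widgets Ltd",
    "original_enquiry": "We'd like help automating our quoting process.",
    "research_notes": "Manufacturer, ~120 staff, on a legacy ERP.",
    "call_id": "zoom-98765",                   # unique Zoom call ID
    "call_datetime": "2024-05-14T10:00:00Z",   # from the SavvyCal booking
}
```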

Let's inspect the evolution of this process from ad hoc to programmatic.

Iteration 1

The discovery call host launches the prompt, copies and pastes the dynamic data elements into the placeholder fields (via team-gpt.com, which is built for this use case), reviews and edits the output, and sends the follow-up.

Iteration 2

Everyone conducting discovery calls uses the same prompt and is trained on where the various pieces of data come from, so they can consistently input the data manually and run the prompt 'efficiently'.

Both of these are still ad hoc AI because there's no guarantee or control mechanism.

Iteration 3 gets programmatic

Once the call is done, a link to a Jotform is shared with the person who conducted the discovery call (identified from the SavvyCal data). SavvyCal, our scheduling platform, has a useful 'send webhook' automation that fires when the scheduled call's finish time hits. The shared form contains a field which is pre-filled with the unique ID of the call.

Post-discovery-call analysis data capture form

The consultant pastes their notes and takeaways into a dedicated text field in that form, and indicates the next actions to be taken following the call (a triage selection such as 'schedule a live consulting workshop' or 'request a brief', captured as a single-select field).
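
Mechanically, this step boils down to catching the webhook and generating a pre-filled form link. A rough sketch follows; the payload field names and the form URL are assumptions, not SavvyCal's or Jotform's documented schema:

```python
# Sketch: turn SavvyCal's end-of-call webhook into a pre-filled Jotform
# link for the call host. Payload keys and the form ID are hypothetical.
from urllib.parse import urlencode

FORM_URL = "https://form.jotform.com/0000000000000"  # hypothetical form ID

def build_followup_link(webhook_payload: dict) -> str:
    """Build a form link with the unique call ID already filled in."""
    params = {
        "callId": webhook_payload["call_id"],   # assumed payload key
        "host": webhook_payload["host_email"],  # assumed payload key
    }
    return f"{FORM_URL}?{urlencode(params)}"
```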

Then comes the programmed AI magic

When the form is submitted, the programmable aspects of the subsequent workflow are triggered.

In our case, this invokes our main 'discovery call analysis and write-up' prompt (stored centrally in a template library); the system inserts the data variables captured upstream and dynamically selects additional prompt blocks according to the consultant's selections in the form.

The prompt structure is thus constructed dynamically and intrinsically tailored to the specifics of the call, before being fed to the API (Anthropic or OpenAI).
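
If you were building this step in code rather than in Make, it might look something like the sketch below, here using Anthropic's Python SDK. The block names, triage options and model choice are illustrative:

```python
# Sketch: assemble the prompt dynamically from the central template plus
# optional blocks chosen by the consultant's form selections, then call
# the model API. Block wording and triage options are illustrative.
import anthropic

PROMPT_BLOCKS = {
    "schedule a live consulting workshop":
        "Also draft a proposed agenda for a live consulting workshop.",
    "request a brief":
        "Also list the information we need to prepare a written brief.",
}

def assemble_prompt(base_template: str, form: dict, variables: dict) -> str:
    """Append the form-selected block, then fill all placeholders."""
    extra_block = PROMPT_BLOCKS.get(form["next_action"], "")
    return (base_template + "\n" + extra_block).format(**variables)

def run_analysis(prompt: str) -> str:
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the env
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model choice
        max_tokens=2000,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[0].text
```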

Around 90% of our internal automations like these are built in Make, using the OpenAI and Anthropic API connectors (we're still split testing to determine which we'll settle on for the time being).

We craft our prompts to generate JSON payloads against predefined data structure specifications embedded within the prompts themselves. We then parse these payloads to retrieve the generated information and execute whatever processes are required.
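
In sketch form, that contract looks like this; the schema itself is illustrative:

```python
# Sketch: the prompt embeds the expected JSON structure, and the
# automation parses the reply against it. The schema is illustrative.
import json

JSON_SPEC = """\
Respond with valid JSON only, matching exactly this structure:
{
  "requirements": ["..."],
  "key_quotes": ["..."],
  "next_steps": [{"action": "...", "owner": "...", "due": "..."}]
}
"""

def parse_analysis(raw_reply: str) -> dict:
    """Parse the model's JSON payload, failing loudly if it drifts."""
    data = json.loads(raw_reply)
    for key in ("requirements", "key_quotes", "next_steps"):
        if key not in data:
            raise ValueError(f"Model output missing expected key: {key}")
    return data
```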

And we of course pass more complex content outputs through smaller, simpler 'sense check' prompts which are relatively static, e.g. to ensure we generate UK English (just a stylistic preference, though I can feel my mother leaning over my shoulder correcting any grammatical errors).
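
A sense check like that can be as simple as a second, near-static prompt wrapped around the first output. A sketch, reusing the run_analysis helper from above; the prompt wording is illustrative:

```python
# Sketch: a static 'sense check' pass over generated content, reusing
# the run_analysis helper from the earlier sketch.
UK_ENGLISH_CHECK = (
    "Review the following text and return it unchanged except for "
    "converting any US-English spellings to UK English:\n\n{text}"
)

def sense_check(text: str) -> str:
    return run_analysis(UK_ENGLISH_CHECK.format(text=text))
```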

The stable, normalised output is then fed into the creation of documents for external use (a high-level summary of requirements and next steps) and internal use (data for ball-park estimation).

We recommend that any workflow with an external output of significant consequence should still feature a human-in-the-loop for checks and tweaks, but for internal ones we have more grace for the AI-isms that pop out occasionally!

We of course share all the original data, such as the video transcript and the video file itself, so we can check the analysis against the evidence it was given.

And finally, the inputs and outputs are all stored centrally to facilitate monitoring and evaluation, so we can keep optimising our workflow logic, data variables and prompt structure.
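
That monitoring step can be as simple as appending every run to a central log. In the sketch below, a JSONL file stands in for whatever table or base you actually use:

```python
# Sketch: store each run's prompt version, inputs and parsed output so
# prompts and workflow logic can be evaluated and optimised over time.
import json
import time

def log_run(prompt_version: str, variables: dict, output: dict,
            path: str = "ai_run_log.jsonl") -> None:
    record = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "prompt_version": prompt_version,
        "inputs": variables,
        "output": output,
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```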

Did you see it?

A single system, crafted by the minds of many, receives consistent inputs and can thus provide consistent outputs.

You can see how this system can scale nicely (and it is!)

Help, I don't do APIs!

Now, I know what you're thinking: "But I'm not a tech wizard! How am I supposed to program AI into my business workflows?" Fear not, my friend: that's where no-code and low-code tools come in. Platforms like Zapier, Make and Airtable allow you to automate workflows, connect apps, and manage data without needing a degree in computer science. Want to extract data from documents? Klippa's OCR and its own prompts have got you covered. Need to collect data and incorporate human input? Jotform is your new best friend (I genuinely see it as unrivalled form software).

Of course, integrating AI into your business processes isn't without its challenges.

You'll need to invest in the right technical resources and expertise, ensure data privacy and security, and align your AI initiatives with your overall business goals and strategies.

And let's not forget the importance of change management and employee training – after all, you don't want your team to feel like they're being replaced by robots (even if those robots are pretty darn clever).

But here's the bottom line: businesses that embrace the pAI approach and leverage no-code/low-code solutions will be the ones that stay ahead in the AI-driven business landscape.

So, what are you waiting for? It's time to put the P into AI and unlock its true power in your business. Trust me, your future self (and your bottom line) will thank you!

Alistair

PS: What is your number one use case (or wish-list item) for AI in your business? Let me know in the comments and I'll tell you how we'd approach it!


I'm Alistair, CEO of flowmondo, a leading #automation and #nocode agency that helps businesses unleash the value in their data.

We're certified Airtable, Make and Zapier experts and we know our way around #APIs and #AI.

Get in touch if you need help!
