Anticipating AI's next move: article ③
Marco van Hurne
Partnering with the most innovative AI and RPA platforms to optimize back office processes, automate manual tasks, improve customer service, save money, and grow profits.
In the two previous articles, PART 1, the Generation Space, and PART 2, the Productivity Space, I explored the first two spaces in the evolution of AI. And if you want to skip ahead to PART 4, go ahead!
First was the Generation Space. This is where AI began, with tasks like content creation, data analysis, and simple predictions. The current hype around AI started here, with tools like GPT-3, Midjourney, and others showing the world what AI could do.
In the second article, we moved into the Productivity Space (read the article). Here, AI stepped up its game, automating processes and creating efficiencies in business operations. I talked about AI in customer service, process automation, predictive maintenance, hyper-personalization, and even AI-Assistants. AI is more than just a helper; it is a key driver of efficiency and business results.
And in this third article, it is time to dive into the Business (re)Design Space.
This Space is where AI starts to create and reshape entire business models.
AI-first companies do not merely use AI, augment humans with it, or collaborate with it.
They are built around it.
These AI-First businesses redesign their processes, strategies, and even customer experiences with AI at the core.
I will be showing you examples like Databricks, where AI drives every aspect of their operations, from data processing to real-time analytics. Or the company DevRev, where the entire product development and customer lifecycle is intertwined with Agentic AI at its core, in one integrated platform.
To whet the appetite, here is the animated GIF of the framework again:
This Space is also where AI will reach its most advanced stage and may potentially even surpass human intelligence.
This will be a world where AI and robotics converge, which will lead to fundamental changes in how our societies and businesses operate. Because the Singularity Space isn't about process improvement or incremental changes. No siree, it will be all about transformation on a scale that we have never seen before.
Throughout these spaces, I have discussed use cases like Fount’s work friction reduction, UPS’s predictive analytics, and even personal AI assistants evolving into machine customers. I have also shown how Zara uses AI to spot fashion trends before they hit the mainstream, how Netflix uses predictive analytics to bring personalized content to your TV, and how Schneider National uses AI to monitor and maintain its fleet in real-time.
And don’t forget the example of personal AI assistants that act as virtual shoppers. They help you negotiate your next energy contract and even make purchases on your behalf. Each of these examples shows you how AI is not just a tool but a transformative force. And it will be reshaping our world beyond belief.
So stay sharp, people, because I'll be discussing these final two spaces now, where AI becomes the business, and eventually, something even more profound.
Before we start!
If you like this topic and you want to support me:
The Business (re)Design Space - the current frontier
Welcome to the Business (Re-)Design Space. It is the place where AI stops being a tool and starts becoming a transformative force.
If the Productivity Space is about optimizing existing processes and models with intelligent machines, then the Business (re)Design Space is about fundamentally reimagining how business gets done from the ground up.
In this third stage of AI integration, I am going to talk about applications and use cases that go beyond simple automation or augmentation. This is where AI becomes not just a part of the business but the very core of it: it drives the strategy, the operations, and even the customer experiences in ways we have not seen before.
So what does AI look like in the Business (Re-)Design Space?
Let's jump in!
AI-First business models where machines run the show
One of the most radical applications of AI in the Business (re)Design Space is the emergence of entirely new business models that are built from the ground up around intelligent machines.
I am talking about companies where AI isn't just a tool for optimization but the very heart of the operation, and it drives everything from product development to marketing to customer service.
I have come to call them the AI-First businesses.
If you've read my articles on AI-Assistants (check out the previous article here), you're well aware that AI can do far more than just generate funny pictures or serve as an impaired Customer Service Rep.
The step before a business that is 100% driven by AI is that of the AI-Assistants.
And as soon as we have mature enough platforms with a no-code, workflow-based approach to building AI-Assistants, the door to AI-First businesses is wide open.
One of the most promising Agentic AI platforms out there is AgentOS. I'll be discussing them later on.
Now, let me first explain what an AI-first business is all about.
Basically, it is a company where AI is the foundation for everything they do.
I could just stop here.
But just for kicks, let's continue.
It's not just a fancy add-on or a cool new tool.
No, AI is the very essence of how they operate. Because in an AI-first business, every workflow starts with the assumption that AI can be applied.
AI is the default, the go-to solution; non-AI workflows are only developed when the risk of using AI is too high.
EXAMPLES---------------------ooo---------------------EXAMPLES
DevRev
DevRev has built a natively 100% AI-driven platform to automate and augment the entire product development lifecycle: collecting requirements from customers, mapping them into a visual representation of the roadmap, managing product development teams, and providing real-time analytics on just about anything you want to report on. And all of that runs on their Agentic AI platform, AgentOS, with a bot that is your personal guide through all of this.
This truly impressive company belongs in two categories in this Space. They built their platform entirely on Agentic AI in 2021, well before ChatGPT came along in late 2022 and set off the current hype. So that makes them an AI-First business model.
And they are all about Collaboration, AND Augmentation as well. Because they let people join in and help if requested. A truly impressive feat if you ask me.
And then there’s Waymo.
Waymo, the autonomous driving technology company, is another good example of an AI-first business. The company uses AI to power its fleet of self-driving taxis. AI is used for navigation and route optimization, and to understand and predict road conditions.
Every decision made by the system, from steering and braking to passenger communication, is driven by AI algorithms. And because they do this, Waymo is able to operate a business where human intervention is minimal.
But not all of it runs smoothly. Some time ago I wrote about this in the article Robotaxis in SF are feeling kinda 'h*rny'. Apparently, the taxis were honking frantically at each other in the parking lot. Robotaxis just can't seem to master the art of parking. And when they run into trouble, they honk the horn.
Ocado, another example of an AI-driven business model
Ocado is a British online supermarket that uses robotics and machine learning to automate virtually every aspect of their operation. From the moment an order is placed on Ocado's website, intelligent algorithms take over.
They optimize everything from inventory management and picking to packing and delivery. The result is a highly efficient business that delivers groceries to customers with brilliant speed and accuracy.
And you get all of this without the need for human intervention.
Alas, it is true.
The last one is Lemonade.
Lemonade is a tech-driven insurance company and a great example of an AI-first business.
From the moment the customer signs up, AI plays a central role in every interaction. Lemonade's AI-powered chatbot is called Maya. She handles everything from onboarding new customers to processing claims. And she does this often within seconds.
The company uses AI to assess risk, set pricing, and detect fraud, and it makes sure that operations are streamlined. Lemonade uses AI to make decisions at every step of the customer journey, so they are able to operate with a level of speed and accuracy that traditional insurance companies simply can't match.
In an AI-first company like Lemonade, AI isn’t just a tool—it is the foundation that drives the entire business model, from customer service to underwriting and beyond.
And the coming of the AI-First businesses will result in the next wave of Digital Transformation.
If you want to know more about AI-First businesses, read my latest article on the subject: Prepare for the "AI-first" businesses.
Rise of the machine customers
The evolution of personal AI Assistants
Machine customers will be another transformative leap in AI technology for us humans.
Just think of a (dystopian?) world where your personal AI assistant evolves from a simple tool that sets reminders and plays music into Batman's butler: one that knows all the dark corners and complexities of the internet, can negotiate deals on your behalf, manage transactions, and whatnot.
These AI-based agents are set to revolutionize the way we interact with digital platforms, and e-commerce sites.
Because personal Assistants are moving beyond basic tasks to autonomously making purchasing decisions, securing the best price, and even paying the bills. All of this happens while you focus on more enjoyable activities.
You just leave the mundane and repetitive stuff to your digital assistant.
Companies like Elna AI, Jace, and Viv are already ahead of the rest, and are slowly and quietly pioneering this new frontier.
--------------------------------ooo---------------------------------
ELNA
I have played with Elna AI a bit, and it is more than just a personal assistant.
It’s designed to become a digital extension of your brain.
It tries to deeply understand your preferences and act as a second brain.
Elna wants to manage everything from booking your next vacation to negotiating your energy contract. And it doesn’t want to just follow commands; it anticipates your needs by offering you solutions that are tailored to your lifestyle. That could be finding a hotel at the best price or making sure that your utilities are the cheapest.
Read the article on Elna: Personal AI x Agentic AI = Elna, a game-changer
Next there is Jace. And it excels at personalized shopping. It just scours the web for you, searching for deals.
This limited focus, however, will make it obsolete soon.
But for now it serves as a good example of personal AI-Assistants.
Jace takes into account your tastes, your budget, and your past buying habits to find the best deals available online. Jace can go a step further, making purchases on your behalf. It’s a personal shopper that knows exactly what you want, when you want it, and how much you’re willing to pay for it.
And then it tries to make the best deal for you.
Read the article on Jace: Meet Jace AI - a brilliant AI-Assistant platform
--------------------------------ooo---------------------------------
Then there’s Viv.
Viv is an AI platform developed by the creators of Siri, Apple's rather arcane voice assistant.
In their new venture they are trying to push the boundaries of what AI assistants can do. I think this is what they should have done with Siri in the first place, but now they have a chance to do it all over again, and this time at their own expense.
By the way, they have since been sold to Samsung!
Hahahah.. that made me laugh when I first read it.
Because Apple misses the AI boat again.
Viv is designed to be a rather versatile assistant that can handle quite a wide range of complex tasks. The fact that it can go beyond its process templates means that this platform has the ability to evolve into an omni-assistant that could handle any process you want it to support.
Viv is built with the intention of being an open and extensible AI platform.
The builders have learned this from Apple of course. Apple was the first to come up with an ecosystem around the iPhone back in the day, and that has done Apple a lot of good. And Viv is trying to replicate just that.
It is designed to integrate with a lot of third-party services and can handle complex workflows across multiple platforms. Viv's architecture allows it to learn and adapt over time. And just that major feature makes it capable of creating and managing workflows that go beyond standard templates.
This means that Viv can handle unique, customized tasks tailored to whatever you want it to do! It has the ability to generate custom solutions and adapt to new scenarios. And that allows it to exceed the limitations of pre-defined workflows like those of Jace and Elna.
The fact that it can go beyond its template processes gives this platform the best odds of becoming a major player in this new field.
To me, Viv is a winner in this Space for sure!
--------------------------------ooo---------------------------------
Major players are entering this Space as well
But it is not just these emerging players that want to play in the world of machine customers.
Big tech giants such as Amazon (Alexa) and Google (Home) are finally evolving their current assistants (rather lame ones at that) to meet the demands of this new market. What Siri will be doing is still quite fuzzy; Apple has not communicated its AI strategy very clearly yet.
These were originally designed as voice-activated assistants for simple tasks like playing music, setting a timer, controlling your smart home devices, or sending brief messages.
But these platforms are now expanding their capabilities to function as fully-fledged machine customers.
Amazon Alexa is one of them.
The platform is being developed to act as a digital butler that helps you make purchasing decisions, manage subscriptions, or even do the negotiating on your behalf. Alexa could be equipped to monitor your household inventory and automatically reorder supplies when they’re running low.
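Just to make that concrete, here is a minimal sketch of the kind of reorder rule such a digital butler might run. Everything in it (the items, the thresholds, and the place_order stub) is made up for illustration; it is not Alexa's actual API.

```python
# Hypothetical sketch of a household-inventory reorder rule.
# Item names, thresholds, and place_order() are invented for illustration;
# this is not Amazon's or Alexa's actual API.

inventory = {"coffee": 2, "dishwasher_tabs": 25, "toilet_paper": 4}   # units on hand
reorder_thresholds = {"coffee": 3, "dishwasher_tabs": 10, "toilet_paper": 6}

def place_order(item: str, quantity: int) -> None:
    # Stand-in for the assistant's purchasing step (finding the best price, paying, etc.)
    print(f"Ordering {quantity} x {item}")

for item, on_hand in inventory.items():
    threshold = reorder_thresholds[item]
    if on_hand < threshold:
        # Top back up to twice the threshold so we don't reorder every day
        place_order(item, quantity=threshold * 2 - on_hand)
```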
This is not the first time that I have heard this promise of smart refrigerators, but they never seem to catch on.
Any ideas as to why?
Myself, I have tried both Alexa, and Google Home.
And Siri is just an impaired and annoying little brat.
If you are a fan of South Park, you will probably remember the episode where Cartman had an Alexa and a Google Home talking with one another.
Hilarious!
And in the end, I stuck with Google Home because of the sleekness of the Google Nest Hub.
Google Home is following a similar trajectory as Alexa.
I think the best use now of this device is for its integration with Google’s services.
Yet Google Home is poised to leverage the deep learning algorithms that Google has honed over decades, and the extensive amount of data it holds about you, to offer you even more intelligent services and personalization (whether you like it or not, that is their USP).
Google Home will not just remind you of an upcoming bill but will automatically find the best way to pay it—whether that’s negotiating a lower rate or choosing the optimal payment method to maximize your rewards.
And Google is investing heavily in AI to position their Assistant as the go-to for at home and on the go. And that is going to make it a key player in the machine customer revolution (read the article: Google just held the worst product launch event).
Pun intended.
--------------------------------ooo---------------------------------
The Implications for Businesses and Consumers
The arrival of machine customers will have profound implications for both businesses and for people.
For businesses, this means they need to start considering not just how they market to human consumers but also how they appeal to AI-driven purchasing agents.
Let this sink in for a bit.
These machine customers are logical, data-driven decision-makers who prioritize efficiency and cost-effectiveness over emotional appeal.
Companies will need to optimize their offerings and interfaces to be machine-friendly.
We are not talking about User Experience (UX) but Machine Experience (MX), which will make their products and services easily accessible and appealing to these new digital buyers.
Read the article: We need to develop a vision for Machine Customers
For people like you and me, the rise of machine customers means simplifying life in ways we are only beginning to understand. I can think of never having to deal with the hassle of finding the best price, scheduling payments, or managing subscriptions.
I truly hope my AI assistant will do that all for me.
In my lifetime, please.
Human | AI Collaboration in the Third Space
Now we move deeper and deeper into the AI-driven Business (re)Design Space.
Here, the models of Human-AI collaboration are becoming increasingly sophisticated and futuristic.
These collaboration models will transcend task automation, decision support, and even AI-First businesses (though those are still highly scarce). They will redefine how humans and AI systems work together.
The future of work will see humans and AI as deeply integrated collaborators, where each amplifies the strengths of the other to achieve extraordinary outcomes.
AI-Based creativity for scientific discovery
As a former chemist/pharmacologist, I find that one of the most exciting frontiers in Human-AI collaboration is scientific discovery. I recall working with an advanced program called CAOS/CAMM (Computer Assisted Organic Synthesis and Molecular Modeling) at uni nearly thirty years ago.
But platforms like Insitro and DeepMind’s AlphaFold are already pushing the boundaries of what AI can do in partnership with human creativity and intellect.
In the scientific realm, DeepMind’s AlphaFold is revolutionizing the field of protein folding. That is a complex problem that has stymied researchers (myself included) for decades. The cool thing is that it predicts the 3D structures of proteins with unprecedented accuracy. AlphaFold is not just assisting scientists but actively contributing to groundbreaking discoveries in medicine and biotechnology.
This AI-driven approach to protein structure prediction will lead to new treatments for diseases, faster drug development, and a deeper understanding of biological processes.
--------------------------------ooo---------------------------------
Have you ever heard of CRISPR technology?
No they are not "Crunchies", the genetically modified cornflakes with which you start your day.
"Gene-O's : Because your DNA deserves a modified morning!"
Focus....
Ok
CRISPR is one of the real beauties of the genetic sciences.
It was discovered in the 1990s by Francisco Mojica, and later named together with Ruud Jansen (Holland!), as part of the immune system of bacteria. CRISPR has since been developed further by Jennifer Doudna and Emmanuelle Charpentier into a gene-editing tool.
CRISPR was basically repurposed as a tool for cutting DNA at specific sites.
This means that they were able to edit genes very precisely.
For instance, CRISPR could be used to correct a faulty genetic mutation responsible for, say, cystic fibrosis, sickle cell anemia, or muscular dystrophy.
This tool for precise gene editing is being paired with AI to identify target genes more efficiently and predict the outcomes of genetic modifications.
Insitro uses AI and CRISPR to speed up drug discovery.
After I graduated, the plan for me was to become part of the pharmaceutical industry to design and develop new drugs.
But I couldn't picture myself working on a project for 12 years on end.
2.5 billion dollars per drug and a success rate of 5%. Clinical trials alone take 6–7 years to complete, with most failures coming late, in Phase II-III trials. The average approval rate of new drugs has dropped to 13.8%, and for some of the more challenging therapeutic areas like oncology, these rates are at a mere 3.4%.
Insitro uses biology and ML at scale to predict which molecules and targets work for which patient clusters, to speed up the development lifecycle in clinical trials and especially to reduce the time lost going back to earlier parts of the clinical trial pipeline. It derives a lot of insights from its own repertoire of detailed genetic, phenotypic, and clinical data using a new kind of machine learning model architecture.
They use machine learning to analyze genomic data and identify patterns and relationships that predict disease progression and therapeutic effects. And this speeds up drug discovery by a huge, huge factor, because it pinpoints the most promising targets for editing with very high precision.
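As a toy illustration of that kind of pattern-finding (and nothing more than that: this has nothing to do with Insitro's actual models or data, which are far more sophisticated), here is a sketch that fits a simple classifier on synthetic "genomic" features to predict a therapeutic response.

```python
# Toy sketch of ML finding patterns in genomic-style data.
# The data is synthetic and the model trivial; purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_patients, n_variants = 500, 200
X = rng.integers(0, 3, size=(n_patients, n_variants))        # 0/1/2 allele counts per variant
causal = rng.choice(n_variants, size=5, replace=False)       # a handful of "causal" variants
logits = X[:, causal].sum(axis=1) - 5.0
y = (logits + rng.normal(scale=1.0, size=n_patients) > 0).astype(int)  # 1 = responds to therapy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
print("top weighted variants:", np.argsort(-np.abs(model.coef_[0]))[:5])
```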
To me, these examples represent the beginning of a model where AI not only assists but becomes an essential partner in scientific innovation.
Just dream of a future where AI systems autonomously generate and test millions of hypotheses in fields like pharmacology, materials science, or even climate modeling. Human scientists only have to focus on interpreting the results and guiding the next steps of research. This level of collaboration will speed up advancements in knowledge and technology like never before.
At least, that is the promise..
--------------------------------ooo---------------------------------
And now the sensation of this year: DevRev
Another brilliant example of Human and AI collaboration is DevRev.
This is one of the most fascinating companies that I have come across in my lifetime.
And they have been making huge waves in the tech world, particularly with their focus on bridging the gap between product developers and customers through an AI-driven platform.
They have built their entire platform on Agentic AI. The key feature of DevRev’s platform is AgentOS, which is the backbone of their AI-driven operations.
And what baffled me the most, was that they finished this Agentic AI platform way back in 2021.
Let this land for a second...
Agentic AI...
In 2021...
OpenAI only launched ChatGPT in late 2022, which unleashed the current AI hype.
And a 100% AI native platform.
The people behind this platform were truly visionary.
So I set out last week to talk with its founders... And I was so overwhelmed with what they did that I decided to dedicate an entire article to DevRev.
Watch them closely, people.
Here is the link to their site. But I must say that it does not convey their beauty. Nor does their YouTube channel. That always makes me laugh. The beauty of their stuff lies in their product, not so much in their communication. But that will change soon, I am sure. Rather, have a look at my ChatGPT conversation about the proposition of DevRev.
But seeing a demo from Paul Daniel from DevRev gave me goosebumps, twice.
I have asked if I can share the screen recording with you guys. Hopefully I will get the OK before I publish this piece.
AgentOS integrates customer data and product development workflows into a unified knowledge graph. And by doing that, their AI agents automate tasks like customer support, feature requests, and even predictive analytics.
Before organizations decide to implement DevRev, their systems are usually very fragmented, with, for instance, Jira handling sprints, Zendesk managing customer support, and Salesforce tracking sales. And if you have ever managed product development, you know that this leads to communication delays and difficulty in tracking customer issues.
What DevRev does is unify these processes into a single platform.
And this way you get automated feature prioritization and more efficient support. That leads to decreased response times, product development that is aligned with customer needs, and improved overall operational efficiency.
The AI-driven automation in DevRev is a real game-changer, particularly in managing the feature request backlog. The platform analyzes customer data to help you prioritize the most impactful requests, and makes sure that the product evolves in line with what the customers actually want. And the support team can now access the development workflow directly, which leads to more accurate and faster updates to customers.
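To make the idea of impact-based prioritization a bit more tangible, here is a minimal sketch that scores feature requests by how many tickets reference them and how important those customers are. The data structures and weights are invented for illustration; this is not DevRev's AgentOS API.

```python
# Hypothetical sketch of impact-scoring a feature-request backlog.
# Ticket/feature structures and weights are invented; not DevRev's actual data model.
from collections import defaultdict

tickets = [
    {"feature": "sso-login", "customer_tier": "enterprise"},
    {"feature": "sso-login", "customer_tier": "startup"},
    {"feature": "dark-mode", "customer_tier": "startup"},
    {"feature": "audit-log", "customer_tier": "enterprise"},
    {"feature": "sso-login", "customer_tier": "enterprise"},
]
tier_weight = {"enterprise": 3.0, "startup": 1.0}

scores = defaultdict(float)
for t in tickets:
    scores[t["feature"]] += tier_weight[t["customer_tier"]]   # weight each request by who asks

backlog = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for feature, score in backlog:
    print(f"{feature}: impact score {score}")
# -> sso-login first, audit-log next, dark-mode last
```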
--------------------------------ooo---------------------------------
Getting rid of technical debt with coding copilots
To be honest, I am not a good programmer. I learnt it at Uni (Assembly, Fortran, Pascal, etc) and moved to Python and R later on. I just need it to tweak the models I work with now and then.
So if you are a die-hard full-stack engineer, you will probably be laughing at this, but when you are a noob, a novice, or a good amateur like me, this development is a godsend.
Because, just empirically, over the last few days, most of my ‘programming’ is now writing English (prompting and then reviewing and editing the generated diffs) and doing a bit of ‘half-coding’, where you write the first chunk of the code you’d like, maybe comment on it a bit so the LLM knows what the plan is, and then tab tab tab through completions.
Tjakka !
And Bob's your uncle !
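For those who have never tried it, here is roughly what that half-coding looks like in practice. The comment and the function header are what I type; the body is the kind of completion the assistant fills in (a made-up example, not the output of any specific tool).

```python
# What I type: an English comment plus the start of the function.
# Read a CSV of invoices, keep the overdue ones, and return the total amount per customer.
import csv
from collections import defaultdict
from datetime import date

def overdue_totals(path: str, today: date) -> dict[str, float]:
    # From here on, the copilot's suggested completion (tab, tab, tab...):
    totals: dict[str, float] = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            due = date.fromisoformat(row["due_date"])
            if due < today and row["status"] != "paid":
                totals[row["customer"]] += float(row["amount"])
    return dict(totals)
```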
And along came Amazon Q
You have probably heard that Amazon Q (an AI assistant for AWS) is improving productivity a lot in Amazon's internal operations.
Recently Amazon saved about 4,500 developer-years updating Java applications using their Amazon Q tool. That's not hyperbole - it is real, measurable impact.
Amazon has cut down the time needed to update Java applications. The average time to upgrade an application to Java 17 plummeted from what’s typically 50 developer days to just a few hours.
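A quick back-of-the-envelope check on those two figures (my own arithmetic, not Amazon's): 4,500 developer-years at roughly 250 working days per year is about 1.1 million developer-days, and at the quoted ~50 developer-days per upgrade that corresponds to savings across something in the order of 20,000 applications. The exact counts aren't public, but the orders of magnitude are at least consistent.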
Now guess why this is leading to layoffs in big tech all of a sudden.
The example of FinServe Bank
Here's a real-life example of how Amazon Q saved time and resources in converting COBOL to Java at a bank:
Converting COBOL to Java with Amazon Q at FinServe Bank
FinServe Bank had been using COBOL systems for its core banking operations for decades.
And they are still using it...
They can't get rid of it anymore, because much of the knowledge required to migrate this arcane programming language resides in the heads of people who were in their prime throughout the 70's and 80's.
So they are in desperate need of migrating this to a newer, more future-proof platform.
The task was quite overwhelming though.
FinServe estimated a manual rewrite would take 10 years. And it would involve hundreds of developers.
The risk of errors and potential financial discrepancies is quite large. To say the least.
So they decided to give Amazon Q a go for automating code conversion. Amazon Q is indeed trained on COBOL and Java, and it offered a way out of the mess.
Amazon Q scanned and translated FinServe’s entire COBOL codebase into Java.
And the results are quite impressive.
GitHub Copilot is also an AI coding tool.
And if you have ever tried it, you know that it suggests code as you type. And that comes in handy, especially if, like me, you aren't a brilliant coder, or if you're a novice.
It was one of the first co-coders on the market and it was slashing development time and enhancing code quality pretty well.
It uses OpenAI's Codex as its engine.
Codex works with software developers, helping by generating code, suggesting solutions, and even automating parts of the development process itself, much like Amazon Q does. It doesn't just follow commands like the hard-rules-based systems; it learns from the developer's coding style and adapts to the developer's needs.
But to be honest, it has already been outpaced by many others like Cursor.
But Devin was the first, people
Devin actually was the first AI coding assistant that you could let write complex scripts, handling tasks that used to occupy junior developers for days.
But nowadays Devin, too, has been overtaken left and right by other, more powerful tools.
The best among all of the former coding copilots is Cursor AI
The last one in this category, which is making huge waves at the moment, is Cursor AI. They have built an intelligent code editor that automates routine tasks and reduces errors.
Cursor's AI-powered code buddy is built by Anysphere, and it is really transforming the developer community.
Cursor is helping them turn ideas into reality across all ages!
From helping an eight-year-old build a chatbot within an hour (sic) to building a financial dashboard using only voice (SIC!).
At its core, what pushes Cursor AI ahead of the curve are its easy-to-use features like workflow integration, customisations, and GPT-4 assistance. And that kinda makes coding available to a lot of people.
Some of these features are available in its free version, while the paid plan is more suitable for larger teams or those with specific needs.
But between you and me, just stick with the free version.
Meanwhile, developers and tech enthusiasts like me have taken over the internet to share our insights on how to best make use of Cursor AI. It is interesting to see the custom prompts and workflows that have led to building apps and features in record time!
These coding tools are transforming how we approach software development and technical debt. They're not just making coding faster - they're fundamentally changing what's possible with our existing teams.
However, this efficiency comes with some painful challenges for society.
Major tech firms, including Google and Microsoft, are reducing their developer headcount, citing AI-driven productivity gains.
Research from the Wharton School confirms this trend - AI is eliminating some roles while creating others in unexpected areas.
The impact is clear: we're moving towards a future with different jobs, not necessarily more jobs.
AI isn't replacing humans entirely, but it's significantly altering the skills and roles we need.
Human | AI Augmentation
Another futuristic model of collaboration is Human-AI symbiosis in decision-making.
Here, AI seriously augments human cognitive abilities.
In this niche, the companies Neuralink, Synchron and Neurable are leading the pack.
Neuralink is an early step towards this vision.
They want to develop Brain-Machine Interfaces to allow direct communication between human brains and AI systems. This technology is now being tested on two quadriplegic persons, who use the BMI to communicate with others, and to activate systems around them.
They were even able to play games like Fortnite through the device!
In such a model, AI doesn't only analyze data and present options; it can also integrate directly with human thought processes, maybe even enhancing memory or processing speed.
But not only in healthcare. For example, in a financial institution, traders equipped with Neuralink could analyze market trends and execute trades with enormous speed and precision, because they are guided by AI that processes vast amounts of data in real-time.
Similarly, Neurable, which develops non-invasive Brain-Machine Interfaces for applications like Virtual Reality (how cool), uses brainwave data to improve the user experience in its products.
They train their AI models on extensive neural datasets.
(Read Monday's article about how they obtain their data)
And by doing this, Neurable can improve the system’s ability to recognize and predict user intentions and make the interactions of a person within virtual environments more natural and fluid.
This type of augmentation will lead to a new era of thinking, where the traditional limits of human cognition are expanded by direct AI collaboration. And this in turn will lead to faster and more informed decisions.
And this example is somewhat closer to home.
I have recently bought the RayNeo X2 AR glasses with AI integration, and I have been playing around with them for some time. It is still early stage, without a real support infrastructure, and still proprietary ecosystems with limited functionality, but they are nonetheless a brilliant example of human | AI augmentation.
These Augmented Reality glasses are designed to improve human capabilities because they overlay digital information onto the real world. For instance, they can project your route planner on the road so you can see where you need to turn right. You activate the glasses with voice commands so that they can search for information and present it to you on the go.
They can also analyze what you are looking at. For instance, at a store, when you look at a pair of Nike sports shoes, the AI and AR work together to present you with information about those particular shoes, their price online (made possible by AI-Assistants), and whether you can get them cheaper somewhere else.
The AI technology is not there yet, because the infrastructure has not yet been developed to make this possible, but I believe that this year, we will see the first implementations of both platforms and AI apps on either the X2, the G1, or the Rayban | Meta.
AI-Powered Autonomous Teams
In the very near future, you may also come across AI-powered autonomous teams.
You will see humans and AI agents work together as equal members of a team, each contributing their unique strengths.
For example, in project management, AI can take on the role of progress tracking and resource allocation, or even conflict resolution among team members. These AI agents would not follow pre-set instructions like the ones that are available now. When I implemented Asana's Workload feature in our PM teams a while ago, their PM-Assistants were still based on hard rules only.
These are dumb, purely rule-based systems.
But the future lies with the combination of Neural Nets and predefined workflows.
AI-based PM-Assistants will learn from interactions, adapt to changes, and collaborate with human team members to reach the goals. In mission-critical environments like space exploration, where precision and adaptability are crucial, autonomous teams of humans and AI could be the key to mission success.
Maybe the likes of Asana and Workday are working on that as we speak.
One other notable real-life example is SpaceX, which integrates AI-driven systems into its mission control and planning operations.
The AI supports scheduling, resource management, and optimizing astronaut training by analyzing performance data. And this is quite a feat, because in the "really high-stakes world" of space exploration, precision is essential! So there is no room for error.
And yet, these AI-powered teams are crucial to mission success.
How cool is that!
FORD
Ford is another example which you probably did not expect. They use AI to support their manufacturing teams.
Ford uses a platform called Drishti, which uses computer vision and deep learning to monitor assembly lines in real-time.
Automation on the factory floor continues to rise, yet over 72% of manufacturing tasks are still performed by humans. Humans continue to be integral elements of the manufacturing process, capable of performing complex actions that may be difficult for robots to perform (such as those involving a level of dexterity), in addition to providing creative problem-solving and design skills. On the downside, human error accounts for over 68% of manufacturing defects!
Apparently, the process to gather data on the assembly line at Ford used to be extremely manual.
A study revealed that their line managers spent over 37% of their time gathering data manually and performing root cause analyses to trace manufacturing defects.
This of course significantly impacts the time employees can spend on other productive tasks.
So they turned to Drishti.
I had to look up what this name meant with ChatGPT and it turns out it means Vision, Sight, and Focus in Sanskrit.
A perfect name for what it does at Ford!
Drishti combines video and AI to replace the manual process of data capturing in manufacturing, to reduce defects and increase process efficiency.
Drishti’s technical solution encompasses two key elements: Drishti Trace and Drishti Flow.
Drishti Trace is a video-capturing thingy that provides insights into all the stations in an assembly line. And Drishti Flow is the AI layer that performs advanced analytics on that video to measure cycle times, identify bottlenecks, provide in-station operator feedback, and track other performance metrics to improve line performance.
Cool innit!?
This AI not only identifies inefficiencies but also suggests improvements. And this leads to a reduction in production errors and in downtime.
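To give a feel for the kind of analytics such an AI layer computes, here is a minimal sketch that derives average cycle times per station and flags the bottleneck. The event format is invented; the real system extracts this kind of data from raw video, and this is in no way Drishti's actual code.

```python
# Hypothetical sketch of line analytics on per-station cycle observations.
# The data format is invented; the real system derives it from video with computer vision.
from statistics import mean

# (station, cycle_duration_seconds) observed over a shift
cycles = [
    ("station_1", 42), ("station_1", 45), ("station_1", 41),
    ("station_2", 58), ("station_2", 63), ("station_2", 61),
    ("station_3", 39), ("station_3", 44), ("station_3", 40),
]

durations_per_station: dict[str, list[int]] = {}
for station, seconds in cycles:
    durations_per_station.setdefault(station, []).append(seconds)

avg_cycle = {s: mean(d) for s, d in durations_per_station.items()}
bottleneck = max(avg_cycle, key=avg_cycle.get)   # the slowest average cycle limits line throughput

for station, avg in sorted(avg_cycle.items()):
    print(f"{station}: avg cycle {avg:.1f}s")
print(f"bottleneck: {bottleneck}")
```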
And another one, which you probably would have expected, is Amazon.
Amazon
Amazon uses AI in its logistics operations through the RoboRunner platform.
Now that is an example of a LAME NAME
In Amazon's fulfillment centers, the AI manages logistics and coordinates fleets of robots alongside human workers. Traditional systems that have been around for some time follow fixed paths, but this AI adapts dynamically to changes in inventory levels and worker availability!
Ding dong, brilliance!
Embodied AI is bridging the physical and cognitive worlds
Embodied AI is Artificial Intelligence embedded in robotics (read the article: Embodied AI in robotics).
These systems combine AI with physical forms, and that allows these bots to interact directly with the world around them. Traditionally AI exists only in software on computers, but embodied AI has a physical presence.
Now all of a sudden, AI becomes mobile and able to perform more versatile activities.
And yes, I am thinking of Skynet and the Terminator as well as you guys do.
These systems engage with the environment through multiple senses.
They can see, hear, touch, and even smell, and this way they are mimicking human sensory experiences.
Visual sensors allow robots to navigate complex spaces. Audio inputs enable them to understand and respond to spoken commands. Tactile sensors let them handle delicate objects with precision. And some advanced models are even exploring olfactory (smelling) capabilities to detect gases or hazardous substances.
Emotional intelligence is another crucial aspect of embodied AI.
These systems can recognize and respond to human emotions. That makes interactions more natural and effective. That is particularly interesting in healthcare settings, where emotionally aware robots support patients.
A few companies are leading the way in the development of embodied AI. Tesla is working on Optimus, a humanoid robot designed to perform repetitive or dangerous tasks. The company plans to start selling Optimus 2 by the end of 2025.
Optimus Video = Cool
Figure is another promising startup in this scene. They are also making large strides in humanoid robotics. Their focus is to build adaptable robots that can operate in a lot of different environments, from warehouses to retail spaces. Figure expects to launch its first commercial model in 2025 as well.
Figure video = cooler
Chinese tech giants are probably going to beat the others to market.
Unitree says that its $16k humanoid robot is ready for mass production in 2024. The Unitree G1 is the sleekest-looking and most human-like moving robot, with such dexterous hands, that I have ever seen. And the price tag makes it so that it can probably handle my dishes and do my cooking pretty soon!
Unitree video = hilarious
Embodied AI isn't limited to humanoid forms. Robots come in all shapes and sizes, because they need to be deployed under different conditions or be tailored to specific functions. Mega bots are large-scale robots designed for heavy-duty tasks.
For instance, Boston Dynamics' Spot is a quadruped robot used for inspection and data collection in hazardous environments.
And on the other end of the spectrum there are nanobots.
These microscopic robots operate at the cellular or even molecular level.
In 1959, one of my favorite physicists, Richard Feynman, delivered his famous lecture “There’s Plenty of Room at the Bottom,” in which he described the opportunity for shrinking technology, from machines to computer chips, to incredibly small sizes (read the article: Richard Feynman's notebook method in a modern age with Obsidian and Zeta Alpha).
Well, the bottom just got more crowded.
Recent research has produced nanobots capable of performing tasks within the human body. Scientists at ETH Zurich developed nanobots in 2023 that can deliver targeted drug therapies through the bloodstream.
And a Cornell-led collaboration has created the first microscopic robots that incorporate semiconductor components, allowing them to be controlled – and made to walk – with standard electronic signals.
These robots, roughly the size of a paramecium, provide a template for building even more complex versions that utilize silicon-based intelligence, can be mass-produced, and may someday travel through human tissue and blood.
The walking robots are about 5 microns thick (a micron is one-millionth of a meter), 40 microns wide and range from 40 to 70 microns in length.
Each bot consists of a simple circuit made from silicon photovoltaics – which essentially functions as the torso and brain – and four electrochemical actuators that function as legs.
Just think of what that could do!
Chemotherapy delivered to where it is needed, instead of spreading toxic chemicals throughout the body and killing healthy cells too. Other tiny robots could navigate through the bloodstream and provide precise, minimally invasive medical interventions.
Some of these nanobots are currently undergoing clinical trials and that will be one major leap in the medical technology department.
As said earlier, the distinction between embodied and non-embodied AI lies in the interaction with the physical environment.
Non-embodied AI operates within digital confines, but embodied AI, interacts tangibly with the world. This physical engagement allows embodied AI to perform tasks like assembling products, assisting in surgeries, or responding to environmental hazards. Stuff that is way beyond the reach of software-based AI alone.
Reflections on the Business (re)Design Space
The exploration of the AI Evolution framework led us deep into the Business (re)Design Space, where AI becomes the core of business operations, redefining entire industries.
And companies and systems like DevRev, Insitro, DeepMind's AlphaFold, Neurable, and Neuralink exemplify this shift. Because they are building their foundations on AI to rethink business models and roles.
I wrote about a future where AI isn't just a tool but an equal collaborator, as in autonomous teams and advanced platforms like Asana. Tools like GitHub Copilot, Amazon Q, and Cursor are already using AI as a copilot, or even autonomously, to handle complex tasks like coding and optimization.
But this is just the beginning.
Because our Next Frontier is the Singularity Space.
This space is marked by the advent of Artificial General Intelligence, wetware, and quantum computing.
I feel - as in 2008, when I started with Machine Learning - that we are now standing on the brink of yet another monumental change.
The future isn't just about businesses using AI but becoming AI. The real revolution is yet to come, and I'll write about these futuristic concepts in the next deep dive into the Singularity Space.
If you have come this far, you are as die-hard a geek as I am. And I thank you from the bottom of my heart for sticking around with me for so long.
And I have something for you:
Solve this little math puzzle, and get a copy of my latest book
Exploring consciousness: a guide for AI students
Here's another equation:
What is: 12 × 3 + 5 - 8 = ?
Solve it, and post your answer in the comments below and I'll contact you!
Thank you for sticking around !
Signing off - Marco
Well, that's a wrap for today. Tomorrow, I'll have a fresh episode of TechTonic Shifts for you. If you enjoy my writing and want to support my work, feel free to buy me a coffee.
Think a friend would enjoy this too? Share the newsletter and let them join the conversation. LinkedIn appreciates your likes by making my articles available to more readers.
Links from the article:
Top-rated articles: