The Future of AI is Decentralised AI
DALL-E: a street-camera view of an elderly person who has fallen on the pavement and is in pain


You cannot ignore GPT-4 in the news. AI is eating the world. Many companies are looking into machine learning and how it can be applied to their business. Bigger and bigger natural language models are launched every day, growing from millions to billions and now approaching trillions of parameters. The next frontier after natural language and generative images is video.


Video will soon be a big problem, because the centralised cloud computing approach is not well suited to a world where data volumes are growing exponentially "at the edge". Modern cars have eight or more cameras. Police officers wear body cams. Every street corner in a major city has at least one camera. We cannot have people watching all of these video streams; we need AI to analyse them. Bringing thousands of camera streams, which are moving from SD quality to HD and even 4K, to a centralised cloud is simply too expensive. 99.9999% of what a camera sees is useless information.


What you need is to be able to ask cameras GPT-like questions: "Are more workers wearing personal protective equipment [PPE] compared to the previous 12 months?" To answer, the AI has to analyse months of footage: detect whether a person is visible, classify whether that person is wearing PPE, calculate the percentage of workers wearing PPE each day, map that data onto a timeline, and finally answer the question.
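Those steps reduce to a simple aggregation over per-frame detections. A minimal sketch, assuming an illustrative detection format rather than any real camera API:

```python
from collections import defaultdict
from datetime import date

# Illustrative per-frame detections: (day, person_visible, wearing_ppe).
# A real system would get these from an on-camera object detector.
detections = [
    (date(2024, 1, 1), True, True),
    (date(2024, 1, 1), True, False),
    (date(2024, 1, 2), True, True),
    (date(2024, 1, 2), False, False),  # no person visible: ignored
]

def daily_ppe_compliance(detections):
    """Percentage of person detections wearing PPE, per day."""
    seen = defaultdict(int)
    compliant = defaultdict(int)
    for day, person_visible, wearing_ppe in detections:
        if not person_visible:
            continue  # only count frames where a person is visible
        seen[day] += 1
        if wearing_ppe:  # classify PPE vs. no PPE
            compliant[day] += 1
    # per-day percentages, ready to be mapped onto a timeline
    return {day: 100.0 * compliant[day] / seen[day] for day in seen}

timeline = sorted(daily_ppe_compliance(detections).items())
```

Comparing such timelines across two 12-month windows is then a trivial aggregation; the hard, expensive part is running the detector over months of video, which is exactly the work that should stay at the edge.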


We are seeing with GPT-4 and plugins how a generic language model can use third-party plugins to interact with other systems, e.g. Zapier [which connects to thousands of other SaaS systems], or to draw on domain-specific expertise, e.g. FiscalNote [legal and political data] or Wolfram [computation and maths knowledge].


Running large language models, detecting objects in videos, and analysing and summarising the results are best done close to the data source. The latest language models no longer fit on the average computer, and moving thousands of snippets of 4K surveillance video to a centralised cloud just to run object detection is too expensive. The ideal solution is to bring the models and AI inference to the edge, where the data is.


Decentralised AI means bringing AI and compute to the data, not the other way around. You bring models to the data, ask for the data to be analysed, and only the results are sent back over the network. Models and data are now so big that compute needs to happen at the edges, where the raw data is generated.
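A minimal sketch of this pattern, where `detect_objects` is a hypothetical stand-in for an on-device model: only a compact summary crosses the network, never the raw frames.

```python
# "Bring the model to the data": run inference on the edge node and send
# only compact results upstream. `detect_objects` is a placeholder for a
# real on-device model (e.g. a quantised detector), not an actual API.

def detect_objects(frame):
    """Placeholder detector: returns the labels found in a frame."""
    return ["person"] if frame.get("has_person") else []

def edge_summarise(frames, query_label):
    """Run inference locally; only this small dict leaves the edge node."""
    hits = sum(query_label in detect_objects(f) for f in frames)
    return {"label": query_label, "matches": hits, "frames": len(frames)}

# Raw frames stay on the device; only the summary is sent back.
frames = [{"has_person": True}, {"has_person": False}, {"has_person": True}]
summary = edge_summarise(frames, "person")
```

The bandwidth asymmetry is the whole point: hours of 4K video go in, a few hundred bytes of answer come out.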


There are privacy concerns around this data, but there are also many potential positive use cases: searching for a missing toddler; locating a criminal or a stolen vehicle in a city; dispatching help when a pedestrian falls and is knocked unconscious; counting footfall to decide where to open a store; predicting a car manufacturer's sales by counting the number of its cars on the road; and much more.


Installing cameras with taxpayer money is expensive. What if private companies could pay for the infrastructure by launching queries or subscribing to events? Recognising people would be a service offered only to emergency services and law enforcement, but each query could be logged and made public after some time to avoid mass surveillance. Companies could pay for information about traffic, footfall, or specific activities [e.g. accidents, traffic jams, counting competitors' bags, analysing queues in front of restaurants,...].
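The delayed publication of law-enforcement queries could work roughly as follows; the `AuditLog` class, its fields, and the 30-day delay are illustrative assumptions, not a proposed standard.

```python
from dataclasses import dataclass, field

DAY = 24 * 3600  # seconds in a day

@dataclass
class AuditLog:
    """Illustrative query log that becomes public after a fixed delay."""
    delay_seconds: float
    entries: list = field(default_factory=list)

    def record(self, query, at):
        # Every query is logged immediately, but not yet visible.
        self.entries.append({"query": query, "at": at})

    def public_view(self, now):
        # Only queries older than the delay are published.
        return [e for e in self.entries if now - e["at"] >= self.delay_seconds]

log = AuditLog(delay_seconds=30 * DAY)  # publish after 30 days
log.record("locate stolen vehicle ABC-123", at=0)
early = log.public_view(now=10 * DAY)   # still private
later = log.public_view(now=40 * DAY)   # now public
```

The delay lets investigations proceed without tipping off suspects, while guaranteeing that every query eventually faces public scrutiny.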


In a world with too much data at the edges, we need to rethink how we create new monetisation models to cover the high cost of installing compute there. Decentralised AI will allow a ChatGPT-style service to collect real-time wisdom about what is happening in the world. Cameras are just one use case, though.


The healthcare industry is another example of where you want the algorithms to come to the data and not the other way around. By running AI on genomics data, MRI scans, blood analyses, and more, and by allowing others to execute monetised AI searches without sharing identifiable information, hospitals and others could turn their IT spend into data and AI revenues.
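One simple way a hospital could answer monetised queries without exposing identifiable records is to return only aggregates, and only above a minimum cohort size. This sketch assumes a hypothetical record format and threshold:

```python
# Illustrative aggregate query over hospital records: only a count leaves
# the hospital, and only when the matching cohort is large enough to limit
# re-identification. The record format and threshold are assumptions.

MIN_COHORT = 10

def aggregate_query(records, predicate):
    """Return only an aggregate count, never individual records."""
    matches = sum(1 for r in records if predicate(r))
    if matches < MIN_COHORT:
        return {"answer": None, "reason": "cohort too small"}
    return {"answer": matches}

# Synthetic records standing in for de-identified clinical data.
records = [{"age": a, "marker": a % 2 == 0} for a in range(40, 80)]
result = aggregate_query(records, lambda r: r["marker"] and r["age"] > 50)
```

Real deployments would need stronger protections (e.g. differential privacy rather than a bare threshold), but the principle is the same: the query travels to the data, and only a non-identifying answer travels back.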


Decentralised AI can also let companies share intelligence with others without sharing any private or confidential data. A bank deciding on a loan, an insurer underwriting a life insurance policy, or an employer hiring a new CEO would pay to know the financial, health, and behavioural situation of an applicant. AI could validate that you are active, charitable, financially stable, healthy,... or not. Via permission-based decentralised AI queries, consumers would know about, control, and be rewarded for the information they share with companies. Financial and health AI advisors could gain a holistic understanding of their clients and make them more successful and healthier.


The first generation of decentralised AI infrastructure is being developed as we speak, and I have been playing with some of it. It is still early days, but if you are interested in learning more about decentralised AI and whether your company could use it, let me know in the comments.

Paul Dowling

Open-source AI evangelist | Building ecosystem at KXSB | AI activist and artist @Flux__art on Instagram

1y

If this becomes a reality, how will it not kill crypto/blockchain-based decentralised solutions?

Alex Bachmutsky

Chief Architect, Networking and Security

1y

My company, Axiado, is introducing this distributed AI concept to the field of security (EDR/NDR/XDR, vulnerability management, and the detection of ransomware and other attacks). Instead of bringing all the data from every server, for example, to a central location in the datacenter to run AI analysis, it is much more efficient to process that data with local AI (bring the AI models to the data) and, if needed, feed the results into a central AI. There are a couple of companies trying to do this in software; we are doing it in hardware, freeing the CPU to do what it is supposed to do: normal workloads. I fully agree that distributed AI is the future.
