The AI Conference 2023 - San Francisco
Gareth Roberts
Human & Machine Intelligence Expert | Research & Industry with a Commercial Focus | Academic -> Computational Neuroscience | Chief AI Officer, ex-CTO, ex-Head-of-AI | PhD in Neurosymbolic Planning & Reasoning
I just emerged from the whirlwind of the AI Conference in San Francisco, and I'm absolutely buzzing with ideas and insights! The fine line between the enormous benefits of large language models (LLMs) and the demanding data engineering needed to realise them was laid out in stark detail.
The world of AI has made leaps and bounds in retrieval-augmented generation (RAG). Open-source frameworks are now bridging gaps between disparate AI models and vector databases, creating a level of interoperability we haven't seen before. Here's something to get your tech gears grinding - I managed to run Llama 2 70B on my MacBook Pro, and it didn't even break a sweat. Edge AI for the win!
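For the curious, here is a minimal sketch of what running a model like that locally can look like, assuming a 4-bit quantized GGUF build of the weights and the open-source llama-cpp-python bindings. The file path and settings are illustrative placeholders, not the exact setup I used:

# Minimal sketch: local inference with llama-cpp-python.
# Assumes you already have a quantized GGUF copy of the weights on disk;
# a 70B model needs aggressive quantization (e.g. 4-bit) to fit in laptop RAM.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-70b.Q4_K_M.gguf",  # assumed local quantized weights
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to Apple Metal where available
)

out = llm(
    "Summarise retrieval-augmented generation in one sentence.",
    max_tokens=128,
)
print(out["choices"][0]["text"])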
The standout revelation, however, was the shift towards individuals and industry fine-tuning their own custom LLMs. Whether an organisation is large or small in headcount or market cap, people want custom LLMs of their own so they can take full advantage of where the technology is heading.
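To make that concrete, here is a rough sketch of what "fine-tuning your own LLM" often means in practice today: parameter-efficient fine-tuning with LoRA adapters via the Hugging Face peft library. The base model and hyperparameters below are illustrative assumptions, not anything presented at the conference:

# Minimal sketch: attaching LoRA adapters to a causal LM with transformers + peft.
# Only the small low-rank adapter matrices are trained, so the cost of
# customising a model is a fraction of full fine-tuning.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base = "meta-llama/Llama-2-7b-hf"  # assumed base model; swap in your own
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,              # rank of the low-rank update
    lora_alpha=16,    # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in Llama-style models
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # typically well under 1% of the full model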
In summary, it was great to meet and listen to a community so close-knit that I felt I had already met many of them, given how connected the open-source world is online.
It was a glimpse into the future of AI, with many exciting applications far beyond the generic "it helps you write emails". The diverse groups that had gone 'all in on LLMs' included defence, video game studios, fiction authors, financial professionals, educators, governments, and many more. We're living in exciting times. Can't wait to see what's next!
#AIConference #EdgeAI #FutureofAI #TechTrends #LLMs #RAG