Build your cost-free, offline AI tool
Demo! ... also, AlphaFold 3 open-source, and a 1.4T-parameter AI?
Hey there!
AlphaFold 3 is now open-source!
1.4T parameters: the largest open-source LLM. → Does size matter?
Demo: Building a secure, offline AI tool from scratch in under 35 minutes.
AlphaFold 3 is now open-source (OS)!
Don’t underestimate this news: history will single out AlphaFold.
While DNA is the code of life, proteins are its substrate: the hardware that gives the code meaning. Without the ability to understand and control proteins, biotechnology was stuck at a dead end.
We already knew how to read the linear code; moving from linear sequence to 3D structure, from instructions to function, was the missing piece.
Protein design changes everything—health, materials, energy, and more.
Don't be misled by what people are or aren’t noticing. What grabs attention today often fades, while history will remember the transformative.
(If you get it running, reply and I will feature your progress in the newsletter.)
1.4 trillion parameters: the largest open-source (OS) language model. And: does size matter?
Near Protocol co-founder Illia Polosukhin. Source: Cointelegraph
Near Protocol aims to build the world’s largest OS AI model with 1.4 trillion parameters.
The project will use crowdsourced R&D via the Near AI Research hub, starting with a 500M-parameter model and scaling up. Costs are estimated at $160M, funded by token sales and repaid through revenue from AI inference.
To cover the enormous compute requirement, they are exploring decentralized computing.
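To get a feel for where a figure like $160M comes from, here is a rough back-of-the-envelope sketch using the common 6·N·D FLOPs approximation for dense transformer training. Every concrete input below (training tokens, GPU throughput, utilization, hourly price) is my own illustrative assumption, not a number from the NEAR announcement.

```python
# Back-of-the-envelope training-cost estimate for a 1.4T-parameter model.
# Uses the standard ~6 * N * D FLOPs approximation for dense transformers.
# All inputs are illustrative assumptions, not NEAR's actual plan.

N = 1.4e12            # parameters
D = 15e12             # training tokens (assumption)
flops = 6 * N * D     # ~1.3e26 FLOPs total

peak_flops_per_gpu = 989e12   # H100 bf16 peak, approx.
utilization = 0.40            # assumed model FLOPs utilization
effective = peak_flops_per_gpu * utilization

gpu_seconds = flops / effective
gpu_hours = gpu_seconds / 3600
cost_per_gpu_hour = 2.0       # assumed $/GPU-hour at scale

print(f"Total compute: {flops:.2e} FLOPs")
print(f"GPU-hours:     {gpu_hours:.2e}")
print(f"Compute cost:  ${gpu_hours * cost_per_gpu_hour / 1e6:,.0f}M")
```

With these assumptions the compute bill alone lands around $177M, the same order of magnitude as the quoted $160M, before counting data, people, and experimentation overhead.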
Why build larger and larger AI models?
According to AI experts such as Ilya Sutskever and Sam Altman (a point I have also made repeatedly in my book), our algorithms are already good enough: scaling them up yields more emergent capabilities and may eventually bring us to AGI.
Dario Amodei said in yesterday's podcast with Lex Fridman (which I recommend listening to) that scaling data, compute, and model size will lead to superintelligence.
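The intuition behind the scaling argument can be made concrete with empirical scaling laws. As a minimal sketch (not anyone's roadmap), here is the Chinchilla-style loss formula from Hoffmann et al. (2022), L(N, D) = E + A/N^α + B/D^β, with the published fitted constants; treat the exact values as approximate, the trend is the point: predicted loss keeps falling as parameters and data grow together.

```python
# Chinchilla-style scaling law (Hoffmann et al., 2022): predicted training loss
# as a function of parameter count N and training tokens D.
#   L(N, D) = E + A / N**alpha + B / D**beta
# Constants below are the published fits; treat them as approximate.

E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    return E + A / n_params**alpha + B / n_tokens**beta

# Loss keeps decreasing as model and dataset scale up together (~20 tokens/param).
for n in (7e9, 70e9, 400e9, 1.4e12):
    print(f"N={n:.0e}, D={20 * n:.0e} -> predicted loss {loss(n, 20 * n):.3f}")
```

Whether ever-lower loss translates into AGI-level capabilities is exactly the open question the experts above are betting on.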
A major bombshell dropped during the Y Combinator interview when Sam Altman casually mentioned that what excites him about 2025 is AGI.
He said it just like that, almost offhandedly. → Interview (min 46:12)
In the meantime, a new report from The Information describes how OpenAI's next flagship model might not represent as big a leap forward as its predecessors did.
The report also describes how OpenAI plans to work around this. So it is not all easy and straightforward, but it is unlikely that the undeniable progress will simply halt.
Demo: How to build a secure, offline AI tool from scratch in under 35 min.
In this short series, I share my frameworks for building a secure offline AI tool that can analyze your bank statements. By following along, you’ll gain the skills to create distributable offline tools that integrate OS LMs.
You don’t need to be a coding expert — just have a clear plan and know how to ask your coding expert (AI) the right questions to guide you through the process.
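The videos walk through the actual build step by step. To give a flavor of the core idea, here is a minimal sketch of the pattern: a local open-source model answering questions about a bank-statement CSV, fully offline. It assumes the llama-cpp-python package and a GGUF model file you have already downloaded (the file names below are placeholders); the videos use their own stack, so treat this purely as an illustration.

```python
# Minimal sketch: query a local open-source LLM about a bank-statement CSV,
# entirely offline. Assumes `pip install llama-cpp-python` and a GGUF model
# already on disk (paths below are placeholders, not from the videos).
import csv

from llama_cpp import Llama

MODEL_PATH = "models/llama-3.1-8b-instruct-q4_k_m.gguf"  # placeholder path
STATEMENT = "bank_statement.csv"                          # date, description, amount

def load_statement(path: str) -> str:
    """Read the CSV and flatten it into plain text for the prompt."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return "\n".join(", ".join(row) for row in rows)

def main() -> None:
    llm = Llama(model_path=MODEL_PATH, n_ctx=8192, verbose=False)
    transactions = load_statement(STATEMENT)
    prompt = (
        "You are a personal finance assistant. Here is a bank statement:\n"
        f"{transactions}\n\n"
        "Summarize total spending per category and flag anything unusual."
    )
    out = llm(prompt, max_tokens=512, temperature=0.2)
    print(out["choices"][0]["text"])

if __name__ == "__main__":
    main()
```

Because the model weights, the data, and the inference all stay on your machine, the statement never leaves your computer, which is the whole point of the offline approach.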
Four short videos
1/4:
Watch videos 2, 3, and 4 with Premium. Next week, there will be another hands-on video!
That’s a wrap! I hope you enjoyed it.
Martin
Get the full experience here: https://mail.generativeai.net/