AI meets Crypto, a view from crypto and infra background
<< Envision A Future With Open, Modular And Composable AI Models >>

Special thanks to Lei Tang, Davide Crapis, Leo Chen, Siyuan Han, and Bo Yu for feedback and review.


I left Coinbase in March after spending three years there building the crypto platform for its 100M+ customers, primarily on identity and onboarding solutions for both web2 and web3 products.

In the past two months, without a full-time job, I have had more time with my family, for gardening, for testing Tesla FSD edge cases, and for spending hundreds of hours exploring AI, particularly LLMs. This AI wave seems more real than ever, and given my background in crypto, I have been pondering what it means when AI meets crypto. I asked myself several questions, and below are the answers I have so far.


1: Will AGI Be A Reality? If Yes, When?

It depends on how we define AGI (Artificial General Intelligence). To me, AGI means human-like reasoning, learning, problem solving, and continuous improvement. It's a matter of when, not if, though I won't spend too much time speculating about the timing.

As a matter of fact, in the past 18 months or so, one branch of AI, Large Language Models (LLMs), has made major technology and adoption breakthroughs. ChatGPT reached 100M users in just two months; the previous record holder, TikTok, took nine months. On the flip side, hallucination remains a major technical gap, alongside ethical, societal, regulatory, and geopolitical concerns.

I fine-tuned and ran a few open-source models myself locally on Apple M2 silicon; even the best-performing one, Llama 3 (8B), often responded with "logical but wrong" hallucinated answers.

Figure 1. Quantized Llama 3 8B believes that Sam Altman is also the CEO of FTX, the now-bankrupt crypto exchange.

Obviously there is more work to do before AGI; it might take years or even decades. Nevertheless, AI gets more powerful every day: it can do more types of work and do them more intelligently. See the diagram from The Internet of Agents by Davide Crapis.

Figure 2. Left: a concept timeline of AI evolution with increasing performance. Right: block diagram of activities for humans and different forms of AI.

2: What Are The Roadblocks To AGI?

We often hear from people like Sam Altman and Elon Musk that AI scaling laws will soon or eventually be constrained by compute, energy, and/or training data before we reach AGI. We certainly need a longer-term vision and roadmap to revolutionize the GPU and energy industries (e.g. fusion energy), and to (use AI to) generate unlimited, diversified, high-quality synthetic training data.

However, these seemingly new problems can potentially be solved, or at least mitigated, by existing solutions. In the short to mid term, this is to me an efficiency game at every step of the AI pipeline. We should and must do performance profiling and optimization on training, fine-tuning, and inference: more efficient data and model parallelism, cutting every waste in GPU memory and bandwidth consumption, quantization, batched and offline inference processing, and so on. The good news is that Meta's recent, well-received Llama 3 is a perfect example of efficiency meeting performance.
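As a concrete illustration of one of these efficiency levers, here is a minimal sketch of symmetric int8 weight quantization in pure Python. It is illustrative only: real systems (e.g. llama.cpp's 4-bit formats) use block-wise, lower-bit schemes, and the weight values below are made up.

```python
# Minimal sketch of symmetric int8 quantization: store weights as
# 1-byte integers plus one float scale, instead of 4-byte floats.

def quantize_int8(weights):
    """Map float weights to int8 range [-127, 127] with a single scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    return [v * scale for v in q]

weights = [0.02, -1.3, 0.75, 0.001, -0.42, 1.29]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)

# int8 storage is 4x smaller than float32 per weight, at the cost of a
# rounding error bounded by half the scale step.
max_err = max(abs(w - r) for w, r in zip(weights, restored))
assert all(-127 <= v <= 127 for v in q)
assert max_err <= scale / 2 + 1e-12
```

The same idea, applied block-wise and at 4 bits, is what lets an 8B-parameter model fit comfortably in laptop memory.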

Another seemingly new problem is the global GPU and energy shortage; my understanding is that it is more a distribution problem than an overall supply problem. We should and can find existing solutions to take advantage of idle GPUs across data centers and surplus energy across regions, without a long-term GPU redesign and a revamp of the global energy network. Crypto could help here (see the discussion below).

In addition, there are broader ethical, social, regulatory, geopolitical, and interdisciplinary implications and challenges, which I will leave to future discussions.


3: Large Or Small Language Model?

Large Language Models are getting larger and larger. GPT-4 reportedly has 1.75 trillion parameters, was fed 15 trillion tokens, and was trained on 25,000 NVIDIA A100 GPUs. The soon-to-come GPT-5 will certainly outnumber these; people have speculated a massive $2.5B training cost for GPT-5.

However, models don't always have to get bigger. Instead of a monolithic approach, we could adopt a modular approach, just like any other modern computer system. We could use a handful of medium-to-large models, e.g. Llama 3 70B, as base models, and build much smaller domain-specific incremental models on top of them. During inference, these incremental models can share and connect through the same base model, so one application can access multiple incremental models and base models at the same time. Not every company would then have to build its own super-sized, costly, and largely overlapping generalized model; instead, an open community could keep expanding overall model capability by building and sharing base models and domain-specific incremental models on top of each other (see the diagram below).

Figure 3: Smaller domain-specific models built on top of, and connected through, the same shared base model.
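The "small incremental model on a shared frozen base" idea is close in spirit to parameter-efficient fine-tuning techniques such as LoRA. Below is a toy pure-Python sketch of the underlying math, with made-up dimensions and weights; it is a conceptual illustration, not how production adapters are implemented.

```python
# Toy sketch of a LoRA-style incremental model: the frozen base weight
# matrix W is shared across domains, while each domain contributes only
# a tiny low-rank update B @ A applied on top of it.

def matvec(m, x):
    return [sum(mij * xj for mij, xj in zip(row, x)) for row in m]

def vec_add(a, b):
    return [ai + bi for ai, bi in zip(a, b)]

d, r = 8, 2                                   # model width, adapter rank (r << d)
W = [[0.1 if i == j else 0.0 for j in range(d)] for i in range(d)]  # frozen base
A = [[0.01] * d for _ in range(r)]            # trainable, r x d
B = [[0.5] * r for _ in range(d)]             # trainable, d x r

x = [1.0] * d
base_out = matvec(W, x)                       # shared base computation
adapter_out = matvec(B, matvec(A, x))         # domain-specific delta: B(Ax)
y = vec_add(base_out, adapter_out)

# The adapter adds only 2*r*d parameters versus d*d for the base layer,
# which is why many small domain models can share one base cheaply.
adapter_params = sum(len(row) for row in A) + sum(len(row) for row in B)
assert adapter_params == 2 * r * d
assert len(y) == d
```

At inference time, many such adapters can be kept resident and swapped per request while the base model's weights stay fixed in memory.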

Moreover, companies like Apple and Microsoft are experimenting in a different direction. Instead of LLMs, they are building SLMs (Small Language Models), e.g. Apple's OpenELM (270M, 450M, 1.1B, 3B) and Microsoft's Phi-3 Mini (3.8B), both trying to balance model size and performance. They may not be as powerful or general as LLMs like GPT-4 or Claude 3 Opus, let alone the upcoming GPT-5/6, but they could be very suitable for cases like edge AI on smaller IoT devices without a reliable internet connection, for privacy-sensitive scenarios like Amazon Alexa or Google Home, or for highly regulated industries where data needs to be kept on premises. SLM inference can be hosted 100% on the edge, e.g. on a mobile device, without touching any remote server or the internet at all.


4: Does AI Need Crypto? And Vice Versa?

While AI does not inherently "need" crypto to function, and vice versa, the integration of these technologies could lead to significant advances in both fields. Each can enhance the capabilities and address the limitations of the other, potentially leading to more robust, efficient, and trustworthy systems.

What can Crypto do for AI?

  • Decentralized Data and Compute for AI Training: Crypto and blockchain technology can facilitate the creation of decentralized data and GPU marketplaces. By using crypto to incentivize data and GPU sharing, AI developers can access a wider array of high-quality real-world data and precious underutilized GPUs, both crucial for training more robust and generalized models.
  • Incentivizing Model Development and Sharing: For now, there is no viable business model for open-source AI models, so the top model developers, e.g. OpenAI and Anthropic, go closed source and monetize their models by serving customers directly via API. Crypto can build incentives for open-source model developers by tokenizing models, enforcing provenance, and sharing profits between model developers, model hosting providers, and marketplaces.
  • Counter Bots and Deep Fakes: AI has left the Turing test failing at its job of distinguishing bots from humans. In a world with an abundance of GenAI-made content, crypto can drive proof of personhood, track provenance, and hence enforce accountability. World ID is a step in the right direction.
  • Enhanced Security and Privacy: Crypto-related technologies, particularly those involving encryption, can enhance the security and privacy of AI systems. For instance, homomorphic encryption allows AI systems to process encrypted data, including input and output, without needing to decrypt it, thus preserving privacy.
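As a concrete (toy) example of that last point, the Paillier cryptosystem is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can aggregate encrypted values without ever decrypting them. The sketch below uses deliberately tiny primes and is nowhere near secure; it only demonstrates the homomorphic property.

```python
# Toy Paillier cryptosystem demonstrating additive homomorphism.
# Real deployments use ~1024-bit primes and a hardened library.
import math
import secrets

p, q = 1789, 1867                 # toy primes, far too small for real use
n = p * q
n2 = n * n
g = n + 1                         # standard simple choice of generator
lam = math.lcm(p - 1, q - 1)
# mu = inverse of L(g^lam mod n^2) mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:    # r must be coprime with n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = 12345, 67890
c_sum = (encrypt(a) * encrypt(b)) % n2    # homomorphic addition on ciphertexts
assert decrypt(c_sum) == a + b
```

Note that Paillier only supports addition on ciphertexts; processing arbitrary AI workloads over encrypted data requires fully homomorphic encryption, which is still far more expensive.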


What can AI do for Crypto?

  • Fraud Detection and Blockchain Security: AI can significantly improve the security of cryptocurrency networks by detecting patterns indicative of fraudulent activities or potential security breaches of smart contracts and wallets. Machine learning models can analyze transactions at a scale and speed that are impractical for humans, identifying anomalies that could indicate malicious behavior.
  • Enable Intent-Based Smart Contracts: LLMs can enable intent-based smart contracts, where users only need to give a contract high-level goals and constraints in natural language, leaving the contract to plan and break them down into machine-friendly tasks and then automatically figure out the right agents and the optimal time to execute.
  • Market Predictions and Trading Bots: AI, especially machine learning and predictive analytics, is extensively used in developing trading bots that can analyze market trends and execute trades at optimal times.
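As a minimal sketch of the fraud-detection bullet, the toy Python below flags outlier transaction amounts with a simple z-score rule. Real systems use far richer features (transaction graphs, timing, counterparties) and trained models; the amounts here are invented.

```python
# Toy anomaly detection on transaction amounts: flag any transaction
# whose amount is more than `threshold` standard deviations from the mean.
import statistics

def flag_anomalies(amounts, threshold=3.0):
    """Return indices of amounts that look anomalous under a z-score rule."""
    mean = statistics.fmean(amounts)
    stdev = statistics.pstdev(amounts)
    return [i for i, a in enumerate(amounts)
            if stdev > 0 and abs(a - mean) / stdev > threshold]

# 20 routine transfers plus one huge outlier, e.g. a drained wallet.
amounts = [42.0, 38.5, 40.1, 41.7, 39.9] * 4 + [50_000.0]
assert flag_anomalies(amounts) == [20]
```

The appeal for on-chain use is exactly the scale argument above: a model can score every transaction in a block, something no human reviewer could do.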


5: What Are The Challenges Of Bringing AI On-chain?

Let’s start with a quick comparison between AI and Crypto for their default settings:

Figure 4: A comparison of AI and Crypto default settings.

As you can see, AI and crypto defaults are different, or even quite opposite, in most dimensions. Bringing AI on-chain presents a set of unique challenges that stem from the fundamental differences in how these technologies are designed and operated. Here are some of the key challenges and possible solutions:

Figure 5: Challenges and possible solutions to bring AI on-chain.

Conclusion (Generated by ChatGPT)

As we stand at the crossroads of two revolutionary technologies, AI and crypto, the potential for transformative change is immense. The integration of AI's deep learning capabilities with blockchain's secure and decentralized framework promises to redefine how we handle data, execute transactions, and build trust in digital interactions. However, as outlined, challenges such as scalability, verifiability, data privacy and regulatory compliance remain formidable but not insurmountable.

While the technical hurdles are significant, the opportunities—for efficiency, security, and new business models—are profound. Innovations like intent-based smart contracts and enhanced security for crypto platforms highlight just the beginning of what's possible when these technologies converge. However, as we push forward, a balanced approach that considers ethical, social, and economic implications is crucial.
