Embracing a Compute-Light Future: The Evolution of Intelligence and AI

We never anticipated the extent to which computers would proliferate, ultimately providing everyone on Earth with access to one. Similarly, we couldn't foresee that GPUs would find applications beyond gaming, leading to a scenario where everyone might possess a supercomputer. As we grapple with the carbon footprint of these advancements, we face the unsettling possibility that each individual could effectively own a personal carbon-generating factory. While the excitement surrounding Large Language Models (LLMs) and their perceived magical abilities to usher in a new era of intelligence is palpable, we must question whether this is truly the intelligent web we aim to create—especially considering the climate crisis we may be exacerbating.

In 2017, when I wrote about AlphaBlock, I introduced the concept of intelligent agents, or "alphabots," powered by an engine capable of forecasting—a hallmark of intelligence. Rather than mere pattern recognition, I emphasized the importance of seeing into the complex future, taking steps ahead as a benchmark of true intelligence. My focus extended beyond finance; I envisioned the convergence of financial and non-financial data, leading to a world where data transforms into enriched assets—a future teeming with trillions of intelligence assets within a predictive ecosystem.

"As intelligence shifts from arbitrary and erratic patterns of human discretionary knowledge-building toward more systematic and organic AI, there arises a need for a new market mechanism to validate, distribute, and reward intelligent processes. Such an intelligent market is built on a Systematic, Scientific, Replicable (SSR) process that is objective, accountable, and can be validated and utilized by the community. This general intelligence, or 'alpha,' should be content-agnostic and context-focused—an alpha process that reconfigures the block of the blockchain into 'AlphaBlock,' an intelligent market mechanism. Traditionally, alpha prediction has been associated with domain-specific content and known as predictive systems that are non-replicable and mostly non-scientific. I define a General AI predictive process that can be integrated into the blockchain block, transforming the blockchain into a multi-purpose predictive tool that self-builds, self-protects, and self-validates. AlphaBlock becomes the essence of everything linked with data predictability, evolving into an intelligence layer on the blockchain and the web. It is a predictive ecosystem that blurs the distinction between financial and non-financial data, ultimately removing barriers between financial and services markets. The blockchain can achieve this evolved state and become an intelligent market if it overcomes three key hurdles: first, securitizing blockchain assets to create new alternative assets and asset classes; second, resolving the incapability of conventional finance to understand risk effectively and enhancing return per unit of risk (outperforming the market) using a General AI process; third, offering a better mechanism to address currency risk than what is currently provided by existing fiat currencies and cryptocurrencies."

I also discussed the concept of compute. For me, this future world was not data-heavy but data-light; not compute-intensive but compute-light. I imagined this because nature itself operates as a light compute engine, not a heavy one—nature doesn't require GPUs. Therefore, a truly intelligent world would simulate nature, reducing its data and compute requirements. It would be a world that understands complexity.

"This allows a user to define new problems and solve them. The process does not require a lot of data and is not computationally heavy. The solutions are robust and don't change as the sample set changes. Simply put, the process simplifies the complex, understands the mechanism that generates uncertainty, and thus does a better job of managing the risk inherent in a data point."
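The claim that a compute-light process yields solutions that "don't change as the sample set changes" can be illustrated with a toy sketch. This is my own illustration, not the AlphaBlock method: it compares a simple, robust summary statistic (the median) against a fragile one (the mean) on bootstrap resamples of noisy data with a few outliers, showing that the lighter, robust estimator barely moves when the sample changes.

```python
import random
import statistics

random.seed(42)

# Toy data: a noisy series with a few outliers, standing in for any
# data stream whose generating mechanism we want to summarize.
data = [random.gauss(10, 1) for _ in range(200)] + [50, 60, 55]

def resample(xs):
    """Draw a bootstrap resample (sampling with replacement)."""
    return [random.choice(xs) for _ in xs]

# Re-estimate each summary on many resamples and measure how much
# it moves. A sample-robust process should barely move at all.
medians = [statistics.median(resample(data)) for _ in range(300)]
means = [statistics.mean(resample(data)) for _ in range(300)]

print(f"median spread across resamples: {statistics.pstdev(medians):.3f}")
print(f"mean spread across resamples:   {statistics.pstdev(means):.3f}")
```

The median's spread across resamples is a fraction of the mean's, despite requiring no more compute: robustness to the sample, in this small sense, comes from the choice of process rather than from more data or heavier machinery.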

It's an amazing feeling to imagine a future, write about it, and then see parts of it unfolding before your eyes. I'm pleased to see intelligent agents now handling mundane tasks like replicating fund fact sheets, performing attribution analysis, and building models for assets around the world. I'm enthusiastic about the systematic revolution underway in asset management. If you're not augmenting your investment management process, you're at a significant disadvantage. Systematic approaches are rapidly becoming the foundation of asset management.

However, I'm still waiting to see the middle "S" from the SSR process I previously mentioned—the Scientific aspect. LLMs are statistical; they are not yet scientific in their thinking. For true scientific progress, LLMs need to reduce their compute requirements, which they can't do until they understand the power of fractional data, and that won't happen until they grasp nature's intelligence, which operates in parallel with complexity.

"Then the Web will be a thriving organism. This architecture could potentially drive Web 4.0, the ultra-smart agent that caters to various user needs. Web 4.0 won't be simply artificial; it will be an intelligent web that does not distinguish between domains. When the web evolves to this architecture, its structure becomes more important than its content, and the value of a General AI is unleashed on a universe of data commonalities."

Nature won't provide solutions if we disrupt it by effectively giving everyone a personal carbon factory. The intelligence revolution has nothing to do with the compute revolution. We need to understand data to reduce compute requirements. Only when we become computationally light and energy-efficient will the true intelligence world emerge.

"A General AI that understands nature and is universal is superior to the human brain. Such a General AI should be able to learn and price assets across domains. It is an algorithmic framework that learns as it is trained. It has a global understanding but functions locally. It understands components and groups, inter-domain and intra-domain relationships, convergence and divergence, signal and noise, etc. Intelligent design can ultimately only be driven by intelligence, as only intelligence can improve and enhance itself."

Pal, Mukul, AlphaBlock (November 14, 2017). Available at SSRN: https://ssrn.com/abstract=3070978 or https://dx.doi.org/10.2139/ssrn.3070978

Terry Wallace

President & CEO Black Dog Development

1 month ago

Thanks for sharing

