The other side of AI tokens: most projects are chasing financial gains rather than real-world impact
Written by: Gagra
Compiled by: Xiaobai Navigation coderworld
Abstract
- This is not another article about the “AI + Web3” space. We are optimistic about the merger of the two technologies, but this article is a call to action. Otherwise, the optimism will eventually lose its justification.
- Why? Because developing and running the best AI models requires huge capital expenditures on cutting-edge, often hard-to-obtain hardware, as well as domain-specific R&D. Crowdsourcing through crypto incentives, as most Web3 AI projects do, is not enough to offset the tens of billions of dollars invested by the large companies that have a tight grip on AI development. Given the hardware constraints, this may be the first big software paradigm that smart and creative engineers outside incumbent organizations lack the resources to disrupt.
- Software is “eating the world” at an ever-increasing rate, and will soon do so exponentially as AI accelerates. As things stand, all of this “cake” flows to the technology giants, while end users (governments, large companies, and of course consumers) grow ever more dependent on their power.
Misaligned incentives
All of this could hardly be unfolding at a less suitable time: 90% of decentralized network participants are busy chasing the easy gains of narrative-driven development. Yes, developers are following investors into our industry, not the other way around. The motivations range from openly admitted to more subtly subconscious, but the narratives that form around them, and the markets they move, drive a large share of Web3 decision-making. Participants are so immersed in a reflexive bubble that they fail to notice the world outside, except for the narratives that push the cycle further along. And AI is clearly the biggest of those narratives, since it is going through a boom of its own.
We have spoken to dozens of teams at the intersection of AI x Crypto and can confirm that many of them are very capable, mission-driven, and passionate about building projects. But human nature is such that when faced with temptations, we tend to give in to them and then rationalize those choices afterwards.
Easy access to liquidity has been a historical curse for the crypto industry, one that has slowed its development and delayed useful adoption by years. It can turn even the most devoted crypto believers toward chasing “easy token money.” The rationalization is that, with more capital from holding tokens, these builders might stand a better chance.
The relatively low maturity of institutional and retail capital gives builders the opportunity to make claims divorced from reality while still benefiting from valuations that treat those claims as already realized. These processes actually lead to moral hazard and capital destruction, and few such strategies work over the long term. Necessity is the mother of invention, and when necessity disappears, so does invention.
This couldn’t have happened at a worse time. While all the smartest tech entrepreneurs, national leaders, and businesses big and small are racing to secure their share of the AI revolution, crypto founders and investors are choosing to “grow fast.” In our view, this is the real opportunity cost.
Web3 AI Market Overview
Taking the above incentives into account, the classification of Web3 AI projects really comes down to:
- Legitimate (further divided into realists and idealists)
- Semi-legitimate
- Fakers
Basically, we think builders have a clear idea of what it would take to keep pace with their Web2 competitors, of which verticals are realistic to compete in, and of which are pipe dreams that can nevertheless be marketed to VCs and an unsophisticated public.
The goal is to be able to compete here and now. Otherwise, the pace of AI development may leave Web3 behind while the world moves toward a dystopian Web4 of Western corporate AI versus Chinese state AI. Those who cannot become competitive quickly, and who rely on distributed technologies catching up over a longer time horizon, are too optimistic to be taken seriously.
Obviously, this is a very rough generalization, and even the fakers category contains at least a few serious teams (and perhaps more). But this article is a call to action, so we are not trying to be objective; we are trying to instill in the reader a sense of urgency.
Legitimate
Middleware that “brings AI to the blockchain.” The founders behind these solutions, though few in number, understand that decentralized training or inference of the models users actually want is infeasible, or even impossible, for now. For them, connecting the best centralized models to an on-chain environment, so that it can benefit from sophisticated automation, is a good enough first step. At present, hardware enclaves (TEEs, i.e. “air-gapped” processors) that provide API access points, two-way oracles (for bidirectionally indexing on-chain and off-chain data), and verifiable off-chain compute environments for agents appear to be the best solutions. We also think coprocessor architectures that use zero-knowledge proofs (ZKPs) to snapshot state changes, rather than verify full computations, are feasible in the medium term.
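To make the pattern concrete, here is a minimal sketch of that middleware loop in Python: read a pending task from a contract, query a centralized model off-chain, and hand the answer back. Everything specific here is hypothetical (the RPC endpoint, contract address, ABI, and model API are placeholders), and a production version would run inside a TEE and post an attested result back on-chain.

```python
# Minimal sketch of "bring AI to the blockchain" middleware:
# read on-chain state, query a centralized model off-chain,
# and produce a result an on-chain contract could consume.
# All addresses, ABIs, and endpoints below are hypothetical.
import json
import requests
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC

VAULT_ABI = json.loads("""[
  {"name": "pendingRequest", "inputs": [], "outputs": [{"type": "string"}],
   "stateMutability": "view", "type": "function"}
]""")
vault = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder
    abi=VAULT_ABI,
)

def serve_one_request() -> str:
    # 1. Oracle leg: read the pending task from the chain.
    prompt = vault.functions.pendingRequest().call()

    # 2. Off-chain leg: call a centralized model API (hypothetical endpoint).
    resp = requests.post(
        "https://models.example.org/v1/complete",
        json={"prompt": prompt},
        timeout=30,
    )
    answer = resp.json()["text"]

    # 3. In a real deployment this step would run inside a TEE, and the
    #    answer would be submitted back on-chain with an attestation so
    #    the contract can trust it without trusting this operator.
    return answer
```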
A more idealistic approach to the same problem seeks to verify off-chain inference so as to bring it in line with on-chain computation in terms of trust assumptions. In our view, the goal should be to let AI perform tasks both on-chain and off-chain within a single coherent runtime environment. However, most proponents of verifiable inference talk about vague goals like “trusting the model weights,” which may not become important for years, if ever. Recently, founders in this camp have started exploring alternative approaches to verifying inference, though all of them originally grew out of ZKPs. While many smart teams are working on so-called ZKML, they are taking too big a gamble in expecting cryptographic optimizations to outpace the complexity and compute requirements of AI models. We therefore believe they are not currently fit to compete. Still, some recent advances are interesting and should not be ignored.
Semi-legitimate
Consumer applications that wrap closed- and open-source models (e.g. Stable Diffusion or Midjourney for image generation). Some of these teams were first to market and have traction with actual users. So it would be unfair to blanket-label them fakers, but only a few are thinking deeply about how to evolve their underlying models in a decentralized way or innovating in incentive design. There are some interesting governance/ownership experiments in this regard. But most projects in this category simply add a token on top of a centralized wrapper, such as the OpenAI API, to capture a valuation premium or provide faster liquidity for the team.
What neither of the two camps above solves is training and inference of large models in a decentralized environment. Currently, there is no way to train foundation models in a reasonable time without relying on tightly connected hardware clusters. Given the level of competition, “reasonable time” is the key factor.
There has been some promising research recently; in theory, approaches such as differential data flow could be extended to distributed computing networks to increase their capacity in the future (as networking capabilities catch up with data-flow requirements). But competitive model training still requires communication between localized clusters rather than single distributed devices, plus cutting-edge compute (against which retail GPUs are increasingly uncompetitive).
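A quick, illustrative calculation shows why tightly connected clusters remain a hard requirement. The numbers below are assumptions chosen for the sketch (a 70B-parameter model, fp16 gradients, a ring all-reduce approximation), not measurements of any real system:

```python
# Back-of-envelope: why training over the internet is bandwidth-bound.
# Data-parallel training must synchronize gradients every step; a ring
# all-reduce moves roughly 2x the gradient size per worker per step.
# All figures here are illustrative assumptions.

PARAMS = 70e9            # assume a 70B-parameter model
BYTES_PER_GRAD = 2       # fp16 gradients
grad_bytes = PARAMS * BYTES_PER_GRAD   # ~140 GB of gradients per step
traffic_per_step = 2 * grad_bytes      # ring all-reduce approximation

links = {
    "datacenter interconnect (~900 GB/s, NVLink-class)": 900e9,
    "fast consumer internet (~1 Gbps)": 1e9 / 8,
}
for name, bytes_per_s in links.items():
    print(f"{name}: {traffic_per_step / bytes_per_s:,.1f} s per sync")

# A tightly coupled cluster syncs in a fraction of a second; over the
# open internet the same sync takes upwards of half an hour per step,
# which is why "tightly connected clusters" remain a hard requirement.
```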
There has also been recent progress in localizing inference (one of the two components that need decentralizing) by shrinking model sizes, but no existing Web3 protocols are taking advantage of it.
The problem of decentralized training and inference logically brings us to the last of the three camps, which is the most important one, and therefore the most emotionally triggering for us.
Fakers
Infrastructure applications concentrated mainly in the realm of decentralized servers, offering bare-metal hardware or decentralized model training/hosting environments. There are also software infrastructure projects pushing protocols such as federated learning (decentralized model training), or merging software and hardware components into a single platform where people can essentially train and deploy their decentralized models end to end. Most of them lack the sophistication required to actually solve the stated problems; the naive idea of “token incentives + market tailwinds” prevails here. None of the solutions we have seen in either public or private markets can meaningfully compete here and now. Some may yet develop into viable (but niche) products, but what we need right now are fresh, competitive solutions, and those can only come from innovative designs that address the bottlenecks of distributed computing. In training, not only speed but also verifiability of completed work and coordination of training workloads are major problems, and these add to the bandwidth bottleneck.
We need a set of competitive, truly decentralized foundation models, and those require decentralized training and inference to work. If computers become intelligent while AI stays centralized, there will be no world computer to speak of, short of some dystopian version of one.
Training and inference are at the core of AI innovation. As the rest of the AI world moves toward ever tighter architectures, Web3 needs orthogonal solutions to compete, because head-on competition is becoming less and less viable.
The scale of the problem
It all comes down to compute. The more you throw at the problem, the better the results, in both training and inference. Yes, there are tweaks and optimizations, the compute itself is not homogeneous, and there are all kinds of new ways to overcome the bottlenecks of traditional von Neumann processing units. But in the end, it all reduces to how many matrix multiplications you can perform, over how large a chunk of memory, and how fast.
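As a back-of-envelope illustration of this point, the sketch below counts matrix-multiplication FLOPs for a toy model and compares two classes of hardware. Every figure in it (layer sizes, layer count, throughput rates) is an assumption chosen for illustration, not a real model or device spec:

```python
# Back-of-envelope: training cost is dominated by matrix multiplications.
# A matmul of an (m x k) by (k x n) matrix costs roughly 2*m*k*n FLOPs.
# All figures below are illustrative assumptions, not real model specs.

def matmul_flops(m: int, k: int, n: int) -> float:
    return 2.0 * m * k * n

# Assume a toy "layer" multiplying a batch of 4096 tokens by an
# 8192 x 8192 weight matrix, stacked 80 layers deep:
flops_per_pass = 80 * matmul_flops(4096, 8192, 8192)

# One training step is commonly approximated as ~3x a forward pass
# (forward + backward):
flops_per_step = 3 * flops_per_pass

# Hardware sustaining ~1e15 FLOP/s (roughly a modern datacenter
# accelerator at good utilization) versus ~1e13 FLOP/s (retail GPU):
for name, rate in [("datacenter accelerator", 1e15), ("retail GPU", 1e13)]:
    print(f"{name}: {flops_per_step / rate:.2f} s per step")

# The ~100x per-device gap then compounds with interconnect speed,
# which is why loosely coupled retail hardware struggles on training.
```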
This is why we’re seeing such a strong datacenter buildout from the so-called “hyperscalers,” all of whom are seeking to create a full stack: powerful AI models at the top and the hardware that powers them at the bottom. OpenAI (models) + Microsoft (compute), Anthropic (models) + AWS (compute), Google (both), and Meta (increasingly both, by doubling down on datacenter buildout). There are many more nuances, interacting dynamics, and players, but we won’t get into all of them here. The big picture is that the hyperscalers are investing unprecedented billions in datacenter buildout, creating synergies between their compute and AI offerings and anticipating huge returns as AI proliferates across the global economy.
Let's look at just the levels of build-out expected by these four companies this year:
- Meta expects capital expenditures of $30-37 billion in 2024, likely skewed heavily toward data centers.
- Microsoft’s capex in 2023 was around $11.5 billion, and it is rumored to be investing another $40-50 billion in 2024-25! Some of this can be inferred from the enormous data center investments announced in individual countries alone: $3.2 billion in the UK, $3.5 billion in Australia, $2.1 billion in Spain, €3.2 billion in Germany, $1 billion in Georgia, and $10 billion in Wisconsin. And these are just some of the regional investments in its network of 300 data centers across more than 60 regions. There are even rumors that Microsoft could spend another $100 billion on a supercomputer for OpenAI!
- Amazon’s leadership expects its capital expenditures to rise significantly in 2024 from the $48 billion spent in 2023, driven primarily by the expansion of AWS infrastructure for AI.
- Google spent $11 billion expanding its servers and data centers in the fourth quarter of 2023 alone. The company acknowledged that these investments were made to meet expected AI demand, and it expects both the pace and the total of its infrastructure spending to increase significantly in 2024 because of AI.
And here is NVIDIA’s own view of how much will be spent on AI hardware: CEO Jensen Huang has been touting $1 trillion of spending on AI acceleration over the coming years, a forecast he recently doubled to $2 trillion, purportedly because of the interest he is seeing from sovereign players. Analysts at Altimeter expect global AI-related data center spending to reach $160 billion in 2024 and over $200 billion in 2025.
Now compare these numbers to what Web3 offers independent data center operators as an incentive to scale up capital expenditures on the latest AI hardware:
- The total market capitalization of all Decentralized Physical Infrastructure (DePIN) projects currently stands at around $40 billion, in relatively illiquid and mostly speculative tokens. Essentially, the market cap of these networks is an upper-bound estimate of the total capital expenditure of their contributors, since the buildout is incentivized with the tokens themselves. However, the current market cap is of little use here, as it has already been issued.
- So let’s assume that another $80 billion (2x the current value) of private and public DePIN token market cap enters the market as incentives over the next 3-5 years, and let’s assume all of it goes to AI use cases.
Even if we spread this very rough estimate over 3 years and compare its dollar value with the cash the hyperscalers are spending in 2024 alone, it becomes clear that throwing token incentives at a pile of “decentralized GPU network” projects will not be enough.
Billions of dollars more in investor demand would be needed to absorb these tokens, since the operators of these networks sell large quantities of the coins they mine to cover costs such as capex. Billions more still would be needed to push the tokens’ value up and incentivize buildout growth beyond the hyperscalers’.
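To put the gap in concrete terms, here is a toy calculation based only on the figures quoted above. The hyperscaler total is our rough aggregation of those 2024 figures (with the Google number naively annualized from its Q4 2023 spend), not an official statistic:

```python
# Toy comparison of assumed DePIN token incentives vs. hyperscaler capex,
# using only figures quoted in this article. All values in $ billions.

depin_incentives_total = 80.0   # assumed new token incentives (see above)
years = 3                        # optimistic end of the 3-5 year range
depin_per_year = depin_incentives_total / years        # ~26.7 B/yr

# Very rough 2024 hyperscaler capex, from the figures above:
meta = 33.5        # midpoint of Meta's $30-37B guidance
microsoft = 45.0   # midpoint of the rumored $40-50B
amazon = 48.0      # at least 2023's level, expected to rise
google = 44.0      # assumption: Q4 2023 ($11B) annualized, expected to grow

hyperscaler_per_year = meta + microsoft + amazon + google  # ~170 B/yr

print(f"DePIN incentives: ~${depin_per_year:.0f}B/yr")
print(f"Hyperscalers:     ~${hyperscaler_per_year:.0f}B/yr")
print(f"Ratio:            ~{hyperscaler_per_year / depin_per_year:.0f}x")
# A severalfold gap per year, before counting the extra investor demand
# needed to absorb the token sales by network operators.
```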
However, anyone with a good understanding of how most Web3 servers currently run will expect that a large share of “decentralized physical infrastructure” actually runs on the cloud services of those same hyperscalers. Of course, the surge in demand for GPUs and other AI-specialized hardware is also driving more supply, which should eventually make renting or buying from the clouds cheaper. At least that is the expectation.
But also consider this: NVIDIA now has to prioritize among its customers for its latest-generation GPUs. At the same time, NVIDIA is starting to compete with the largest cloud providers on their own turf, offering AI platform services to enterprise customers already locked into those hyperscalers. This will eventually force it either to build its own data centers over time (essentially eroding the lucrative margins it enjoys now, so this is unlikely) or to significantly limit its AI hardware sales to its partner cloud providers.
In addition, NVIDIA’s competitors have launched AI-specific hardware of their own, mostly built on the same TSMC processes NVIDIA uses. So essentially every AI hardware company is competing for TSMC’s capacity, and TSMC, too, has to prioritize certain customers. Samsung, and potentially Intel (which is trying to get back into cutting-edge chip manufacturing soon), may be able to absorb some of the extra demand, but TSMC produces most AI-related chips today, and scaling and calibrating cutting-edge chip manufacturing (3nm and 2nm) takes years.
On top of that, all cutting-edge chip manufacturing currently happens in Taiwan (TSMC) and South Korea (Samsung), just across the Taiwan Strait from mainland China, and the risk of military conflict could become reality before the facilities now being built in the United States to offset this (which are not expected to produce next-generation chips for another few years) can come online.
Finally, China, largely cut off from the latest generation of AI hardware by U.S. restrictions on NVIDIA and TSMC, is competing for whatever compute remains, just like the Web3 DePIN networks. Unlike Web3, though, Chinese companies have their own competitive models, in particular the large language models (LLMs) of companies like Baidu and Alibaba, which require large quantities of previous-generation hardware to run.
So, for one or a combination of the above reasons, there is a non-trivial risk that the hyperscale cloud providers restrict outside access to their AI hardware as the war for AI dominance intensifies and takes precedence over their cloud businesses. Basically, this is a scenario in which they take all AI-related cloud capacity for themselves, no longer offering it to anyone else, while also hoovering up all the latest hardware. If that happens, the remaining compute supply will be in even higher demand from the other big players, including sovereign states. Meanwhile, consumer-grade GPUs keep getting less competitive.
Obviously, this is an extreme scenario, but the prize is so large that the big players will not back down even if hardware bottlenecks persist. It would leave decentralized operators, such as tier-2 data centers and owners of retail-grade hardware (who make up the majority of Web3 DePIN providers), out of the competition.
The other side of the coin
Before crypto founders realize it, the AI giants are watching cryptocurrency closely. Government pressure and competition may force them to adopt crypto in order to avoid being shut down or subjected to heavy regulation.
One of the earliest public hints of this was the recent resignation of Stability AI’s founder in order to start “decentralizing” his company. In earlier public appearances he had made no secret of plans to launch a token, but only after the company’s successful IPO, which somewhat gives away the real motives behind the intended move.
Likewise, while Sam Altman plays no operational role in Worldcoin, the crypto project he co-founded, its token certainly trades like a proxy for OpenAI. Only time will tell whether there is a path to connecting the free-internet-money project with the AI R&D project, but the Worldcoin team seems aware that the market is testing this hypothesis.
It makes sense to us that the AI giants might explore different paths to decentralization. The problem we see is that Web3 has yet to come up with a meaningful solution. “Governance tokens” are largely a meme, and only tokens that explicitly avoid direct links between asset holders and the development and operation of their networks, such as $BTC and $ETH, are truly decentralized today.
The same (dis)incentives that slow technological development also hamper the development of new designs for governing crypto networks. Startup teams simply slap a “governance token” label on their product in the hope of figuring it out along the way, only to end up trapped in the “governance theater” around resource allocation.
Conclusion
The AI race is on, and everyone is taking it very seriously. We cannot find holes in the big tech companies’ thinking: more compute means better AI, and better AI means lower costs, new revenue, and greater market share. To us, that means the bubble is justified, but all the fakers will still be flushed out in the inevitable shakeouts.
Centralized big-corporate AI is dominating the field, and legitimate startups are finding it hard to keep up. The Web3 space is late to the game but has joined the race. The market rewards crypto AI projects far too generously compared with their Web2 counterparts, which has shifted founders’ attention from shipping products to driving token appreciation at a critical moment, just as the window of opportunity to catch up is closing fast. So far, no orthogonal innovation has emerged here that would sidestep scaling compute as the way to compete.
There is a credible open-source movement around consumer-facing models, initially pushed forward by a few centralized players (e.g. Meta, Stability AI) that chose to compete with their larger closed-source rivals for market share. The community is catching up and putting pressure on the leading AI companies. These pressures will continue to affect closed-source AI development, but they will not be material until open source truly catches up. This is another major opportunity for the Web3 space, but only if it solves decentralized model training and inference.
So, despite the ostensible opening for “classic” disruptors, the reality is far from it. AI is primarily about compute, and absent breakthrough innovation in the next 3-5 years, that will not change, so compute will be decisive in determining who controls and directs AI development.
Even with demand driving supply-side efforts, the compute market itself cannot “let a thousand flowers bloom”; competition among manufacturers is constrained by structural factors such as chip fabrication and economies of scale.
We are optimistic about human ingenuity and certain that there are smart, noble people out there who will try to solve the AI problem space in a way that benefits the free world rather than top-down corporate or government control. But the odds look very slim, and it is a speculative game at best. Meanwhile, Web3 founders are busy with financial gains rather than real-world impact.