The impact of DeepSeek on the upstream and downstream protocols of Web3 AI


Author: Kevin, BlockBooster

TLDR:

DeepSeek is beneficial to the model layer and the application layer in the industry's upstream and downstream, but has a negative impact on the computing power protocols in the infrastructure layer;

The DeepSeek rally inadvertently burst the last bubble in the Agent track, and DeFAI is the most likely place for a rebirth;

The zero-sum game of project financing may be coming to an end; a new financing model of community launch plus a small number of VCs may become the norm.

The shock of DeepSeek will reverberate through the upstream and downstream of the AI industry this year. DeepSeek has shown that household consumer-grade graphics cards can complete large-model training tasks that previously only high-end GPUs could bear. The first moat around AI development, computing power, has begun to collapse. With algorithmic efficiency improving at roughly 68% per year while hardware performance climbs only linearly along Moore's Law, the valuation models entrenched over the past three years no longer apply. The next chapter of AI will be opened by open-source models.

Although Web3's AI protocols are completely different from Web2's, they are inevitably affected by DeepSeek, which will create new use cases across the Web3 AI stack: the infrastructure, middleware, model, and application layers.

1. Clarifying the collaboration between upstream and downstream protocols

Through analysis of technical architecture, functional positioning, and actual use cases, I divide the entire ecosystem into an infrastructure layer, a middleware layer, a model layer, and an application layer, and sort out their dependencies:


1. Infrastructure layer

The infrastructure layer provides decentralized underlying resources (computing power, storage, L1). Computing power protocols include Render, Akash, io.net, etc.; storage protocols include Arweave, Filecoin, Storj, etc.; L1s include NEAR, Olas, Fetch.ai, etc.

Computing power protocols support model training, inference, and framework operation; storage protocols preserve training data, model parameters, and on-chain interaction records; L1s optimize data-transmission efficiency and reduce latency through specialized nodes.

2. Middleware layer

The middleware layer is the bridge connecting the infrastructure and upper-level applications, providing development frameworks, data services, and privacy protection. Data annotation protocols include Grass, Masa, Vana, etc.; development framework protocols include Eliza, ARC, Swarms, etc.; privacy computing protocols include Phala, etc.

The data service layer provides fuel for model training; development frameworks rely on the computing power and storage of the infrastructure layer; and the privacy computing layer protects data security during training and inference.

3. Model layer

The model layer is used for model development, training, and distribution; the representative open-source model training platform is Bittensor.

The model layer relies on the computing power of the infrastructure layer and the data of the middleware layer; models are deployed on-chain through development frameworks; and model marketplaces deliver training results to the application layer.

4. Application layer

The application layer consists of AI products aimed at end users. Agents include GOAT, AIXBT, etc.; DeFAI protocols include Griffain, Buzz, etc.

The application layer calls the pre-trained models of the model layer, relies on the privacy computing of the middleware layer, and, for complex applications, requires real-time computing power from the infrastructure layer.
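The four-layer stack and its dependencies described in this section can be summarized in a small sketch. The layer-to-protocol mapping follows this article's own categorization; the data structure and helper function are purely illustrative:

```python
# Illustrative map of the Web3 AI stack described above. Protocol names
# come from the article; "depends_on" encodes the dependencies between layers.
WEB3_AI_STACK = {
    "infrastructure": {
        "compute": ["Render", "Akash", "io.net"],
        "storage": ["Arweave", "Filecoin", "Storj"],
        "l1": ["NEAR", "Olas", "Fetch.ai"],
        "depends_on": [],
    },
    "middleware": {
        "data_annotation": ["Grass", "Masa", "Vana"],
        "dev_frameworks": ["Eliza", "ARC", "Swarms"],
        "privacy_compute": ["Phala"],
        "depends_on": ["infrastructure"],
    },
    "model": {
        "open_source_training": ["Bittensor"],
        "depends_on": ["infrastructure", "middleware"],
    },
    "application": {
        "agents": ["GOAT", "AIXBT"],
        "defai": ["Griffain", "Buzz"],
        "depends_on": ["model", "middleware", "infrastructure"],
    },
}

def transitive_deps(layer, stack=WEB3_AI_STACK):
    """Return every layer a given layer ultimately depends on."""
    seen = set()
    frontier = list(stack[layer]["depends_on"])
    while frontier:
        dep = frontier.pop()
        if dep not in seen:
            seen.add(dep)
            frontier.extend(stack[dep]["depends_on"])
    return seen

print(sorted(transitive_deps("application")))
# → ['infrastructure', 'middleware', 'model']
```

The point of the sketch is simply that the application layer sits at the top of the dependency chain: a shock to any lower layer, as DeepSeek delivers to computing power, propagates upward.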

2. Negative impact on decentralized computing power

According to sampling surveys, about 70% of Web3 AI projects actually call OpenAI or centralized cloud platforms; only 15% of projects use decentralized GPUs (such as Bittensor's subnet models); and the remaining 15% run hybrid architectures (sensitive data processed locally, common tasks sent to the cloud).

The actual usage rate of decentralized computing power protocols is far lower than expected and does not match their market values. There are three reasons: Web2 developers keep their original toolchains when migrating to Web3; decentralized GPU platforms have not yet achieved a price advantage; and some projects evade data-compliance review in the name of "decentralization" while their computing power still relies on centralized clouds.

AWS/GCP account for 90%+ of AI computing power; by comparison, Akash's equivalent computing power is only 0.2% of AWS's. The moats of centralized cloud platforms include cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms offer Web3-improved versions of these technologies, but suffer defects that cannot be fixed: communication latency between distributed nodes is 6 times that of centralized clouds, and the toolchain is split, since PyTorch/TensorFlow do not natively support decentralized scheduling.

DeepSeek reduces computing power consumption by 50% through sparse training, and its dynamic model pruning enables consumer-grade GPUs to train 10-billion-parameter models. Short-term market expectations for high-end GPU demand have fallen sharply, and the market potential of edge computing has been revalued. As shown in the figure above, before DeepSeek appeared, the vast majority of protocols and applications in the industry used platforms such as AWS; only a few use cases were deployed on decentralized GPU networks, and those valued the latter's price advantage in consumer-grade computing power while ignoring the impact of latency.
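As a rough illustration of why pruning shrinks the compute footprint, here is a minimal magnitude-pruning sketch. This is a generic technique, not DeepSeek's actual method; the 50% sparsity ratio merely mirrors the figure quoted above:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude weights.

    Generic magnitude pruning: keep the largest (1 - sparsity) fraction of
    weights by absolute value. Sparse weight matrices need fewer multiply-adds,
    which is one reason pruned models can fit on consumer-grade GPUs.
    """
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest |w|
    mask = np.abs(weights) > threshold            # drop everything at or below it
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.5)
print(f"nonzero before: {np.count_nonzero(w)}, after: {np.count_nonzero(pruned)}")
```

Real systems combine pruning with sparse kernels and retraining to recover accuracy; the sketch only shows where the compute savings come from.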

This situation may worsen with DeepSeek's arrival. DeepSeek removes the constraints on long-tail developers, and low-cost, efficient inference models will spread at unprecedented speed; indeed, many of the centralized cloud platforms mentioned above have already begun deploying DeepSeek. The sharp drop in inference costs will give birth to a large number of front-end applications, and these applications have enormous demand for consumer-grade GPUs. Facing this coming market, centralized cloud platforms will launch a new round of competition for users, competing not only with the leading platforms but also with countless small centralized clouds, and the most direct way to compete is to cut prices. It is foreseeable that 4090 prices on centralized platforms will come down. This is a catastrophe for Web3's computing power platforms. When price is no longer the latter's only moat, and computing power platforms across the industry are forced to cut prices as well, the result is that io.net, Render, and Akash cannot bear it. A price war would destroy their last remaining valuation ceiling, and the death spiral of falling revenue and user churn may force decentralized computing power protocols to find a new direction.

3. Implications for upstream and downstream protocols

As shown in the figure, I believe DeepSeek will affect the infrastructure layer, model layer, and application layer differently. Starting with the positive impacts:

The application layer will benefit from the sharp drop in inference costs: more applications can affordably keep Agent applications online around the clock and complete tasks in real time;

At the same time, DeepSeek's low model overhead lets DeFAI protocols form more complex swarms, with thousands of agents serving a single use case. The division of labor among agents can be extremely fine-grained and clear, which greatly improves the user experience and prevents user input from being incorrectly decomposed and executed by the model;

Developers at the application layer can fine-tune models, feeding DeFi-related AI applications price feeds, on-chain data and analysis, and protocol governance data, without having to pay high license fees.
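The fine-grained division of labor in such a swarm might be sketched as a simple intent router that splits a user instruction into subtasks, each handled by a narrowly specialized agent. All agent names and the routing rule below are hypothetical illustrations, not any existing protocol's API:

```python
# Hypothetical sketch: decompose a user instruction into steps, each routed
# to a specialist agent, so no single model interprets the whole request.
SPECIALISTS = {
    "swap": "swap-agent",        # executes token swaps
    "bridge": "bridge-agent",    # moves assets across chains
    "stake": "staking-agent",    # manages staking positions
    "analyze": "research-agent", # pulls on-chain data and analysis
}

def route(instruction: str) -> list[tuple[str, str]]:
    """Split an instruction on ', then ' and assign each step to a specialist."""
    plan = []
    for step in instruction.lower().split(", then "):
        agent = next(
            (a for kw, a in SPECIALISTS.items() if kw in step),
            "fallback-agent",
        )
        plan.append((agent, step))
    return plan

plan = route("Swap 1 ETH to USDC, then bridge it to Arbitrum, then stake it")
for agent, step in plan:
    print(f"{agent}: {step}")
```

A production swarm would use model-based intent parsing rather than keyword matching; the sketch only illustrates the idea that narrow, clearly scoped agents reduce the chance of a request being decomposed incorrectly.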

After DeepSeek's debut, the open-source model layer has proven its worth. With high-end models open to long-tail developers, a broad development boom can be stimulated;

The computing power wall built around high-end GPUs over the past three years has been completely breached. Developers have more choices, and a direction for open-source models has been established: in the future, AI models will compete not on computing power but on algorithms. This shift of belief will become a cornerstone of confidence for open-source model developers;

Specialized subnets around DeepSeek will emerge one after another, model parameters under equivalent computing power will grow, and more developers will join the open-source community.

On the negative side:

Infrastructure layer: the objective latency that computing power protocols incur in applications cannot be optimized away, and hybrid networks composed of A100s and 4090s place higher demands on coordination algorithms, which is not where decentralized platforms excel.

4. Bursting the Agent bubble: DeFAI gives birth to new life

Agents are the industry's last AI hope. DeepSeek's emergence lifts the computing power constraints and paints a future of exploding applications. This should have been a huge benefit to the Agent track, but because of the strong correlation between the industry, the US stock market, and the Federal Reserve, the last remaining bubble burst and the track's market value fell to the bottom.

In the wave of AI's integration with the industry, technological breakthroughs and market games have always coexisted. The chain reaction triggered by Nvidia's market-cap swings is like a mirror reflecting the deep dilemma of the industry's AI narrative: from on-chain Agents to DeFAI engines, beneath a seemingly complete ecological map hides the cruel reality of weak technological infrastructure, hollowed-out value logic, and capital dominance. The superficially prosperous on-chain ecosystem conceals hidden ailments: masses of high-FDV tokens competing for limited liquidity, old assets surviving on FOMO sentiment, and developers trapped in PvP involution that drains their capacity to innovate. With incremental capital and user growth hitting a ceiling, the entire industry has fallen into the "innovator's dilemma": eager for breakthrough narratives yet unable to shake off path dependence. This torn state provides a historic opportunity for AI Agents: not merely an upgrade of the technology toolbox, but a reconstruction of the value-creation paradigm.

Over the past year, more and more teams in the industry have found the traditional financing model failing: the routine of giving VCs a small share while tightly controlling the float is no longer sustainable. With VCs tightening their wallets, retail investors refusing to take the other side, and listing thresholds on major exchanges high, a new playbook better suited to the bear market is rising: partnering with top KOLs plus a small number of VCs, a large-proportion community launch, and a cold start at a low market cap.

Innovators represented by Soon and Pump Fun are opening new paths through "community launches": endorsed by top KOLs, they distribute 40%-60% of tokens directly to the community, launch projects at valuations as low as $10 million FDV, and raise millions of dollars. This model builds consensus FOMO through KOL influence, lets the team lock in earnings in advance, and trades high liquidity for market depth. While it gives up the short-term advantage of controlling the float, the team can buy back tokens at low prices in a bear market through compliant market-making. In essence, this is a paradigm migration of the power structure: from a VC-led game of pass-the-parcel (institutions take over, mark up, and dump on retail) to a transparent game of community-consensus pricing, with the project team and the community forming a new symbiosis around the liquidity premium. As the industry enters a cycle of revolutionary transparency, projects clinging to the traditional control logic may become afterimages of a passing era under the wave of power migration.
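Some back-of-the-envelope numbers for the community-launch model described above. Only the 40%-60% community share and the roughly $10 million FDV come from the text; the total supply and raise size are made-up illustrative inputs:

```python
# Back-of-the-envelope for a community launch. The 40%-60% community share
# and the ~$10M FDV come from the article; the total supply and raise size
# below are hypothetical inputs chosen for illustration.
TOTAL_SUPPLY = 1_000_000_000  # hypothetical token supply
FDV = 10_000_000              # $10M fully diluted valuation at launch

price = FDV / TOTAL_SUPPLY    # implied launch price per token

for community_share in (0.40, 0.60):
    community_tokens = TOTAL_SUPPLY * community_share
    community_value = community_tokens * price
    print(f"{community_share:.0%} to community -> "
          f"{community_tokens:,.0f} tokens (${community_value:,.0f} at launch)")

# A multi-million-dollar raise at this FDV corresponds to a sizable slice:
raise_usd = 3_000_000         # hypothetical raise size
print(f"${raise_usd:,} raise = {raise_usd / FDV:.0%} of supply at launch price")
```

The arithmetic makes the trade-off in the text concrete: at a $10M FDV, even a multi-million-dollar raise leaves the community holding the dominant share, which is exactly the control the team gives up in exchange for liquidity.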

The short-term pain in the market only confirms the irreversibility of the technological trend. When AI Agents cut the cost of on-chain interaction by two orders of magnitude, and when adaptive models continuously optimize the capital efficiency of DeFi protocols, the industry may finally see its long-awaited mass adoption. This change does not rely on concept hype or capital force-ripening; it is rooted in technology penetrating real demand, just as the electric revolution was not halted by the bankruptcy of light-bulb companies. Agents will eventually become a genuine gold-rush track after the bubble bursts, and DeFAI may be the fertile ground for that rebirth. When low-cost inference becomes routine, we may soon see use cases in which hundreds of agents are combined into a single swarm. Under equivalent computing power, a significant increase in model parameters ensures that agents in the open-source model era can be fine-tuned more thoroughly, so that even complex user instructions can be decomposed into tasks that individual agents can fully execute. With each agent optimizing on-chain operations, overall DeFi protocol activity and liquidity may rise. More complex DeFi products led by DeFAI will appear, and this is where new opportunities will emerge after the last bubble burst.
