Author: Henry @IOSG
1. Foreword
In just 3 months, the market value of AI agents has grown dramatically.
In fact, the relationship between artificial intelligence and blockchain has a long history, from the early decentralized model training on the Bittensor subnet to decentralized GPU/computing resource markets such as Akash and io.net , to the current wave of AI x memecoins and frameworks on Solana. Each stage demonstrates the extent to which cryptocurrencies can complement AI through resource aggregation, enabling sovereign AI and consumer use cases.
In the first wave of Solana AI coins, some brought meaningful utility rather than just pure speculation. We see the emergence of frameworks like ai16z’s ELIZA, AI agents such as aixbt that provide market analysis and content creation, or toolkits that integrate AI with blockchain capabilities.
In the second wave of AI, as more tools mature, applications have become key value drivers, and DeFi has become the perfect proving ground for these innovations. To simplify the expression, in this study, we refer to the combination of AI and DeFi as “DeFai”.
According to CoinGecko, DeFai has a market capitalization of approximately $1 billion. Griffain dominates the market with a 45% share, while $ANON holds 22%. The sector began growing rapidly after December 25; over the same period, frameworks and platforms such as Virtuals and ai16z saw strong growth after the Christmas holiday.
▲ Source: Coingecko.com
This is just the first step, DeFai’s potential is far beyond this. Although DeFai is still in the proof-of-concept stage, we cannot underestimate its potential to transform the DeFi industry into a more user-friendly, intelligent, and efficient financial ecosystem by leveraging the intelligence and efficiency that AI can provide.
Before we dive into the world of DeFai, we need to understand how agents actually operate in DeFi/blockchain.
2. How agents work in DeFi systems
An AI agent is a program that performs tasks on behalf of users according to a workflow. At the core of an AI agent is an LLM (large language model) that reacts based on its training or learned knowledge, though this responsiveness is often limited.
Agents are fundamentally different from bots. Bots are typically task-specific, require human supervision, and must operate within predefined rules and conditions. Agents, in contrast, are more dynamic and adaptive, learning autonomously to achieve specific goals.
To create a more personalized experience and more comprehensive responses, an agent can store past interactions in memory, learning from the user's behavioral patterns and adapting its responses to historical context to generate tailored recommendations and strategies.
On blockchains, agents can interact with smart contracts and accounts to handle complex tasks without continuous human intervention. For example, to simplify the DeFi user experience, an agent can perform multi-step bridging and farming in one click, optimize farming strategies for higher returns, execute trades (buy/sell), and conduct market analysis, all autonomously.
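As a concrete illustration, the one-click "bridge then farm" flow above can be sketched as a plan-then-execute loop. This is a simplified, hypothetical sketch: the pool names, fees, and yields are invented, and a real agent would sign and submit on-chain transactions rather than return a plan of strings.

```python
from dataclasses import dataclass

@dataclass
class PoolQuote:
    name: str
    apy: float          # advertised yield, e.g. 0.12 = 12%
    bridge_fee: float   # fraction of principal lost to bridging

def pick_pool(quotes: list[PoolQuote], horizon_years: float = 1.0) -> PoolQuote:
    """Choose the pool with the best net return after bridge costs."""
    def net(q: PoolQuote) -> float:
        return (1 - q.bridge_fee) * (1 + q.apy) ** horizon_years - 1
    return max(quotes, key=net)

def run_agent(principal: float, quotes: list[PoolQuote]) -> list[str]:
    """Plan the multi-step sequence a user would approve with one click."""
    best = pick_pool(quotes)
    steps = []
    if best.bridge_fee > 0:
        steps.append(f"bridge {principal} USDC (fee {best.bridge_fee:.2%})")
    steps.append(f"deposit into {best.name} at {best.apy:.2%} APY")
    return steps

plan = run_agent(1_000, [
    PoolQuote("same-chain pool", apy=0.08, bridge_fee=0.0),
    PoolQuote("cross-chain pool", apy=0.12, bridge_fee=0.003),
])
```

Even in this toy form, the agent weighs bridge costs against extra yield before committing, which is the judgment a manual user would otherwise make by hand.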
Referring to @3sigma's research, most models follow a workflow built from six specific components:
Data collection
Model inference
Decision Making
Hosting and Running
Interoperability
Wallet
1. Data Collection
First, a model needs to understand the environment it operates in, so agents require multiple data streams to stay in sync with market conditions. These include: 1) on-chain data from indexers and oracles; 2) off-chain data from the APIs of price platforms such as CMC, Coingecko, and other data providers.
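A minimal sketch of this step, merging the two feed types into one view the model can consume. The feed contents are fabricated stand-ins: a real agent would query an indexer or oracle for the on-chain side and a CoinGecko-style REST API for the off-chain side.

```python
def merge_feeds(onchain: dict, offchain: dict, max_div: float = 0.02) -> dict:
    """Combine on-chain and off-chain prices, flagging divergent assets."""
    merged = {}
    for asset in onchain.keys() & offchain.keys():
        a, b = onchain[asset], offchain[asset]
        mid = (a + b) / 2
        merged[asset] = {
            "price": mid,
            # a large gap between sources may indicate a stale oracle
            "stale": abs(a - b) / mid > max_div,
        }
    return merged

view = merge_feeds(
    {"SOL": 200.0, "ETH": 3300.0},   # mock oracle feed
    {"SOL": 201.0, "ETH": 3100.0},   # mock API feed
)
```

Cross-checking the two sources before inference is a cheap defense against acting on a bad price.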
2. Model Inference
Once a model has learned its environment, it must apply that knowledge to make predictions or take actions based on new, unseen input from the user. Models used by agents include:
Supervised and unsupervised learning: models trained on labeled or unlabeled data to predict outcomes. In a blockchain context, these models can analyze governance forum data to predict voting outcomes or identify transaction patterns.
Reinforcement learning: A model that learns through trial and error by evaluating the rewards and punishments of its actions. Applications include optimizing token trading strategies, such as determining the best buy-in points for purchasing tokens or adjusting farming parameters.
Natural Language Processing (NLP): Technology for understanding and processing human language input, which is valuable for scanning governance forums and proposals for opinions.
▲ Source: https://www.researchgate.net/figure/The-main-types-of-machine-learning-Main-approaches-include-classification-and-regression_fig1_354960266
3. Decision-Making
With trained models and data, agents can take action using their decision-making capabilities. This includes interpreting the situation and responding appropriately.
At this stage, the optimization engine plays an important role in finding the best results. For example, an agent needs to balance multiple factors such as slippage, spreads, transaction costs, and potential profits before executing a yield strategy.
Since a single agent may not be able to optimize decisions in different areas, a multi-agent system can be deployed for coordination.
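The optimization step described above can be sketched as a simple scoring function over candidate routes. The route names and numbers are invented for illustration; a production engine would also model gas dynamics, depth, and execution risk.

```python
def score(route: dict) -> float:
    """Net expected profit of a route, as a fraction of notional:
    gross yield minus slippage minus transaction costs."""
    return route["gross_yield"] - route["slippage"] - route["tx_cost"]

def best_route(routes: list[dict]) -> dict:
    """Pick the candidate with the highest net expected profit."""
    return max(routes, key=score)

routes = [
    {"name": "A", "gross_yield": 0.10, "slippage": 0.010, "tx_cost": 0.002},
    {"name": "B", "gross_yield": 0.12, "slippage": 0.035, "tx_cost": 0.004},
]
choice = best_route(routes)
```

Note that the higher-yield route B loses here: its slippage and costs eat the advantage, which is exactly the trade-off the optimization engine exists to catch.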
4. Hosting and Running
Because their tasks are computationally intensive, AI agents typically host their models off-chain. Some rely on centralized cloud services such as AWS, while those that favor decentralization use distributed computing networks such as Akash or io.net, with Arweave for data storage.
Although the AI Agent model runs off-chain, the agent needs to interact with on-chain protocols to execute smart contract functions and manage assets. This interaction requires a secure key management solution, such as an MPC wallet or a smart contract wallet, to handle transactions securely. Agents can operate via APIs to communicate and interact with their communities on social platforms such as Twitter and Telegram.
5. Interoperability
Agents need to interact with various protocols while staying updated across different systems. They often use API bridges to obtain external data, such as price feeds.
To stay abreast of current protocol state and respond appropriately, agents need real-time synchronization through mechanisms such as webhooks, or decentralized protocols such as IPFS.
6. Wallet
Agents need a wallet or access to private keys to initiate blockchain transactions. Two key-management approaches are common on the market: MPC-based and TEE-based solutions.
For portfolio-management applications, MPC or TSS can split keys among the agent, the user, and a trusted party, while the user retains a degree of control over the AI. Coinbase's AI wallet experiment with Replit effectively implements this approach, demonstrating how an MPC wallet can be paired with an AI agent.
For fully autonomous AI systems, a TEE offers an alternative: private keys are stored in a secure enclave, allowing the entire AI agent to operate in a concealed, protected environment free from third-party interference. However, TEE solutions currently face two major challenges: hardware centralization and performance overhead.
After overcoming these difficulties, we will be able to create an autonomous agent on the blockchain, and different agents can perform their respective duties in the DeFi ecosystem to increase efficiency and improve the on-chain transaction experience.
In general, I will provisionally divide DeFi x AI into four major categories:
Abstract/user experience-friendly AI
Yield optimization or portfolio management
Market analysis or prediction bots
DeFai infrastructure/platform
3. Open the door to the DeFi x AI world - DeFai
▲ Source: IOSG Ventures
#1 Abstract/User-Experience Friendly AI
The purpose of artificial intelligence is to improve efficiency, solve complex problems, and simplify complex tasks for users. Abstraction-based AI can simplify access to the complexities of DeFi for both new and existing traders.
In the field of blockchain, effective AI solutions should be able to:
Automatically perform multi-step transactions and staking operations without users having any industry knowledge;
Perform real-time research to provide users with all the necessary information and data needed to make informed trading decisions;
Acquire data from various platforms to identify market opportunities and provide users with comprehensive analysis.
Most of these abstraction tools use ChatGPT at their core. While these models need to integrate seamlessly with blockchains, it seems to me that none of them have been specifically trained or adapted on blockchain data.
Griffain
Griffain founder Tony proposed the concept at a Solana hackathon. He later turned the idea into a functional product and earned the support and endorsement of Solana founder Anatoly.
Simply put, Griffain is currently the first and most capable abstract AI on Solana, able to perform swaps, wallet management, NFT minting, and token sniping.
The following are the specific functions Griffain provides:
Execute transactions in natural language
Use pumpfun to issue tokens, mint NFTs, and select addresses for airdrops
Multi-agent coordination
Agent can publish tweets on behalf of users
Snipe newly launched memecoins on pumpfun based on specific keywords or conditions
Staking, automation, and execution of DeFi strategies
Task scheduling; users can provide instructions to the agent to create customized agents
Pull platform data for market analysis, such as identifying a token's holder distribution
Although Griffain offers many functions, users still need to manually enter token addresses or give the agent specific execution instructions, so for beginners unfamiliar with these technical inputs the current product still has room for optimization.
So far, Griffain offers two types of AI agents: personal AI agents and specialized agents.
Personal AI agents are controlled by the user, who can customize instructions and memory settings to tailor the agent to their own circumstances.
Specialized agents are designed for specific tasks. For example, an Airdrop Agent is trained to look up addresses and allocate tokens to designated holders, while a Staking Agent is programmed to stake SOL or other assets into pools to implement mining strategies.
Griffain's multi-agent collaboration system is a distinctive feature. Multiple agents can work together in a chat room. These agents can solve complex tasks independently while maintaining collaboration.
▲ Source: https://griffain.com
After an account is created, the system generates a wallet. Users can delegate their accounts to agents, which can then execute transactions and manage portfolios independently.
Crucially, the key is split via Shamir's Secret Sharing (SSS), so that neither Griffain nor Privy custodies the wallet. According to Slate's documentation, SSS works by splitting the key into three shares:
Device share: stored in the browser and retrieved when a tab is opened
Auth share: stored on Privy's server and retrieved when the user authenticates and logs into the application
Recovery share: stored encrypted on Privy's server and decrypted only when the user enters their password to log in. Users can also export their keys from the Griffain front end.
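The splitting-and-reconstruction math behind the three-share scheme can be sketched as 2-of-3 Shamir Secret Sharing over a prime field. This only demonstrates the mathematics; the real Privy/Griffain share handling, storage, and encryption details are not public here.

```python
import secrets

P = 2**127 - 1  # a Mersenne prime large enough for a demo secret

def split(secret: int, n: int = 3) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any 2 reconstruct it (threshold 2)."""
    a1 = secrets.randbelow(P)              # random degree-1 coefficient
    f = lambda x: (secret + a1 * x) % P    # f(0) == secret
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 using any 2 shares."""
    (x1, y1), (x2, y2) = shares[:2]
    # f(0) = y1 * x2/(x2 - x1) + y2 * x1/(x1 - x2)  (mod P)
    t1 = y1 * x2 % P * pow(x2 - x1, -1, P) % P
    t2 = y2 * x1 % P * pow(x1 - x2, -1, P) % P
    return (t1 + t2) % P
```

The point of the construction is that any single share (device, auth, or recovery) reveals nothing about the key, while any two of the three suffice to recover it.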
Anon
Anon was founded by Daniele Sesta, known for creating the DeFi protocols Wonderland and MIM (Magic Internet Money). Like Griffain, Anon is designed to simplify user interaction with DeFi.
The team has previewed its future features, but since the product is not yet public, none have been verified. They include:
Use natural language (multiple languages including Chinese) to execute transactions
Cross-chain bridging through LayerZero
Lending and supplying via partner protocols such as Aave, Spark, Sky, and Wagmi
Get real-time price and data information through Pyth
Automated execution and triggers based on time and gas prices
Provide real-time market analysis, such as emotion detection, social profile analysis, etc.
In addition to its core functions, Anon supports various AI models, including Gemma, Llama 3.1, Llama 3.3, Vision, Pixtral, and Claude. These models can provide valuable market analysis, helping users save research time and make informed decisions, which is especially valuable in today's market, where new tokens with market caps of $100 million emerge every day.
Wallets can be exported and authorization revoked, but specific details about wallet management and security protocols have not yet been made public.
In addition, Daniele recently released two updates about Anon:
Automate framework:
A TypeScript framework that helps projects integrate with Anon quickly and efficiently. The framework requires all data and interactions to follow a predefined structure, reducing the risk of AI hallucination and making Anon more reliable.
Gemma:
A research agent that collects real-time data from on-chain DeFi metrics (such as TVL, trading volume, and perp DEX funding rates) and off-chain sources (such as Twitter and Telegram) for social-sentiment analysis. This data is turned into opportunity alerts and customized insights for users.
Judging from the documentation, this makes Anon one of the most anticipated and powerful abstraction tools in the entire field.
Slate
Backed by BigBrain Holdings, Slate positions itself as an "Alpha AI" that can trade autonomously based on on-chain signals. Slate is currently the only abstract AI capable of automating trades on Hyperliquid.
Slate prioritizes price routing, fast execution, and the ability to simulate before trading. The main functions include:
Cross-chain swaps between EVM chains and Solana
Automated trading based on price, market value, gas fees and profit and loss indicators
Natural-language task scheduling
On-chain transaction aggregation
Telegram notification system
Opening long and short positions and closing them under specified conditions, plus LP management and farming, including execution on Hyperliquid. A 0.35% fee applies to operations such as borrowing, lending, repaying, staking, unstaking, going long, going short, locking, and unlocking.
Conditional operations: for conditional orders (such as limit orders), Slate charges 0.25% if the trigger is gas-based; all other conditional orders incur 1.00%.
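The fee schedule quoted above (0.35% for basic operations, 0.25% for gas-based conditional orders, 1.00% for other conditional orders) can be expressed as a small calculator. How Slate internally categorizes a given order is an assumption here; only the rates come from the text.

```python
def slate_fee(notional: float, conditional: bool, gas_based: bool = False) -> float:
    """Return the fee on `notional` per the published rate card."""
    if not conditional:
        rate = 0.0035   # basic ops: borrow, lend, repay, stake, long, short...
    elif gas_based:
        rate = 0.0025   # conditional orders triggered by gas price
    else:
        rate = 0.0100   # all other conditional orders, e.g. limit orders
    return notional * rate
```

For example, a $10,000 limit order would cost $100 in fees, versus $35 for the same notional executed as a plain market operation.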
In terms of wallets, Slate integrates Privy’s embedded wallet architecture to ensure that neither Slate nor Privy will host users’ wallets. Users can either connect their existing wallets or authorize an agent to perform transactions on their behalf.
▲ Source: https://docs.slate.ceo
Comparative analysis of abstract AI
A comparison of the mainstream abstract AIs:
▲ Source: IOSG Venture
Currently, most AI abstraction tools support cross-chain transactions and asset bridging between Solana and EVM chains. Slate offers Hyperliquid integration, while Neur and Griffain currently support only Solana, though cross-chain support is expected soon.
Most platforms integrate Privy embedded wallets and EOA wallets, allowing users to manage funds independently, but require users to authorize agent access to perform certain transactions. This provides an opportunity for TEE (Trusted Execution Environment) to ensure the tamper resistance of AI systems.
Although most AI abstraction tools share functionality such as token issuance, trade execution, and natural-language conditional orders, their performance differs significantly.
At the product level, we are still in the early stages of abstract AI. Comparing the five projects above, Griffain stands out for its rich feature set, extensive collaboration network, and multi-agent workflow handling (Orbit is another project that supports multiple agents). Anon excels with fast responses, multi-language support, and Telegram integration, while Slate benefits from its sophisticated automation platform and is the only agent to support Hyperliquid.
However, some abstract AI platforms still struggle with basic transactions (such as a USDC swap), failing to fetch the correct token address or price, or to analyze the latest market trends. Response time, accuracy, and result relevance are also important differentiators of a model's underlying performance. In the future, we hope to work with teams to build a transparent dashboard that tracks the performance of all abstract AIs in real time.
#2 Autonomous Return Optimization and Portfolio Management
Unlike traditional yield strategies, protocols in this field use AI to analyze on-chain data for trends, helping teams develop superior yield-optimization and portfolio-allocation strategies. To reduce costs, models are often trained on a Bittensor subnet or off-chain. For AI to execute transactions autonomously, verification methods such as ZKPs (zero-knowledge proofs) are used to ensure the model's honesty and verifiability. Here are a few examples of DeFai protocols that benefit from optimization:
T3AI is a lending protocol that supports under-collateralized loans by using AI as an intermediary and risk engine. Its AI agent monitors loan health in real time and ensures loans remain repayable through T3AI's risk-indicator framework. The AI also delivers accurate risk predictions by analyzing relationships between assets and their price trends. Specifically, T3AI's AI can:
Analyze the price data of major CEX and DEX;
Measure the volatility of different assets;
Study correlations and connectedness across asset prices;
Discover hidden patterns in asset interactions.
AI will recommend optimal allocation strategies based on the user’s portfolio, and potentially enable autonomous AI portfolio management after model adjustment. In addition, T3AI ensures the verifiability and reliability of all operations through ZK proofs and a network of validators.
▲ Source: https://www.trustinweb3.xyz/
Kudai
Kudai is an experimental GMX ecosystem agent developed by the GMX Blueberry Club using the EmpyrealSDK toolkit; its token currently trades on the Base network.
Kudai's premise is to use all transaction fees generated by $KUDAI to fund the agent's autonomous trading operations and distribute the profits to token holders.
In the upcoming Phase 2 of 4, Kudai will be able to interpret natural-language commands on Twitter to:
Buy and stake $GMX to generate new revenue streams;
Invest in GMX GM pools to further increase returns;
Buy GBC NFTs at floor prices to expand the portfolio.
After this stage, Kudai will be fully autonomous and can independently execute leveraged transactions, arbitrage and earn returns on assets (interest). The team has not disclosed any further information.
Sturdy Finance V2
Sturdy Finance is a lending and yield aggregator that uses AI models trained by Bittensor SN10 subnet miners to optimize yields by moving funds between whitelisted silo pools.
Sturdy adopts a two-layer architecture consisting of isolated asset pools (silo pools) and an aggregator layer:
Silo pools are single-asset isolated pools where users can borrow only a single asset against a single type of collateral.
The aggregator layer is built on Yearn V3 and allocates user assets across whitelisted, vetted silo pools based on utilization and yield. The Bittensor subnet supplies the aggregator with the optimal allocation strategy. When users deposit assets into an aggregator, they are exposed only to their chosen collateral type, completely avoiding risk from other lending pools or collateral assets.
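The aggregator's allocation decision can be sketched as a greedy rule: fill the highest-yield whitelisted silos first, capped by a target utilization. The silo names, rates, and the greedy rule itself are illustrative assumptions; Sturdy's actual strategy comes from Bittensor subnet miners.

```python
def allocate(total: float, silos: list[dict], max_util: float = 0.9) -> dict:
    """Greedily assign deposits to highest-APY silos up to a utilization cap."""
    plan = {}
    remaining = total
    for silo in sorted(silos, key=lambda s: s["apy"], reverse=True):
        # capacity left before the silo would exceed max utilization
        cap = max(0.0, silo["liquidity"] * max_util - silo["borrowed"])
        take = min(remaining, cap)
        if take > 0:
            plan[silo["name"]] = take
            remaining -= take
    return plan

plan = allocate(1_000_000, [
    {"name": "pxETH/crvUSD", "apy": 0.14, "liquidity": 800_000, "borrowed": 500_000},
    {"name": "wstETH/crvUSD", "apy": 0.09, "liquidity": 2_000_000, "borrowed": 200_000},
])
```

The utilization cap matters: piling everything into the best-yield silo would push its rate down and trap liquidity, so the allocator spills the remainder into the next-best pool.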
▲ Source: https://sturdy.finance
As of this writing, Sturdy V2's TVL has been declining since May 2024; the aggregator's TVL currently stands at approximately $3.9 million, about 29% of the protocol's total TVL.
Since September 2024, Sturdy has consistently recorded over 100 daily active users, with pxETH and crvUSD the main lending assets on the aggregator. However, the protocol's performance has noticeably stagnated over the past few months; the AI integration appears intended to reignite its growth momentum.
▲ Source: https://dune.com/tk-research/sturdy-v2
#3 Market Analysis Agents
#Aixbt
Aixbt is a market-sentiment tracking agent that aggregates and analyzes data from more than 400 Twitter KOLs. With its proprietary engine, Aixbt identifies trends in real time and publishes market observations around the clock.
Among existing AI agents, Aixbt commands a significant 14.76% share of market attention, making it one of the most influential agents in the ecosystem.
▲ Source: Kaito.com
Aixbt is designed for social media interaction, and its published insights directly reflect where market attention is focused.
Its functionality goes beyond market insights (alpha) to interactivity: Aixbt can respond to user questions and even launch tokens via Twitter using a professional toolkit. For example, the $CHAOS token was created in collaboration with Simi, another interactive bot, using the @EmpyrealSDK toolkit.
As of now, users holding 600,000 $AIXBT tokens (worth approximately $240,000) have access to its analytics platform and terminal.
#4 Decentralized AI infrastructure and platform
Web3 AI agents cannot exist without the support of decentralized infrastructure. These projects not only support model training and inference but also provide the data, verification methods, and coordination layers that drive AI agent development.
Whether it is Web2 or Web3 AI, models, computing power, and data are always the three cornerstones that promote the excellent development of large language models (LLM) and AI agents. Open source models trained in a decentralized manner will be favored by agent builders because this approach completely eliminates the single point of risk caused by centralization and opens up the possibility of user-owned AI. Developers don’t need to rely on the LLM APIs of Web2 AI giants like Google, Meta, and OpenAI.
The following is an AI infrastructure diagram drawn by Pinkbrains:
▲ Source: Pink Brains
Model Creation
Pioneers such as Nous Research, Prime Intellect, and Exo Labs are pushing the boundaries of decentralized training.
Nous Research's DisTrO training algorithm and Prime Intellect's DiLoCo algorithm have successfully trained models with over a billion parameters in low-bandwidth environments, showing that large-scale training is achievable outside traditional centralized systems. Exo Labs has gone further with SPARTA, a distributed AI training algorithm that reduces inter-GPU communication by more than 1,000x.
Bagel aims to become a decentralized HuggingFace, providing models and data for AI developers while solving the ownership and monetization problems of open-source data through cryptography. Bittensor, meanwhile, runs a competitive marketplace where participants contribute compute, data, and intelligence to accelerate the development of AI models and agents.
Data and computing power service providers
Many believe Aixbt stands out in the utility-agent category largely because of its access to high-quality datasets.
Providers such as Grass, Vana, Sahara, Space and Time, and Cookie DAO supply high-quality, domain-specific data or give AI developers access to "walled garden" data. Leveraging more than 2.5 million nodes, Grass can crawl up to 300 terabytes of data per day.
Nvidia currently trains its video models on only about 20 million hours of video data, while Grass's video dataset is 15 times larger (300 million hours) and grows by 4 million hours per day; that is, Grass collects the equivalent of 20% of Nvidia's entire dataset daily. In other words, Grass needs only 5 days to gather as much video data as Nvidia's total dataset.
Computing marketplaces such as Aethir and io.net provide cost-effective options for agent developers by aggregating various GPUs. Hyperbolic's decentralized GPU marketplace cuts computing costs by up to 75% while hosting open-source AI models and delivering inference at latencies comparable to Web2 cloud providers.
Hyperbolic further enhances its GPU marketplace and cloud services with the launch of AgentKit. AgentKit is a powerful interface that allows AI agents full access to Hyperbolic’s decentralized GPU network. It features an AI-readable map of computing resources that scans and provides detailed information on resource availability, specifications, current load, and performance in real time.
AgentKit opens up a revolutionary future where agents can independently obtain the required computing power and pay related fees.
Verification mechanism
Through the innovative Proof of Sample verification mechanism, Hyperbolic ensures that every reasoning interaction in the ecosystem is verified, establishing a foundation of trust for the future agent world.
However, verification only solves part of the trust problem for autonomous agents. Another dimension of trust involves privacy protection, which is where TEE (Trusted Execution Environment) infrastructure projects like Phala, Automata and Marlin excel. For example, proprietary data or models used by these AI agents can be securely protected.
In fact, a truly autonomous agent cannot operate fully without a TEE, since TEEs are critical for protecting sensitive information: safeguarding wallet private keys, preventing unauthorized access, securing Twitter account logins, and so on.
How TEE works
TEE (Trusted Execution Environment) isolates sensitive data during processing within a protected CPU/GPU enclave (secure enclave). Only authorized program code can access the contents of the enclave; cloud service providers, developers, administrators, and other parts of the hardware cannot access this area.
The main use of TEE is to execute smart contracts, especially in DeFi protocols involving more sensitive financial data. Therefore, TEE's integration with DeFai includes traditional DeFi application scenarios, such as:
Transaction privacy: TEE can hide transaction details, such as sender and receiver addresses and transaction amounts. Platforms like Secret Network and Oasis use TEE to protect transaction privacy in DeFai applications, enabling private payments.
Anti-MEV: by executing smart contracts in a TEE, block builders cannot access transaction information, preventing the front-running attacks that generate MEV. Flashbots leveraged TEEs to develop BuilderNet, a decentralized block-building network that reduces the censorship risks of centralized block building. Chains such as Unichain and Taiko also use TEEs to provide users with a better trading experience.
These features also work with alternative solutions such as ZKP or MPC. However, TEE is currently the most efficient among the three solutions for executing smart contracts simply because the model is hardware-based.
In terms of agents, TEE provides various capabilities for agents:
Automation: TEE can create an independent operating environment for agents to ensure that the execution of its policies is free from human interference. This ensures that investment decisions are entirely based on the independent logic of the agent.
TEE also allows agents to control social media accounts to ensure that any public statements they make are independent and free from outside influence, thereby avoiding suspicion of advertising and other propaganda. Phala is working with the AI16Z team to enable Eliza to run efficiently in a TEE environment.
Verifiability: People can verify that the agent is performing calculations using the promised model and producing valid results. Automata and Brevis are working together to develop this capability.
AI Agent Swarms
As more professional agents with specific use cases (DeFi, gaming, investing, music, etc.) enter the field, better agent collaboration and seamless communication become critical.
Infrastructure for agent swarm frameworks has emerged to address the limitations of single agents. Swarm intelligence allows agents to work together as a team, pooling their capabilities to achieve a common goal. The coordination layer abstracts complexity and makes it easier for agents to collaborate under common goals and incentives.
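The coordination layer described above can be sketched as a tiny registry plus dispatcher: a meta-agent routes each task to the registered specialist with the best reputation for that skill. The agent names, skills, and scores are invented; real swarm frameworks add payments, planning, and accountability on top.

```python
class Swarm:
    def __init__(self):
        self.agents = []  # list of (name, skills, reputation) tuples

    def register(self, name: str, skills: set, reputation: float):
        """Add a specialist agent to the swarm registry."""
        self.agents.append((name, skills, reputation))

    def dispatch(self, task_skill: str):
        """Route a task to the highest-reputation agent with the skill."""
        candidates = [(rep, name) for name, skills, rep in self.agents
                      if task_skill in skills]
        return max(candidates)[1] if candidates else None

swarm = Swarm()
swarm.register("yield-bot", {"farming", "lending"}, 0.8)
swarm.register("trader", {"swap", "perps"}, 0.9)
swarm.register("analyst", {"sentiment", "perps"}, 0.7)
```

Tracking reputation in the registry is what lets the meta-agent prefer proven specialists, the same quality-and-accountability mechanism swarm platforms describe.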
Several Web3 companies, including Theoriq, FXN and Questflow, are moving in this direction. Of all these players, Theoriq, which originally launched in 2022 as ChainML, has been working toward this goal the longest, with the vision of becoming a universal base layer for agent artificial intelligence.
To realize this vision, Theoriq handles agent registration, payment, security, routing, planning and management in low-level modules. It also connects supply and demand, offering an intuitive agent-building platform called Infinity Studio that allows anyone to deploy their own agents, as well as Infinity Hub, a marketplace where customers can browse all available agents. In its swarm system, meta-agents select the most appropriate agent for a given task, creating “swarms” to achieve common goals while tracking reputation and contributions to maintain quality and accountability.
Theoriq's token provides economic security: agent operators and community members stake it to signal the quality and trustworthiness of their agents, incentivizing good service and discouraging malicious behavior. The token also serves as a medium of exchange, used to pay for services and data access, and rewards participants for contributing data, models, and more.
▲ Source: Theoriq
As the discussion around AI agents matures into a long-term industry sector led by agents with clear utility, we may see a resurgence of Crypto x AI infrastructure projects with strong price performance. These projects can leverage their venture funding, years of R&D, and domain-specific technical expertise to expand across the value chain, developing advanced, practical AI agents capable of outperforming 95% of the agents currently on the market.
4. The evolution and future of DeFai
I have always believed market development proceeds in three stages: first efficiency, then decentralization, and finally privacy. DeFai will be divided into four stages.
The first phase of DeFai will focus on efficiency, improving the user experience with tools that complete complex DeFi tasks without requiring deep protocol knowledge. Examples include:
Artificial intelligence that understands user prompts even if the format is imperfect
Quick execution of swaps in the shortest block time
Real-time market research that helps users make decisions aligned with their own goals
If realized, these innovations can save users time and effort while lowering the barrier to on-chain trading, potentially creating a "Phantom moment" in the next few months.
In the second phase, agents will trade autonomously with minimal human intervention. Trading agents can execute strategies based on third-party opinions or data from other agents, creating a new DeFi model. Professional or sophisticated DeFi users can fine-tune models to build agents that generate optimal returns for themselves or their clients, reducing manual monitoring.
In the third phase, users will start to focus on wallet management issues and AI verification, as users will demand transparency. Solutions such as TEE and ZKP will ensure that AI systems are tamper-proof, immune to third-party interference and verifiable.
Finally, once these stages are complete, a no-code DeFi AI engineering toolkit or AI-as-a-service protocol can create an agent-based economy that uses models trained on cryptocurrencies to conduct transactions.
While this vision is ambitious and exciting, there are still several bottlenecks that have yet to be resolved:
Most current tools are merely ChatGPT wrappers, and there are no clear benchmarks for identifying high-quality projects
On-chain data fragmentation pushes AI models toward centralization rather than decentralization, and it is unclear how on-chain agents will solve this problem