Coin World News Report:
At first glance, AI and Web3 appear to be independent technologies, each built on fundamentally different principles and serving different functions. A closer examination, however, reveals that the two can balance each other's trade-offs, and that their distinct strengths can complement and enhance one another. Balaji Srinivasan articulated this idea of complementary capabilities at the SuperAI conference, prompting a detailed comparison of how these technologies interact.
Tokens emerged from the decentralized efforts of anonymous cypherpunks and have evolved over the past decade through the collaborative work of countless independent entities. Artificial intelligence, by contrast, has developed in a top-down fashion dominated by a handful of tech giants. These companies set the pace and direction of the industry, and the barrier to entry is determined more by resource intensity than by technological complexity.
The two technologies are also fundamentally different in nature. Token systems are essentially deterministic, producing reproducible, verifiable results, as with hash functions or zero-knowledge proofs. This stands in stark contrast to the probabilistic and often unpredictable nature of artificial intelligence.
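To make this contrast concrete, the following minimal Python snippet (illustrative only) shows the determinism the text refers to: hashing the same input twice always yields the identical digest, whereas sampling from a generative model at a non-zero temperature generally does not.

```python
import hashlib

# Deterministic: the same input always produces exactly the same digest,
# which is what makes hash-based verification possible.
msg = b"AI x Web3"
digest_1 = hashlib.sha256(msg).hexdigest()
digest_2 = hashlib.sha256(msg).hexdigest()
assert digest_1 == digest_2  # holds every time, on every machine

# A sampled LLM completion for the same prompt, by contrast, is a draw from
# a probability distribution and can differ from run to run.
print(digest_1)
```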
Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and establishing trustless processes and systems, whereas artificial intelligence excels at generation, creating rich digital content. In the process of creating this digital abundance, however, ensuring the provenance of content and preventing identity theft becomes a challenge.
Fortunately, tokens offer the contrasting concept of digital scarcity. They provide relatively mature tools that can be extended to AI technologies to ensure the reliability of content sources and guard against identity theft.
One notable advantage of tokens is their ability to attract large amounts of hardware and capital into coordinated networks serving a specific goal. This is particularly beneficial for resource-intensive AI: mobilizing underutilized resources to provide cheaper computing power can significantly improve the efficiency of artificial intelligence.
By comparing these two technologies, we can not only appreciate their respective contributions but also see how, together, they open new paths for technology and the economy. Each can compensate for the other's shortcomings, creating a more integrated and innovative future. In this post, we explore the emerging AI x Web3 industry landscape, focusing on several emerging verticals at the intersection of these technologies.
Source: IOSG Ventures
2.1 Computing Networks
The industry landscape begins with computing networks, which aim to address the constrained GPU supply and to reduce computing costs in different ways. The following are worth highlighting:
Non-uniform GPU interoperability: This is a highly ambitious attempt, carrying high technical risk and uncertainty, but if successful it could have outsized impact by making all computing resources interchangeable. The idea, in essence, is to build compilers and other prerequisites so that any hardware resource can be plugged in on the supply side, while the non-uniformity of that hardware is fully abstracted away on the demand side, allowing a computing request to be routed to any resource in the network. If this vision succeeds, it would reduce the current dependence on CUDA software, which completely dominates the AI developer ecosystem today. Given the high technical risk, however, many experts are deeply skeptical of this approach's feasibility.
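As a rough sketch of what "abstracting away non-uniform hardware" could look like (hypothetical names such as `Backend` and `route`, not any project's actual API): supply-side resources are normalized into a common description, and a demand-side request is routed to any resource that satisfies it.

```python
from dataclasses import dataclass

@dataclass
class Backend:
    """A supply-side hardware resource, normalized to a common description."""
    name: str
    vram_gb: int
    free: bool = True

# Non-uniform hardware (NVIDIA, AMD, Apple Silicon, ...) sits in one pool;
# the demand side never sees vendor-specific details such as CUDA.
pool = [Backend("rtx4090", 24), Backend("mi300x", 192), Backend("m2-ultra", 64)]

def route(required_vram_gb: int) -> Backend:
    """Route a compute request to any free backend that satisfies it."""
    for backend in pool:
        if backend.free and backend.vram_gb >= required_vram_gb:
            backend.free = False
            return backend
    raise RuntimeError("no capacity in the network")

job = route(required_vram_gb=40)  # lands on whichever resource fits
print(f"request routed to {job.name}")
```

The hard part, which this sketch omits entirely, is the compiler and runtime layer that makes a job actually execute correctly on whichever vendor's hardware it lands on.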
High-performance GPU aggregation: Integrating the world's most sought-after GPUs into a distributed, permissionless network, without having to worry about interoperability issues between non-uniform GPU resources.
Consumer-grade GPU aggregation: Aggregating lower-performance GPUs available in consumer devices, which are among the most underutilized resources on the supply side. This caters to users willing to sacrifice performance and speed for cheaper, longer training runs.
2.2 Training and Inference
Computing networks serve two main functions: training and inference. Demand for these networks comes from both Web 2.0 and Web 3.0 projects. In the Web 3.0 space, projects like Bittensor use computing resources for model fine-tuning. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This focus has given rise to verifiable inference as a market vertical, with projects exploring how to integrate AI inference into smart contracts while preserving decentralization principles.
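One simplified way to picture verifiable inference (a sketch of a generic commit-and-verify pattern, not any specific project's protocol) is to hash-bind the model weights, the input, and the claimed output into a single commitment that a contract or third party can check:

```python
import hashlib
import json

def commitment(model_weights_digest: str, prompt: str, output: str) -> str:
    """Hash-bind a claimed inference to a specific model and input."""
    payload = json.dumps(
        {"model": model_weights_digest, "prompt": prompt, "output": output},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(payload).hexdigest()

# Off-chain: the node runs inference and publishes the commitment.
model_digest = hashlib.sha256(b"<serialized weights>").hexdigest()
published = commitment(model_digest, "2+2?", "4")

# Verifier side: anyone holding the same three values can recompute the
# commitment and check that it matches what was published.
assert published == commitment(model_digest, "2+2?", "4")
```

Production designs replace naive re-computation with zero-knowledge proofs or optimistic re-execution; the hash binding above only fixes what is being attested to.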
2.3 Intelligent Agent Platforms
Next are intelligent agent platforms. The landscape outlines the core issues that startups in this category need to address:
Interoperability, discovery, and communication: enabling agents to discover and communicate with one another.
Cluster building and management: enabling agents to form clusters and manage other agents.
Ownership and marketplaces: providing ownership of, and marketplaces for, AI agents.
These features highlight the importance of flexible, modular systems that can integrate seamlessly into a variety of blockchain and AI applications. AI agents have the potential to fundamentally change how we interact with the internet, and we believe agents will rely on infrastructure to support their operations. We envision AI agents depending on infrastructure in the following areas (a toy sketch of the deposit-and-discovery mechanics follows the list):
Utilizing distributed crawling networks to access real-time internet data.
Using DeFi channels for inter-agent payments.
Requiring economic deposits that are not only slashable in the event of misconduct but also increase agent discoverability (i.e., deposits act as an economic signal during the discovery process).
Utilizing consensus to determine which events should result in slashing.
Open interoperability standards and agent frameworks to support building composable collectives.
Evaluating past performance based on immutable data history and dynamically selecting appropriate agent collectives.
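As a toy sketch of the deposit-and-discovery mechanics from the list above (all names and numbers are illustrative, assuming no particular framework): agents stake a deposit, discovery ranks them by stake and reputation, and consensus-determined misconduct slashes the stake.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    agent_id: str
    stake: float       # economic deposit, doubling as a discovery signal
    reputation: float  # derived from an immutable performance history

registry: dict[str, Agent] = {}

def register(agent_id: str, deposit: float) -> None:
    registry[agent_id] = Agent(agent_id, stake=deposit, reputation=1.0)

def discover(top_k: int = 3) -> list[Agent]:
    """Rank agents by stake * reputation: larger deposits are easier to find."""
    ranked = sorted(registry.values(),
                    key=lambda a: a.stake * a.reputation, reverse=True)
    return ranked[:top_k]

def slash(agent_id: str, fraction: float) -> None:
    """Applied when consensus determines that misconduct occurred."""
    registry[agent_id].stake *= (1.0 - fraction)

register("translator-7", deposit=100.0)
register("crawler-2", deposit=250.0)
slash("crawler-2", fraction=0.8)           # misconduct shrinks the deposit to 50
print([a.agent_id for a in discover()])    # translator-7 now ranks first
```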
Source: IOSG Ventures
2.4 Data Layer
In the fusion of AI x Web3, data is a core component. Data is a strategic asset in the AI race, ranking alongside computing resources as a critical input. Yet this category is often overlooked, as most of the industry's attention stays fixed on the compute layer. In fact, crypto primitives offer many interesting value propositions in the data-acquisition process, chiefly along the following two high-level directions:
Access to public internet data: This direction aims to build distributed crawler networks that can crawl the entire internet within days to produce massive datasets, or access very specific internet data in real time. Crawling large datasets from the internet places heavy demands on the network, however: at least several hundred nodes are needed before meaningful work can begin. Fortunately Grass, a distributed crawler-node network, already has over 2 million nodes actively sharing internet bandwidth to crawl the web, demonstrating the power of economic incentives in attracting valuable resources.
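A core mechanic in such networks is partitioning the crawl frontier across many independent nodes. A minimal sketch (hypothetical, not Grass's actual design) assigns each URL to a node deterministically by hashing, so nodes can work in parallel without central coordination:

```python
import hashlib

NODES = [f"node-{i}" for i in range(5)]  # stand-in for millions of real nodes

def assign(url: str) -> str:
    """Deterministically map a URL to one node via its hash."""
    h = int(hashlib.sha256(url.encode()).hexdigest(), 16)
    return NODES[h % len(NODES)]

frontier = ["https://example.com/a", "https://example.com/b",
            "https://example.org/"]
for url in frontier:
    print(url, "->", assign(url))  # every node computes the same mapping
```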
Although networks like Grass level the playing field for public data, a challenge remains in unlocking the rest of the data's potential: access to proprietary datasets. A significant amount of data is still stored in privacy-protected form because of its sensitive nature. Many startups are leveraging cryptographic tools that allow AI developers to build and fine-tune large language models on the underlying data of proprietary datasets while keeping sensitive information private.
Technologies such as federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation offer different levels of privacy protection with different trade-offs. Bagel's research article (https://blog.bagel.net/p/with-great-data-comes-great-responsibility-d67) provides an excellent overview of these technologies. They not only protect data privacy during machine learning but also enable comprehensively privacy-preserving AI solutions at the compute level.
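As a concrete instance of these trade-offs, differential privacy releases aggregate statistics with calibrated noise so that no single record can be inferred. A minimal sketch of the Laplace mechanism (the epsilon value is illustrative; smaller epsilon means stronger privacy and more noise):

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent Exp(1) draws is Laplace-distributed.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count with differential privacy; a count has sensitivity 1,
    so the noise scale is 1/epsilon."""
    return true_count + laplace_noise(1.0 / epsilon)

# e.g., how many records in a proprietary dataset match some query
print(private_count(true_count=1234, epsilon=0.5))
```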
2.5 Data and Model Provenance
Data and model provenance technologies aim to establish processes that assure users they are interacting with the intended models and data, and to provide guarantees of authenticity and provenance. Watermarking, for example, is a model-provenance technique that embeds a signature directly into the machine learning algorithm, more specifically into the model weights, so that at retrieval time an inference can be verified as having come from the intended model.
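As a toy illustration of that idea (real watermarking schemes are far more robust, e.g. surviving fine-tuning; this sketch assumes NumPy and invented helper names), a secret bit-string can be embedded into the signs of a hidden subset of weights and checked later to attest provenance:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=1000)        # stand-in for real model weights

signature = [1, 0, 1, 1, 0, 0, 1, 0]   # the owner's secret bit-string
positions = rng.choice(len(weights), size=len(signature), replace=False)

def embed(w: np.ndarray) -> np.ndarray:
    """Force the sign of the secret positions to encode the signature bits."""
    w = w.copy()
    for bit, pos in zip(signature, positions):
        w[pos] = abs(w[pos]) if bit == 1 else -abs(w[pos])
    return w

def verify(w: np.ndarray) -> bool:
    """Read bits back from the signs and compare against the signature."""
    extracted = [1 if w[pos] > 0 else 0 for pos in positions]
    return extracted == signature

marked = embed(weights)
print(verify(marked))   # True: provenance check passes
print(verify(weights))  # almost certainly False for unmarked weights
```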
2.6 Applications
In terms of applications, the possibilities are endless. In the industry landscape above, we list some of the development cases we most anticipate as AI technologies are applied in the Web 3.0 field. Since these use cases are largely self-explanatory, we will not comment on them further here. It is worth noting, however, that the intersection of AI and Web 3.0 has the potential to reshape many verticals, as these new primitives give developers more freedom to create innovative use cases and to optimize existing ones.
Summary
The fusion of AI x Web3 holds promising prospects full of innovation and potential. By leveraging the unique advantages of each technology, we can address a range of challenges and open new technological paths. As we explore this emerging industry, the synergy between AI and Web3 can drive progress and reshape our future digital experiences and interactions on the internet.
The fusion of digital scarcity and digital abundance, the mobilization of underutilized resources for computational efficiency, and the establishment of secure, privacy-preserving data practices will define the next era of technological evolution.
However, we must recognize that this industry is still in its early stages, and the current industry landscape may become outdated in a short period of time. The rapid pace of innovation means that today’s cutting-edge solutions may soon be replaced by new breakthroughs. Nevertheless, the foundational concepts discussed, such as computing networks, agent platforms, and data protocols, highlight the immense possibilities of the fusion of artificial intelligence and Web 3.0.