Several data infrastructure and intelligence use cases are taking a decentralized approach to providing AI functionality.
The rise of ChatGPT has been nothing short of spectacular. Within two months of launch, the artificial intelligence (AI)-based application reached 100 million unique users. In January 2023 alone, ChatGPT registered about 590 million visits.
In addition to AI, blockchain is another disruptive technology with increasing adoption. Decentralized protocols, applications and business models have matured and gained market traction since the Bitcoin (BTC) white paper was published in 2008. Much needs to be done to advance both of these technologies, but the zones of convergence between the two will be exciting to watch.
While the hype is around AI, a lot goes on behind the scenes to create a robust data infrastructure that enables meaningful AI. Low-quality data stored and shared inefficiently leads to poor insights from the intelligence layer. As a result, it is critical to look at the data value chain holistically and determine how blockchain can help deliver high-quality data and AI applications.
The key question is how Web3 technologies can tap into artificial intelligence in areas like data storage, data transfers and data intelligence. Each of these data capabilities may benefit from decentralized technologies, and firms are focusing on delivering them.
Data storage
It helps to understand why decentralized data storage is an essential building block for the future of decentralized AI. As blockchain projects scale, every vector of centralization could come to haunt them. A centralized blockchain project could suffer governance breakdown, regulatory clampdown or infrastructure issues.
For instance, the Ethereum network “Merge,” which moved the chain from proof-of-work to proof-of-stake in September 2022, could have added a vector of centralization to the chain. Some have argued that major platforms and exchanges like Lido and Coinbase, which have a large share of the Ethereum staking market, have made the network more centralized.
Another vector of centralization for Ethereum is the reliance of many of its nodes on Amazon Web Services (AWS) cloud hosting. Therefore, storage and processing power for blockchain projects must be decentralized over time to mitigate the risks of a single centralized point of failure. This presents an opportunity for decentralized storage solutions to contribute to the ecosystem, bringing scalability and stability.
But how does decentralized storage work?
The principle is to use multiple servers and computers around the world to store a document. Simply put, a document can be split, encrypted and stored across different servers. Only the document owner holds the private key needed to retrieve the data. On retrieval, the algorithm pulls the individual pieces back together and presents the document to the user.
From a security perspective, the private key is the first layer of protection, and the distributed storage is the second. If one node or server on the network is hacked, the attacker can access only a fragment of the encrypted file.
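A minimal Python sketch of this split-encrypt-distribute idea follows, using the widely available cryptography library. The chunk size, node list and placement logic are hypothetical placeholders for illustration, not the client of any specific storage protocol.

```python
# Minimal sketch of split-encrypt-distribute storage (hypothetical, not a real protocol client).
# Requires the "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

CHUNK_SIZE = 256 * 1024                  # arbitrary 256 KB chunks for illustration
NODES = ["node-a", "node-b", "node-c"]   # placeholder storage nodes

def store_document(data: bytes) -> tuple[bytes, list[tuple[str, bytes]]]:
    """Encrypt a document, split the ciphertext and spread the chunks across nodes."""
    key = Fernet.generate_key()            # only the document owner keeps this key
    ciphertext = Fernet(key).encrypt(data)
    chunks = [ciphertext[i:i + CHUNK_SIZE] for i in range(0, len(ciphertext), CHUNK_SIZE)]
    placements = [(NODES[i % len(NODES)], chunk) for i, chunk in enumerate(chunks)]
    return key, placements                 # in a real network, placements would be uploaded

def retrieve_document(key: bytes, placements: list[tuple[str, bytes]]) -> bytes:
    """Pull the chunks back in order and decrypt with the owner's key."""
    ciphertext = b"".join(chunk for _, chunk in placements)
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    original = b"hello decentralized storage" * 1000
    key, placements = store_document(original)
    assert retrieve_document(key, placements) == original
```

A node holding one chunk sees only part of the ciphertext and never the key, which is the point of the two protection layers described above.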
Major projects within the decentralized storage space include Filecoin, Arweave, Crust, Sia and StorJ.
Decentralized storage is still in a nascent state, however. Facebook generates 4 petabytes (4,096 terabytes) of data daily, yet Arweave has handled only about 122TB of data in total. Storing 1TB on AWS costs about $10 per month, while storing the same 1TB on Arweave costs about $1,350 at the time of publication, paid once for permanent storage.
Undoubtedly, decentralized storage has a long way to go, but high-quality data storage can boost AI for real-world use cases.
Data transfer
Data transfer is the next key use case on the data stack that can benefit from decentralization. Data transfers using centralized application programming interfaces (APIs) can still enable AI applications. However, adding a vector of centralization at any point in the data stack would make it less effective.
Once storage is decentralized, the next item on the data value chain is the transfer and sharing of data, primarily through oracles.
Oracles are entities that connect blockchains to external data sources so that smart contracts can plug into real-world data and make transaction decisions.
However, oracles are one of the most vulnerable parts of the data architecture, with hackers targeting them extensively and successfully over the years. In one recent example, the Bonq protocol suffered a $120 million loss due to an oracle hack.
Alongside smart contract and cross-chain bridge exploits, oracle vulnerabilities have been low-hanging fruit for cybercriminals. This is mainly due to a lack of decentralized data transfer infrastructure and protocols.
Decentralized oracle networks (DONs) are a potential solution for secure data transfer. DONs have multiple nodes that provide high-quality data and establish end-to-end decentralization.
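As a rough illustration of why multiple nodes help, a DON-style aggregator can combine independent node reports into a single value using a median, so one compromised or faulty node cannot skew the published result on its own. The sketch below is a simplification for illustration, not any particular network's aggregation logic.

```python
# Simplified sketch of DON-style aggregation: combine independent node reports
# into one value using a median, so a single bad report cannot move the result far.
from statistics import median

def aggregate_reports(reports: dict[str, float], min_responses: int = 3) -> float:
    """Return the median of node-submitted prices, requiring a quorum of responses."""
    if len(reports) < min_responses:
        raise ValueError("not enough oracle responses to form a quorum")
    return median(reports.values())

# Example: one manipulated node ("node-d") barely affects the aggregate.
reports = {"node-a": 1845.2, "node-b": 1846.0, "node-c": 1844.8, "node-d": 50.0}
print(aggregate_reports(reports))  # 1845.0, the midpoint of the two middle values
```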
Oracles have been used extensively within the blockchain industry, with different types of oracles contributing to the data transfer mechanism.
There are input, output, cross-chain and compute-enabled oracles. Each of them has a purpose in the data landscape.
Input oracles carry and validate data from off-chain sources to a blockchain for use by a smart contract. Output oracles allow smart contracts to send data to off-chain systems and trigger external actions. Cross-chain oracles carry data between two blockchains, which could become fundamental as blockchain interoperability improves, while compute-enabled oracles use off-chain computation to offer decentralized services.
While Chainlink has been a pioneer in developing oracle technologies for blockchain data transfer, protocols like Nest and Band also provide decentralized oracles. Apart from pure blockchain-based protocols, platforms like Chain API and CryptoAPI provide APIs for DONs to consume off-chain data securely.
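For instance, an off-chain consumer (or a smart contract through the same interface) can read a price published by an input-oracle feed. The sketch below uses Python's web3 library against Chainlink's public AggregatorV3Interface; the RPC endpoint is a placeholder, and the feed address shown is the commonly cited ETH/USD mainnet proxy, which should be verified independently before use.

```python
# Sketch: reading a price from a Chainlink-style aggregator feed with web3.py.
# The RPC endpoint is a placeholder; the feed address is the commonly cited
# ETH/USD mainnet proxy and should be verified before use.
from web3 import Web3

RPC_URL = "https://mainnet.infura.io/v3/<YOUR_PROJECT_ID>"   # placeholder endpoint
FEED_ADDRESS = "0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419"  # ETH/USD proxy (verify)

# Minimal ABI covering only the two functions used here.
AGGREGATOR_ABI = [
    {"name": "latestRoundData", "inputs": [], "stateMutability": "view", "type": "function",
     "outputs": [{"name": "roundId", "type": "uint80"},
                 {"name": "answer", "type": "int256"},
                 {"name": "startedAt", "type": "uint256"},
                 {"name": "updatedAt", "type": "uint256"},
                 {"name": "answeredInRound", "type": "uint80"}]},
    {"name": "decimals", "inputs": [], "stateMutability": "view", "type": "function",
     "outputs": [{"name": "", "type": "uint8"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
feed = w3.eth.contract(address=FEED_ADDRESS, abi=AGGREGATOR_ABI)

_, answer, _, updated_at, _ = feed.functions.latestRoundData().call()
decimals = feed.functions.decimals().call()
print(f"ETH/USD: {answer / 10 ** decimals} (last updated at {updated_at})")
```

On-chain, a smart contract would call the same latestRoundData function on the feed contract rather than going through an RPC client.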
Data intelligence
The data intelligence layer is where all the infrastructure efforts of storing, sharing and processing data come to fruition. A blockchain-based application using AI can still source data from traditional APIs. However, that would add a degree of centralization and could affect the robustness of the final solution.
Nonetheless, several applications in crypto and blockchain are already tapping into machine learning and artificial intelligence.
Trading and investments
For several years, machine learning and artificial intelligence have been used within fintech to deliver robo-advisory functionality to investors. Web3 has taken inspiration from these applications of AI. Platforms source market prices, macroeconomic indicators and alternative data such as social media activity, generating user-specific insights.
The user typically sets their risk and returns expectations, with the recommendations from the AI platform falling within these parameters. The data required to deliver these insights is sourced by the AI platform using oracles.
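A toy sketch of how such parameters might gate recommendations is shown below; the asset names, scoring fields and thresholds are invented for illustration and do not reflect any particular platform's model.

```python
# Toy sketch of parameter-gated robo-advisory output. The asset data, risk
# scores and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Signal:
    asset: str
    expected_return: float  # annualized, e.g. 0.12 = 12%
    risk_score: float       # 0 (low) to 1 (high), e.g. derived from volatility models

def recommend(signals: list[Signal], max_risk: float, min_return: float) -> list[Signal]:
    """Keep only the signals that fit the user's stated risk and return expectations."""
    fits = [s for s in signals if s.risk_score <= max_risk and s.expected_return >= min_return]
    return sorted(fits, key=lambda s: s.expected_return, reverse=True)

signals = [Signal("BTC", 0.25, 0.8), Signal("ETH", 0.20, 0.7), Signal("T-bills", 0.04, 0.05)]
print(recommend(signals, max_risk=0.75, min_return=0.10))  # only the ETH signal fits
```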
Bitcoin Loophole and Numerai are examples of this AI use case. Bitcoin Loophole is a trading application that employs artificial intelligence to provide trading signals to platform users. It claims a success rate of over 85% in doing so.
Numerai claims it is on a mission to build “the world’s last hedge fund” using blockchain and AI. It uses AI to collect data from different sources to manage a portfolio of investments like a hedge fund would.
AI marketplace
A decentralized AI marketplace thrives on the network effect between developers building AI solutions at one end, and users and organizations employing these solutions at the other end. Due to the application’s decentralized nature, most commercial relationships and transactions between these stakeholders are automated using smart contracts.
Developers can configure the pricing strategy through inputs to smart contracts. Payment for using their solutions could be per data transaction, per data insight or a flat retainer fee for the period of use. There could also be hybrid pricing plans, with usage tracked on-chain as the AI solution is used. The on-chain activity would trigger smart contract-based payments for using the solution.
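The metering logic might look something like the off-chain sketch below; in a live deployment this bookkeeping would sit in a smart contract, and the plan names and rates here are hypothetical.

```python
# Off-chain sketch of usage-metered pricing for an AI marketplace. In practice
# this bookkeeping would live in a smart contract; the plan fields and rates
# shown are hypothetical.
from dataclasses import dataclass

@dataclass
class PricingPlan:
    per_transaction: float = 0.0   # fee charged per data transaction
    per_insight: float = 0.0       # fee charged per generated insight
    flat_retainer: float = 0.0     # flat fee for the billing period

@dataclass
class UsageMeter:
    plan: PricingPlan
    transactions: int = 0
    insights: int = 0

    def record(self, transactions: int = 0, insights: int = 0) -> None:
        """Record usage events (tracked on-chain in a real deployment)."""
        self.transactions += transactions
        self.insights += insights

    def amount_due(self) -> float:
        """Amount the contract would pay out to the developer for the period."""
        return (self.plan.flat_retainer
                + self.transactions * self.plan.per_transaction
                + self.insights * self.plan.per_insight)

# Hybrid plan: a small retainer plus per-insight fees.
meter = UsageMeter(PricingPlan(per_insight=0.05, flat_retainer=10.0))
meter.record(transactions=120, insights=40)
print(meter.amount_due())  # 10.0 + 40 * 0.05 = 12.0
```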
SingularityNET and Fetch.ai are two examples of such applications. SingularityNET is a decentralized marketplace for AI tools. Developers create and publish solutions that organizations and other platform participants can use through APIs.
Fetch.ai similarly offers decentralized machine-learning infrastructure for building modular and reusable solutions. Agents build peer-to-peer applications on this infrastructure. The economic layer across the entire data platform runs on a blockchain, enabling usage tracking and smart contract-based transaction management.
NFT and metaverse intelligence
Another promising use case is around nonfungible tokens (NFTs) and metaverses. Since 2021, NFTs have been viewed as social identities, with many Web3 users setting them as their Twitter profile pictures. Organizations like Yuga Labs have gone one step further, allowing users to log in to a metaverse experience using their Bored Ape Yacht Club NFT avatars.
As the metaverse narrative ramps up, so will the use of NFTs as digital avatars. However, digital avatars in today's metaverses are neither intelligent nor reflective of the personalities of the users behind them. This is where AI can add value. Intelligent NFTs are being developed to allow NFT avatars to learn from their users.
Matrix AI and Althea AI are two firms developing AI tools to bring intelligence to metaverse avatars. Matrix AI aims to create “avatar intelligence,” or AvI. Its technology allows users to create metaverse avatars as close to themselves as possible.
Althea AI is building a decentralized protocol to create intelligent NFTs (iNFTs). These NFTs can learn to respond to simple user cues through machine learning. The iNFTs would become avatars on its metaverse, named "Noah's Ark." Developers can use the iNFT protocol to create, train and earn from their iNFTs.
Several of these AI projects have seen their token prices rise alongside the ascent of ChatGPT. Yet user adoption is the true litmus test; only when it arrives can we be sure these platforms solve a real problem for users. These are still early days for AI and decentralized data projects, but the green shoots have emerged and look promising.