Etherscan's Code Reader uses AI to unlock Ethereum contracts – here is what you need to know

On June 19, Etherscan, a popular Ethereum block explorer and analytics platform, unveiled a new tool called "Code Reader," which uses artificial intelligence (AI) to retrieve and interpret the source code of a specific contract address. Powered by OpenAI's large language model (LLM), Code Reader generates responses that offer insight into a contract's source code files. The Etherscan developers emphasized that using the tool requires a valid OpenAI API key and sufficient OpenAI usage limits, and assured users that the tool does not store their API keys.

“To use the tool, you need a valid OpenAI API Key and sufficient OpenAI usage limits. This tool does not store your API keys.”

Code Reader supports several use cases for users who want a deeper understanding of smart contract code through AI-generated explanations. It can produce exhaustive lists of functions related to Ethereum data, helping users grasp how the underlying contract interacts with decentralized applications (dApps). Once the contract files are retrieved, users can pick a specific source code file to read. The tool also lets users modify the source code directly in the user interface before sharing it with the AI, making it easy to experiment and iterate on the code.
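At its core, this workflow amounts to two API calls: fetching a contract's verified source code from Etherscan and passing it to an OpenAI chat model with a prompt. The sketch below is a rough approximation of that flow, not Etherscan's actual implementation; the contract address, model choice, and prompt wording are illustrative, and it assumes the `requests` library, the `openai` Python package (v1+), and `ETHERSCAN_API_KEY` / `OPENAI_API_KEY` environment variables.

```python
import os
import requests
from openai import OpenAI  # assumes openai>=1.0

ETHERSCAN_URL = "https://api.etherscan.io/api"


def fetch_contract_source(address: str) -> str:
    """Fetch a contract's verified source code from Etherscan's public API."""
    params = {
        "module": "contract",
        "action": "getsourcecode",
        "address": address,
        "apikey": os.environ["ETHERSCAN_API_KEY"],
    }
    result = requests.get(ETHERSCAN_URL, params=params, timeout=30).json()["result"][0]
    return result["SourceCode"]  # empty string if the contract is unverified


def explain_contract(source: str) -> str:
    """Ask an OpenAI chat model to list the contract's functions and summarize it."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {"role": "system", "content": "You explain Solidity smart contracts."},
            {"role": "user",
             "content": f"List this contract's functions and explain what it does:\n\n{source}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Illustrative address: the USDT (Tether) contract on Ethereum mainnet.
    source = fetch_contract_source("0xdAC17F958D2ee523a2206206994597C13D831ec7")
    print(explain_contract(source))
```

In practice, large contracts would need to be split into chunks to fit the model's context window, which is presumably one reason Code Reader lets users select individual source files rather than submitting everything at once.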

Etherscan vs AI

As the field of AI continues to grow rapidly, some experts have raised concerns about the feasibility of training current AI models on decentralized infrastructure. In a recent report, Singaporean venture capital firm Foresight Ventures suggested that computing power resources will be a pivotal battleground over the coming decade. While demand for training large AI models on decentralized distributed computing power networks is rising, researchers have identified significant constraints that hinder their development, including complex data synchronization, network optimization, and concerns over data privacy and security.

Foresight Ventures highlighted a specific example to illustrate these limitations. The report stated that storing a model with 175 billion parameters in single-precision floating-point representation (4 bytes per parameter) would require approximately 700 gigabytes. Distributed training, however, requires these parameters to be transmitted and updated frequently between computing nodes.

In a scenario involving 100 computing nodes, where each node must receive a full copy of the updated parameters at each unit step, the model would demand transmitting a colossal 70 terabytes of data per step (100 nodes × 700 gigabytes each). This requirement far surpasses the capacity of most existing networks, illustrating the immense challenges in implementing distributed training for large AI models.
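The arithmetic behind these figures is easy to verify; the short sketch below reproduces the report's numbers from first principles (4 bytes per single-precision parameter, 100 nodes).

```python
# Reproduce the report's back-of-the-envelope figures.
PARAMS = 175e9          # 175 billion parameters
BYTES_PER_PARAM = 4     # single-precision (FP32) float
NODES = 100             # computing nodes in the example

storage_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"Parameter storage: {storage_gb:.0f} GB")          # 700 GB

# At each step, every node must receive a full copy of the updated parameters.
traffic_tb_per_step = NODES * storage_gb / 1e3
print(f"Traffic per step:  {traffic_tb_per_step:.0f} TB")  # 70 TB
```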

“In most scenarios, small AI models are still a more feasible choice, and should not be overlooked too early in the tide of FOMO on large models.”

In light of these constraints, the researchers suggest that smaller AI models remain the more viable choice in many scenarios and should not be overlooked amid the current frenzy around large models. Advances in computing power and network infrastructure will likely ease some of these limitations over time, but for now, smaller models can still deliver substantial value and should not be prematurely dismissed in the pursuit of larger, more complex ones.
