04/21/2023
In 2022, investors pumped $2.6 billion into generative AI startups, 70% more than the previous year.1 The November 2022 release of ChatGPT, OpenAI’s powerful and much-hyped new language model, further fanned interest. Like many, crypto enthusiasts have taken notice, turning to social media to share example after example of how ChatGPT might be applied to crypto and blockchain. Early use cases include coding trading bots and assisting with smart contracts.
The fusing of blockchain and generative AI is in its infancy, and while generative AI technology suffers from multiple issues, including accuracy problems, there are already some important ways it can improve blockchain development:
Coding. A modified version of Dune, a crypto data analytics platform, lets users type natural language queries and receive SQL code they can use to explore and share crypto data. For broader coding applications, GitHub’s Copilot for Business product gives enterprises access to its OpenAI-powered code completion tool.2 And now OpenAI’s latest model, ChatGPT, is also being used to write code. Fidelity employee Erman Akdogan, Director of Emerging Technologies for Enterprise Cybersecurity, experimented with ChatGPT and found it capable of writing Solidity code for Ethereum smart contracts, C++ for Bitcoin, Go for Hyperledger Fabric chaincode, and even code for other public chains like Tron and EOS. A social media user shared a video of how he used ChatGPT to generate Python code for analyzing crypto trade volumes.3 And while code generated by ChatGPT often needs modifications to work, skilled developers can still use its outputs to get started quickly and save time (see Figure 1).
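The kind of starter code described above can be sketched in a few lines of plain Python. This is an illustrative example with made-up trade data; it is not the code from the video or from Figure 1:

```python
from collections import defaultdict
from datetime import date

# Made-up trade records (date, price in USD, quantity) -- illustrative
# data only, not pulled from any real exchange.
trades = [
    (date(2023, 4, 1), 28000.0, 0.5),
    (date(2023, 4, 1), 28100.0, 1.25),
    (date(2023, 4, 2), 27950.0, 0.75),
]

def daily_volume(trades):
    """Sum notional trade volume (price * quantity) for each day."""
    volume = defaultdict(float)
    for day, price, qty in trades:
        volume[day] += price * qty
    return dict(volume)

print(daily_volume(trades))
# {datetime.date(2023, 4, 1): 49125.0, datetime.date(2023, 4, 2): 20962.5}
```

A skilled developer would still swap in a real data source and add error handling, which is exactly the "quick start, then modify" workflow the article describes.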
Optimizing code. Generative AI tools can also make code more efficient. When asked to optimize the secure code of Figure 1, ChatGPT produced new code that improved security, maintainability, and gas efficiency (see Figure 2). Gas efficiency matters because, beyond its environmental benefits, it reduces the fee required to conduct a transaction on Ethereum. And fixing security issues is critical for blockchain technology: code on a blockchain is publicly visible, making it easier for hackers to find vulnerabilities. When explicitly asked, ChatGPT can also optimize code by using existing libraries, which can further reduce bugs and improve security (see Figure 3).
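The link between gas efficiency and fees is simple arithmetic: the fee is the gas consumed multiplied by the price paid per unit of gas. A minimal Python sketch; the 30 gwei price and the 5,000-gas saving are hypothetical, while the 21,000-gas figure is Ethereum's standard cost for a plain ETH transfer:

```python
GWEI_PER_ETH = 10**9  # 1 ETH = 1,000,000,000 gwei

def tx_fee_eth(gas_used: int, gas_price_gwei: float) -> float:
    """Transaction fee in ETH: gas consumed times the price per gas unit."""
    return gas_used * gas_price_gwei / GWEI_PER_ETH

# A plain ETH transfer costs 21,000 gas; contract calls cost more.
base_fee = tx_fee_eth(21_000, 30)  # at a hypothetical 30 gwei gas price

# An optimization that trims a hypothetical 5,000 gas from a contract
# call saves this much ETH on every single transaction:
saving = tx_fee_eth(5_000, 30)
print(base_fee, saving)
```

Because contracts may be called millions of times, even a small per-transaction saving compounds, which is why gas optimization is a standard audit item.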
Explaining code. Generative AI tools are also useful to non-technical users and those new to blockchain development, as they can provide a simple explanation of code. As shown in the above figures, ChatGPT explains how and why it optimizes code. ChatGPT can also tailor its explanations to a developer’s skill level. For example, in Figure 4 below, ChatGPT explains the secure smart contract code of Figure 1 to someone with little technical knowledge. In a similar vein, this technology may also reassure hesitant blockchain users, such as those buying an NFT or participating in a DAO, that a smart contract does what it claims to do.
Debugging and auditing. 2022 was the biggest year ever for crypto hacking, with $3.8 billion stolen.4 Enter ChatGPT, which can not only help debug code but also audit it for vulnerabilities.5 In Figure 2’s optimization example above, ChatGPT made a small security fix by changing variables from public to private. When fed code deliberately seeded with errors, including an outdated Solidity version, a missing require statement, and an unsafe owner assignment, ChatGPT successfully fixed all the issues (see Figure 5). Smart contract security company Beosin tested ChatGPT and found that it could detect most problems, though it misjudged some, and a manual audit was still required to uncover more comprehensive issues.6 Perhaps this is where a human + machine solution works best: together, generative AI and a human auditor can catch the issues the other may have overlooked.
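The bug classes mentioned above, a missing require guard and an unsafe owner assignment, can be illustrated with a loose Python analogy. This is not Solidity and not the code from Figure 5; `ToyContract` is a hypothetical class in which raising an exception stands in for a Solidity require() reverting the transaction:

```python
class ToyContract:
    """Loose Python analogy for on-chain access checks; an exception
    here plays the role of a Solidity require() reverting."""

    def __init__(self, owner: str):
        # Owner is fixed at construction. The "unsafe owner assignment"
        # bug class is code that lets any caller overwrite this field.
        self._owner = owner
        self._balance = 0

    def deposit(self, amount: int) -> None:
        if amount <= 0:                        # require(amount > 0)
            raise ValueError("amount must be positive")
        self._balance += amount

    def withdraw(self, caller: str, amount: int) -> int:
        if caller != self._owner:              # require(msg.sender == owner)
            raise PermissionError("only the owner may withdraw")
        if amount > self._balance:             # require(amount <= balance)
            raise ValueError("insufficient balance")
        self._balance -= amount
        return amount
```

Deleting any one of these guard clauses reproduces the "missing require" bug: the call succeeds when it should have reverted, which is exactly the kind of omission an auditor, human or AI, needs to catch.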
But Caution Is Still Needed
Despite the speed, security, and learning and development benefits that generative AI technology brings to software development, some caution is warranted. Stack Overflow, a question-and-answer website for programmers, temporarily banned answers generated by GPT and ChatGPT because the answers often contained hidden mistakes that users failed to catch.7 In our own testing, when asked to use existing libraries to code a simple voting smart contract, ChatGPT used a library that doesn’t actually exist.
While we’re not far from these technologies becoming widespread and likely a standard part of software development, total reliance on generative AI technology is risky today, particularly for blockchain development, given that:
Smart contract code is less forgiving than traditional software. With traditional software, code can be updated to fix bugs that are found after it’s shipped. For smart contracts though, unless they’re written to be upgradeable, once the code is live, it can’t be updated. And, as mentioned above, code that’s hosted on a blockchain is publicly visible, so hackers can review it to find mistakes to exploit. What happens if your generative AI-written software includes a bug? In 2022, hackers stole $80 million by exploiting a logical error in Qubit Finance’s code.8
Combining two nascent technologies can be tricky. Generative AI is still in its early stages of development and may not be able to adapt to changes in the blockchain landscape as quickly as needed. 2022 saw the introduction of 30 new public blockchains as well as countless private chains running on different programming languages and with varying scalability and privacy properties.9 And currently, ChatGPT hasn’t been trained on any data after September 2021, while languages like Solidity are updated monthly. Those version changes can sometimes fix critical issues. For example, starting with version 0.8.0, Solidity builds SafeMath-style checks into the language itself, automatically checking arithmetic operations for integer overflow and underflow to help prevent attacks. What happens when yet another version is released, and how soon can ChatGPT adapt?
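Solidity 0.8's checked arithmetic can be emulated in a few lines of Python. The sketch below contrasts checked behavior with the pre-0.8 wraparound for a 256-bit unsigned integer; `checked_add` and `unchecked_add` are illustrative helpers, not Solidity built-ins:

```python
UINT256_MAX = 2**256 - 1  # largest value a Solidity uint256 can hold

def checked_add(a: int, b: int) -> int:
    """Solidity >=0.8.0 behavior: revert (here, raise) on overflow."""
    result = a + b
    if result > UINT256_MAX:
        raise OverflowError("uint256 overflow")
    return result

def unchecked_add(a: int, b: int) -> int:
    """Pre-0.8 behavior without SafeMath: silently wrap modulo 2**256."""
    return (a + b) % 2**256

print(unchecked_add(UINT256_MAX, 1))  # 0 -- the silent wraparound attackers exploit
```

A model trained only on pre-0.8 code might still bolt on a redundant SafeMath library, or worse, assume the silent-wraparound behavior in its reasoning about newer contracts.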
Real value is at stake. Blockchain code often involves money, making it especially vulnerable. In 2022, NFT artist Micah Johnson lost $34 million due to a bug in the code used to run a crypto auction.10 Johnson couldn’t withdraw the funds; nor could he refund money to the people who bid on NFTs but lost their auction. Not to mention that people are emotional about money—even small monetary losses can have large ramifications. Investors in FTX explained that they hadn’t just lost money, they felt like they had lost a shot at financial security.11
Questions To Consider
What happens as generative AI technology improves? Right now, AI-generated code may contain a bug or two, forcing users to review it rather than blindly trust it. How will we know when we’ve reached the point where a human in the loop is no longer necessary? Will AI eventually write code, then debug and audit it again and again until the code is fully optimized and bug-free? Until then, what is the right human + machine approach to software development? And what kinds of policies and best practices should development teams adopt when incorporating generative AI into their processes?
Where should companies begin testing generative AI technologies for blockchain development? How do we evaluate the best use cases? What metrics should we use to measure success, beyond time savings? Should we account for maintainability, gas efficiency, and security? And who should use it first? New developers who may need it most or skilled developers who can evaluate its flaws?
Could generative AI technology be a new form of rubber duck debugging? When a programmer is stuck on a bug, explaining the code line by line, even to a literal rubber duck, can sometimes surface the solution. Can AI be used in this way? In the same way that ChatGPT “explains” code to its users, can users explain their code back to the AI? Similarly, can we use generative AI to “debug” issues beyond coding, such as an accounting error, an HR issue, or an operational problem?
Views expressed are as of the date indicated, based on the information available at that time, and may change based on market or other conditions. The opinions provided are those of the author and not necessarily those of Fidelity Investments or its affiliates. Fidelity does not assume any duty to update any of the information. Fidelity and any other third parties are independent entities and not affiliated. Mentioning them does not suggest a recommendation or endorsement by Fidelity.
2 https://techcrunch.com/2023/02/14/githubs-copilot-for-business-is-now-generally-available/
3 https://twitter.com/tokenstate/status/1618278011966754820
4 https://www.cnn.com/2023/02/01/tech/crypto-hacks-2022/index.html
5 https://twitter.com/CleyfeETH/status/1619945089576607744; https://twitter.com/LionUnchainedTV/status/1619539639920136192
6 https://medium.com/@Beosin_com/can-chatgpt-the-most-powerful-ai-detect-vulnerabilities-in-smart-contracts-57609a6fd5df
7 https://www.msn.com/en-us/news/technology/stack-overflow-bans-chatgpt-as-substantially-harmful-for-coding-issues/ar-AA14VWSE
8 https://certik.medium.com/qubit-bridge-collapse-exploited-to-the-tune-of-80-million-a7ab9068e1a0
9 https://www.publish0x.com/vedao/future-of-public-blockchain-new-public-blockchain-inventory-xvmlwer
10 “The computer scientist who hunts for costly bugs in crypto code,” MIT Technology Review, https://www.technologyreview.com/2023/01/02/1064795/certik-ronghui-gu-crypto-computer-science/
11 Breland, A., “They weren’t rich but they wanted to invest. Then they lost everything on FTX,” Mother Jones.