Opinion: If you used ChatGPT to write an article, do us all a favor and delete it. None of us want to read it.
There was a time when anything generated by ChatGPT was fascinating.
To fathom that a super-intelligent robot was writing poetry, drafting legal documents or even acting as a personal life coach was mind-boggling.
It was as if each prompt were a window into a kind of shared human consciousness.
That was the novelty then, at least. But it’s been nine months since ChatGPT launched and almost five months since the release of GPT-4.
ChatGPT has become the go-to strategy for churning out useless content
In May, a crypto firm hoping for coverage shared a 10-minute video presentation it had whipped up about artificial intelligence’s role in boosting business productivity.
Ironically, the presenter was an AI-generated avatar reading from a script most likely lifted straight out of ChatGPT. It took all of 20 seconds to close the video.
Oh, was it written by a bot? No worries, send it to my AI transcriber.
Another prime example came in recently from a colleague who received a third-party pitch offering an article on how to use MetaMask more efficiently.
The 1,159-word explainer discussed six cold wallets that could be connected to MetaMask, a potentially relevant subject given the recent spate of hot wallet hacks, such as the $23 million hack of the Bitrue crypto exchange on April 14.
At first glance, the article was decently written. It had an introduction, a sub-introduction and six subheadings for each individual cold wallet, followed by a conclusion. There wasn’t a single grammatical error.
Unfortunately, it reads like the nutritional label of a cereal box
It was packed with facts but devoid of any personality, flair or human element, and it drew no impactful conclusions.
“Each of these wallets has its unique features and advantages, so it’s important to do your research and choose the one that best suits your needs,” was its fireworks ending.
Surprise, surprise. The article came back on an AI detection tool as 74.2% AI/GPT generated. Even the cover letter that pitched the idea wasn’t authentic — scoring a whopping 93.57% on ZeroGPT.
Crypto news aggregators are also starting to feature more of these articles, with headlines along the lines of: “We asked ChatGPT… and this is what it said.”
Get ready for a hell of a lot more
Unfortunately, that trend is only the latest sign of the flood of AI-generated news content to come.
In May, misinformation watchdog NewsGuard identified 49 websites spanning seven languages that appeared to be entirely or mostly generated by artificial intelligence language models, most of them posing as independent news outlets.
By Aug. 9, that number had rocketed to 408, with these fake news websites spanning 14 languages. There’s no telling how high it will climb by year’s end.
So, here’s my non-AI-generated conclusion.
Generative AI tools such as ChatGPT are game-changing technology that nearly anyone with an internet connection can wield.
But if you’re a crypto writer, do me a favor: Submit your typo-laden rants, share your clickbait articles, and forward your poorly disguised advertorials.
Just make sure it’s your real work — not this ChatGPT-generated trash.
This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.