Are AI Bots Worth Blocking, or Should You Allow Them?

After OpenAI launched its chatbot ChatGPT, the company also revealed its web crawler, GPTBot. The crawler works much like Googlebot, and website owners can block it from accessing their sites through robots.txt. According to a study by Originality.AI, around 48% of websites have already blocked GPTBot. Google has since announced a separate Google-Extended token so that webmasters can block its AI tools without affecting its search crawler.
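For site owners who want to opt out, both companies document the user-agent tokens to target in robots.txt. A minimal sketch, assuming you want to block both AI crawlers site-wide while leaving ordinary search crawling untouched, might look like this:

    # Block OpenAI's GPTBot from the whole site
    User-agent: GPTBot
    Disallow: /

    # Block Google's AI training via the Google-Extended token
    # (Googlebot search indexing is not affected by this entry)
    User-agent: Google-Extended
    Disallow: /

Keep in mind that robots.txt is only a request: well-behaved crawlers honor it, but it does nothing against scrapers that ignore the file.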

The ongoing debate over AI bots

Since the launch of ChatGPT, there has been an ongoing debate about whether site owners should take advantage of the ability to block AI bots from accessing their sites. There is no single answer that satisfies every stakeholder, and the discussion persists among SEO practitioners and online publishers. A secondary question is whether blocking these bots has any practical effect at all, since many services scrape the web without permission.

One point in favor of this view is that OpenAI's crawler was announced quite late. The company may have already sourced data through many other methods, describing the material as widely available on the web. The concern is that companies can crawl under different bot names and through different vendors, so blocking a newly announced bot cannot take back data that has already been ingested.

This has also made it easier for scraper sites to churn out content from the very bots that were trained on your data, the human-produced original.

They want your content

Some may believe that AI companies do not want their content, but if these models one day produce content of the same caliber as yours, that is a serious concern for content-centric sites. It also raises questions about the effectiveness and relevance of SEO practices as we have known them. Another problem is that a large number of sites could end up producing virtually the same content. Considering these points, some industries may block AI bots on a large scale.

Experts who favor granting access to bots like ChatGPT's argue that the service is not used as a search engine but as an assistant, especially for code generation, translation, and content creation. They also point out that Bing's market share has increased by only about 1% since it integrated ChatGPT. For now, Google says its AI bot is separate, but it is unclear what will happen as Google moves to integrate AI into its search results.

Proponents also say it is not only about chatbots writing content: if the models mention your brand name, that exposure puts your brand in front of a wider audience. Blocking access, on the other hand, keeps your ideas out of the training data of future LLMs. However, there is a strong possibility that these same models will eventually generate clones of brands and products, which would amplify the deepfake problem.

The threat that AI models pose to SEO and Google is not as a direct competitor but as a tool for creating content at scale. This is disruptive for search, as it creates problems for Google and Bing and diverts traffic that would otherwise go to existing human-written content. In the end, the decision of whether to allow a bot comes down to the individual blogger or publishing firm, after weighing their own priorities.
