Search Engine Optimization Experts Weigh In On Whether It’s a Smart Move for Your Business
In its original form, ChatGPT could not independently access internet content; it relied on previous training data or on information fed to it through plugins. However, with the introduction of an official ChatGPT-hosted web browser, that paradigm may soon shift, allowing the model to pull information directly from the internet – including from your business’s website.
This update has raised questions regarding copyright and plagiarism, but is AI truly in violation? And should businesses block OpenAI’s bot via their robots.txt files?
Marketers and business owners have been wondering how to protect not only their website content but also the integrity of their search engine optimization work. Keep reading to learn whether this update to one generative AI large language model is worth the worry.
Yes, You Can Block OpenAI’s Bot
OpenAI has shared details about the ChatGPT web browser and its plugin bot, including the fact that it is possible to block it. But there are a few things you should know before deciding to do so.
- Requests to crawl your website are triggered by a direct user request; someone using ChatGPT must prompt the visit.
- The ChatGPT bot won’t crawl the web like a search engine.
To keep it simple, this means that the AI software isn’t accessing your website at random; it must be manually told to do so.
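If you do decide to block it, the change happens in your site’s robots.txt file. As a rough sketch – and note that the exact user agent tokens are defined by OpenAI, so confirm the current names in OpenAI’s own documentation before relying on them – a robots.txt that blocks OpenAI’s crawler while leaving other bots alone might look like this:

```
# Block OpenAI's crawler (token per OpenAI's documentation; verify it is current)
User-agent: GPTBot
Disallow: /

# All other crawlers, including search engines, remain unaffected
User-agent: *
Allow: /
```

Keep in mind that robots.txt is a voluntary convention: compliant crawlers honor it, but it is not an access control mechanism, and blocking a bot here has no effect on how search engines rank your site.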
But Should You Block the ChatGPT Bot?
There is truly one major reason that business owners and marketers may worry about a large language model gleaning information from their websites: the question of copyrighted work.
The fear that a robot will take content from your website and supply it as an output for another user – essentially committing plagiarism – is a valid one. As Search Engine Land points out, it’s also a debate that’s “been raging for a while and could easily take 20,000 words to dig into.”
But it’s not just the plagiarism that hurts; it’s the fact that duplicate content on another website could tank your site’s search engine optimization efforts and rankings in search engine results pages.
Kansas City SEO experts, and the greater search engine optimization community as a whole, are likely to advise taking a more measured approach: avoid blocking new technology until we have more information about how it works. It reminds us of other machine learning tools digital marketing agencies were once hesitant to adopt; there is potential good that can come of it.
Predicting the Benefits of Allowing the Plug-In to Crawl Your Website
Let’s examine a scenario where you could get increased traffic to your website simply by allowing the ChatGPT bot to crawl your website.
First, someone elsewhere in the world uses ChatGPT to glean information from your site. ChatGPT attributes that information back to your website, using a hyperlink as the citation. When users see the response, they click the hyperlink to find the original source of the information.
This organic linking strategy can increase traffic and visit durations, both of which are great for building authority.
AI may also change the way we use search engines. “AI will be a new starting point for many web users,” predicts Ryan Jones of Search Engine Journal.
The ChatGPT plugin could help you reach new users who wouldn’t have found you otherwise. Jones goes on to predict that AI could simply become a new acquisition channel. And we’ve adopted new acquisition channels before; just take a look at social media, Google Performance Max campaigns, and the latest developments in OTT video ads on streaming platforms.
The Effects of AI on Search Engine Optimization
As the debate about AI rages on, one of the biggest concerns digital marketers have raised – and asked many questions about – is whether AI-generated content will negatively affect search rankings, crawler bot aside.
Google Search Advocate John Mueller has repeatedly stated that AI-generated content is against Google’s guidelines. Yet he has also said that as long as AI-generated content is produced with the intent to be helpful to searchers, a website won’t receive manual penalties – that is, penalties issued by a human reviewer working for Google. These seemingly contradictory statements make it all the more difficult to decide how to use new marketing technologies to your business’s benefit.
We think it raises the question: if everyone uses generative AI, will Google penalize everyone? Perhaps we’ll discover the answer in the future, but until then, there is much confusion among inexperienced marketers and business owners about what is and isn’t allowed.
This confusion is why it’s important to work with a digital marketing agency that stays abreast of the latest pivots from the internet giant, so you can focus on running your business without having to adapt your SEO strategy every time Google announces a new update.
Related Reading on Generative Artificial Intelligence
“5 Reasons AI Will Never Beat a Human Copywriter”
“Your Copywriter Makes or Breaks Your Search Engine Optimization. Here’s How.”
Related Reading on Search Engine Optimization
“Are you Using These Types of Search Engine Optimization Keywords in Your Web Content?”
“What Does It Mean to Optimize Your Website for SEO?”
“Content Intent: 3 Types You Need in Your Business’s Blog”
Peter Mishler, senior digital copywriter at iFocus Marketing, is dedicated to using language to help clients find their audiences. He is the author of two books: a Kathryn A. Morton Prize-winning collection of poetry from Sarabande Books (2018) and a book of reflections for public school teachers from Andrews McMeel Publishing (2021). His poetry has appeared in many national publications, including The Paris Review, and he is a contributing editor for Literary Hub. Before moving into marketing, Peter taught English for 15 years.