

Google unable to see content behind captchas




Google says that websites can run into problems if they hide their content behind captchas, because captchas bar web crawlers from seeing that content. Googlebot does not interact with page elements such as captchas while crawling a webpage.

If it lands on a page where a captcha blocks the main content, it will assume the captcha is the only thing present on that page.

There are ways around this; although captchas can be problematic for crawling, there is no reason to stop using them. That is what Google's John Mueller stated during a Search Central SEO session.

The owner of a directory site asked whether the captcha they had added to prevent scraping would impact their SEO. In short, Mueller answered yes, it can. However, there is a way to use captchas that does not interfere with crawling or indexing.

Mueller made it clear that Googlebot does not fill out captchas. If a captcha must be completed before the content is visible, Googlebot will not crawl that content. It may still index the page, but none of the content behind the captcha will count toward ranking.

Mueller further said that captchas can be used safely as long as they do not block access to the main content. To verify that a captcha is not blocking Google's view of a page, he recommended using the URL Inspection tool in Search Console.

If you want to block your content completely with captchas while keeping it Google-friendly, you can do that too. It involves a technique that may seem to go against Google's guidelines, but Mueller confirmed that it does not violate any policy.

Serve Googlebot a different version of the page than the one regular users get: Googlebot can receive a captcha-free version, while users must complete the captcha before viewing any content. This allows Googlebot to use your content for ranking.
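As an illustration, the approach above could be sketched as a server-side check on the request's User-Agent header. This is a minimal sketch with hypothetical helper names, not Google's or any framework's API; a production setup should also verify that the request really comes from Googlebot (for example via reverse-DNS lookup), since the User-Agent header can be spoofed.

```python
# Hypothetical sketch: route Googlebot to captcha-free content while
# regular visitors see the captcha challenge first.

def is_googlebot(user_agent: str) -> bool:
    """Return True if the request claims to come from Googlebot.

    Note: the header alone is spoofable; real deployments should
    confirm the crawler's identity with a reverse-DNS check as well.
    """
    return "googlebot" in user_agent.lower()

def render_page(user_agent: str, content: str) -> str:
    """Serve full content to Googlebot, a captcha gate to everyone else."""
    if is_googlebot(user_agent):
        return content                # crawler sees the indexable content
    return "<captcha-challenge>"      # users must solve the captcha first
```

The key design point, per Mueller, is that the crawler-facing version exposes the same main content that users get after solving the captcha; it only removes the gate, rather than showing the crawler different content.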
