Fixing a Google page indexing issue using PrestaShop as an example

Sometimes you need to return pages to the search index that were previously blocked, either in robots.txt or directly in the HTML code of the page itself. This is particularly relevant for PrestaShop, whose default configuration strictly prescribes which pages search engines may crawl. The need usually arises after a change in the store's structure or functionality.

Let's take PrestaShop 1.7.8.x as an example. On a live store, you want to index the internal search pages, which have accumulated a certain amount of search weight over time.

As a rule, such pages are closed to search robots in the code of the page itself:
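In PrestaShop 1.7, the theme's head template renders the value of $page.meta.robots into a robots meta tag, so a blocked page typically contains something like the following in its <head> (the exact markup may vary by theme; this is an illustrative snippet):

<meta name="robots" content="noindex">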

And in the robots.txt file:

Disallow: /*?search_query=

Disallow: /*&search_query=

For the code of the page itself, the pages can be opened for indexing through the corresponding controller, controllers\front\listing\SearchController.php:
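In PrestaShop 1.7.x, SearchControllerCore sets the robots value to noindex in its getTemplateVarPage() method. Instead of editing the core file in place, a cleaner way is a class override. Here is a minimal sketch, assuming a standard installation (after creating the file, delete var/cache/prod/class_index.php so PrestaShop picks up the override):

<?php
// File: override/controllers/front/listing/SearchController.php
// Overrides the core search controller to allow indexing of search pages.

class SearchController extends SearchControllerCore
{
    public function getTemplateVarPage()
    {
        $page = parent::getTemplateVarPage();

        // The core controller sets 'noindex' here; change it to 'index'
        // so the search pages become visible to search robots.
        $page['meta']['robots'] = 'index';

        return $page;
    }
}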

For the robots.txt file, the pages are opened by commenting out the corresponding rules with a # symbol at the start of each line:

# Disallow: /*?search_query=

# Disallow: /*&search_query=

However, even after all these changes, you may still get an error when submitting the page in Google Search Console:


To resolve this, request a fresh crawl of your robots.txt file through Google's robots.txt Tester:

https://www.google.com/webmasters/tools/robots-testing-tool?siteUrl=

After that, Googlebot will pick up the updated rules and can add your page to the index.

