Fixing a Google page indexing issue, using PrestaShop as an example
Let's take PrestaShop 1.7.8.x as an example. On a live store, you sometimes need to let search engines index internal search pages that have accumulated real search weight.
As a rule, such pages are closed to search robots in the markup of the page itself:
<meta name="robots" content="noindex">
And in the robots.txt file:
Disallow: /*?search_query=
Disallow: /*&search_query=
In the page markup, indexing can be opened in the corresponding controller, controllers/front/listing/SearchController.php (on a production store, prefer an override in override/controllers/front/listing/SearchController.php rather than editing the core file, and clear the class cache afterwards):
public function getTemplateVarPage()
{
    $page = parent::getTemplateVarPage();
    // was: $page['meta']['robots'] = 'noindex'; change it to 'index'
    $page['meta']['robots'] = 'index';
    return $page;
}
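Once the controller change is deployed, you can sanity-check the rendered HTML for the robots directive. A minimal sketch in Python (the helper name and the sample markup are illustrative, not part of PrestaShop; in practice the HTML would come from fetching a live search page):

```python
import re

def robots_directive(html: str):
    """Extract the content of the <meta name="robots"> tag, if any."""
    match = re.search(r'<meta\s+name="robots"\s+content="([^"]*)"', html, re.IGNORECASE)
    return match.group(1) if match else None

# Hypothetical sample instead of a live page fetch
sample = '<head><meta name="robots" content="index"></head>'
print(robots_directive(sample))  # index
```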
In the robots.txt file, the same pages are opened for indexing by commenting the corresponding lines out with a # symbol:
# Disallow: /*?search_query=
# Disallow: /*&search_query=
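This edit can also be scripted, for example as part of a deployment step. A sketch in Python (the rule list and the example file contents are assumptions; adjust them to your shop's robots.txt):

```python
# Disallow rules that block internal search pages (assumed set; adjust as needed)
RULES_TO_OPEN = {
    "Disallow: /*?search_query=",
    "Disallow: /*&search_query=",
}

def open_search_rules(text: str) -> str:
    """Comment out the targeted Disallow lines, leaving everything else intact."""
    return "\n".join(
        "# " + line if line.strip() in RULES_TO_OPEN else line
        for line in text.splitlines()
    )

# Hypothetical robots.txt contents before the edit
before = """User-agent: *
Disallow: /*?search_query=
Disallow: /*&search_query=
Disallow: /cart"""
print(open_search_rules(before))
```

Note that only the two search-query rules are commented out; unrelated rules such as the cart one stay in force.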
But even after all these changes, you may still get an error when submitting the page in Google Search Console:
To avoid this, ask Google to re-fetch the updated robots.txt via the robots.txt report in Google Search Console:
After that, Googlebot will be able to pick up the new page for indexing.