Fixing a Google page indexing issue, using PrestaShop as an example

As an example, let's take PrestaShop 1.7.8.x. On a running store, you want to get the search pages that have acquired a certain search weight indexed.
As a rule, such pages are closed to search robots in the code of the page itself:
<meta name="robots" content="noindex">
And in the robots.txt file:
Disallow: /*?search_query=
Disallow: /*&search_query=
In the page code itself, the page can be opened for indexing through the corresponding controller, controllers\front\listing\SearchController.php:
public function getTemplateVarPage()
{
    $page = parent::getTemplateVarPage();

    // $page['meta']['robots'] = 'noindex'; -- change it to 'index'
    $page['meta']['robots'] = 'index';

    return $page;
}
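Rather than editing the core controller file directly (such changes are lost on a PrestaShop upgrade), the same change can be made through PrestaShop's override mechanism. The sketch below is an assumption based on the standard override layout: the file path (override/controllers/front/listing/SearchController.php) should be adapted to your installation's conventions, and the class cache (typically var/cache/*/class_index.php) has to be cleared afterwards so the override is picked up.

<?php
// Assumed location: override/controllers/front/listing/SearchController.php
// Minimal override sketch: open the search results page for indexing
// without modifying the core SearchControllerCore class.
class SearchController extends SearchControllerCore
{
    public function getTemplateVarPage()
    {
        $page = parent::getTemplateVarPage();

        // Core sets 'noindex' for search pages; switch it to 'index'.
        $page['meta']['robots'] = 'index';

        return $page;
    }
}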
In the robots.txt file, the pages are opened for indexing by adding a # symbol in front of the corresponding lines, commenting them out:
# Disallow: /*?search_query=
# Disallow: /*&search_query=
But even after all these changes, you may still get an error when submitting the page in Google Search Console:

To avoid this, request a fresh fetch of your updated robots.txt in Google's robots.txt testing tool:
https://www.google.com/webmasters/tools/robots-testing-tool?siteUrl=
After that, the Googlebot will be able to add your new page to the index.
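As a quick sanity check before resubmitting the page, you can confirm that the robots.txt actually served by the store no longer contains active Disallow rules for search_query. A minimal sketch (the store URL and file name are placeholders, not from the original article):

<?php
// check_robots.php -- hypothetical helper: fetch the live robots.txt
// and report any still-active Disallow rules that mention search_query.
$robots = file_get_contents('https://your-shop.example/robots.txt'); // replace with your store URL

foreach (explode("\n", $robots) as $line) {
    $line = trim($line);
    // Lines commented out with '#' no longer block indexing;
    // only lines that still start with 'Disallow:' matter here.
    if (strpos($line, 'Disallow:') === 0 && strpos($line, 'search_query') !== false) {
        echo "Still blocked by: $line\n";
    }
}
echo "robots.txt check finished.\n";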