[SEO] Getting Google to Index Search Result Pages

Hey everyone,

I have an index page with a location search, as well as a load of filter buttons.

I have set it up so that the URL is dynamically updated as the search & filters are used.

E.g. if a user searches ‘London’, sorts by ‘highest price’, and selects the ‘vegan’ filter, the URL becomes:
mydomain.com/most-expensive-vegan-X-in-London

I want Google to be able to crawl my site and index each of the (many) URL permutations that my search results can create.

Do you know if this is possible, and if so, the best way to go about this?

Thanks very much for any help in advance!


Replying just to bump this one, because I’m also curious; someone recently asked something similar and I thought it was possible with the sitemap option on the SEO tab, but I never could confirm. Hoping someone who knows can help :slight_smile:
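For what it’s worth, if you know the filter values in advance, a sitemap of permutation URLs can be generated programmatically and submitted in Search Console. A minimal Python sketch, assuming made-up facets and the URL pattern from the original post (the domain and filter lists are placeholders, not anything your platform produces automatically):

```python
from itertools import product
from xml.sax.saxutils import escape

# Hypothetical search facets -- replace with your real sorts/filters/locations.
SORTS = ["most-expensive", "cheapest"]
DIETS = ["vegan", "vegetarian"]
CITIES = ["London", "Paris"]

def build_sitemap(base="https://mydomain.com"):
    """Return sitemap XML listing every sort/diet/city permutation."""
    urls = [
        f"{base}/{sort}-{diet}-X-in-{city}"
        for sort, diet, city in product(SORTS, DIETS, CITIES)
    ]
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

One caveat: the sitemap protocol caps a single file at 50,000 URLs, so tens of thousands of permutations may need to be split across files under a sitemap index.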

You can use the Google Indexing API to request a crawl of your dynamic URLs programmatically. See @NigelG’s video below; it’s extremely helpful.
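To give a sense of what that looks like in code, each notification is a single authenticated POST to the `urlNotifications:publish` endpoint. A stdlib-only Python sketch; obtaining the OAuth2 access token (e.g. via a service account with the `https://www.googleapis.com/auth/indexing` scope) is assumed and out of scope here, and note that Google officially supports this API only for job-posting and livestream pages:

```python
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, updated=True):
    """Build the JSON body the publish endpoint expects."""
    return {"url": url, "type": "URL_UPDATED" if updated else "URL_DELETED"}

def publish(url, access_token):
    """Send one URL notification. access_token must carry the
    https://www.googleapis.com/auth/indexing scope (token acquisition
    is not shown in this sketch)."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_notification(url)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Also worth knowing: the API has a daily publish quota (200 requests/day by default), so pushing tens of thousands of URLs through it would need a quota increase, or a sitemap instead.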


Thanks for the reply. The Indexing API looks interesting, but given that my search results will generate thousands (or tens of thousands) of dynamic URLs, I’m not sure manually requesting indexing for each one is very realistic!

This topic was automatically closed after 70 days. New replies are no longer allowed.