I’ve recently installed the Subsy Plugin to get SEO-friendly URLs, and it works perfectly.
But since this is basically a workaround to get nice URLs, web crawlers won’t discover these “pages” on their own.
Is there a way to add a custom sitemap.xml file to manually indicate that these pages should be crawled too?
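For reference, here’s the kind of sitemap.xml I have in mind — the URLs below are just placeholders standing in for the plugin-generated pages, not my actual ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per SEO-friendly page generated by the plugin (placeholder URLs) -->
  <url>
    <loc>https://example.com/product/blue-widget</loc>
  </url>
  <url>
    <loc>https://example.com/product/red-widget</loc>
  </url>
</urlset>
```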
Thanks for the post and great question! While I haven’t personally worked with that plugin, I believe what you’re looking for would be found in this section of our manual. Specifically, I’d pay close attention to the Robots and Sitemap sections as they’ll walk through where you can make those changes.
An additional thought would be to search the forum for users who are pushing the boundaries in these areas; they might be able to lend some specific advice based on what they’ve done with those two settings.
And of course, you’re always welcome to email us directly at Support@Bubble.io.
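In case it helps in the meantime: independent of Bubble or any plugin, the standard way to tell crawlers about a custom sitemap is to reference it from your robots.txt with a Sitemap directive. I can’t speak to exactly how the Robots and Sitemap settings in the editor expose this, but the end result crawlers look for is along these lines (URL is a placeholder):

```text
# robots.txt — the Sitemap line points crawlers at your custom sitemap
Sitemap: https://example.com/sitemap.xml
```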
Thanks for answering.
I believe your post is missing the link to the manual section you’re talking about.
I will try my best to find answers from those users on the forum.