josh24
Right, this makes sense. So in theory you’d need to go down the JavaScript route to solve that issue.
However, if I had to put money on it, I’d expect that while the custom JS will rewrite those fields, it’ll fire too late: the bots will have already picked up the previous data, and you’re back to square one.
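For context, a client-side rewrite like that is usually just a script that fires after the page loads, something along these lines (a minimal sketch; the title and description values are placeholders, not anything Bubble-specific). A bot reading the raw HTML response never waits for this to run:

```javascript
// Sketch: overwrite the title and meta description after the page loads.
// Placeholder values; in practice these would come from your page's data.
document.addEventListener('DOMContentLoaded', function () {
  document.title = 'My dynamic page title';

  // Find the meta description tag, or create one if it's missing
  var meta = document.querySelector('meta[name="description"]');
  if (!meta) {
    meta = document.createElement('meta');
    meta.setAttribute('name', 'description');
    document.head.appendChild(meta);
  }
  meta.setAttribute('content', 'My dynamic page description');
});
```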
The way to get around this might be to switch off the auto-generated sitemap in Bubble, generate your own containing all of the URLs you want indexed, and upload that to Search Console. The bots are then more likely to work from that list of URLs, loading each one as a separate page, rather than trying to crawl through the site itself. That way you don’t have to worry about rewriting those fields on the fly.
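For what it’s worth, the file itself is just standard sitemap XML, so you don’t need anything fancy to build it. A throwaway Node script like this would do it (the URLs are placeholders for whatever pages you actually want indexed):

```javascript
// Sketch: generate a sitemap.xml from a hand-picked list of URLs.
const fs = require('fs');

// Placeholder URLs; swap in the pages you want indexed
const urls = [
  'https://example.com/',
  'https://example.com/page-one',
  'https://example.com/page-two',
];

const xml =
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  urls.map((u) => '  <url><loc>' + u + '</loc></url>\n').join('') +
  '</urlset>\n';

fs.writeFileSync('sitemap.xml', xml);
```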
It’s not an exact science and ultimately the bots will do as they please, but it’s probably got a decent chance of working, since it’s the easier path for them anyway. If it doesn’t work, and they crawl the sitemap and then crawl the site manually and overwrite everything, you could look at injecting some custom code to mark those internal links as nofollow, which increases the chances that they’ll just stick to the sitemap (they do whatever the $%^& they like anyway haha). I’m not an SEO expert, but I think that opens up another can of worms of its own.
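If you do end up experimenting with the nofollow idea, the injected snippet would be something along these lines (again just a sketch, and the same caveat applies: a crawler reading the initial HTML may never see a rel attribute added client-side):

```javascript
// Sketch: mark every internal link as nofollow once the page has loaded.
document.addEventListener('DOMContentLoaded', function () {
  document.querySelectorAll('a[href]').forEach(function (link) {
    // Only touch same-origin links; leave external ones alone
    if (link.hostname === window.location.hostname) {
      link.setAttribute('rel', 'nofollow');
    }
  });
});
```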
Josh @ Support Dept
Helping no-code founders get unstuck fast: save hours & ship faster with an expert, on-demand