SEO: Duplicate Content: Self Rel Canonical?

I discovered that appending an arbitrary string to a page path will still render the same page over HTTPS.

I’m new to Bubble, but I was looking for SEO issues that might talk me out of using this platform, and I believe I have found one.

For example: the same page and content will load for any other trailing string appended to the path.

Is there some rel canonical tag that can be implemented or a work around?


Not for the exact function rel canonical tags perform as it relates to Google, but in terms of getting a user to the page you want them to view, yes.

Using the “contains” function, and setting a condition like “this URL contains” with the portion you want recognized (“cd-players” in your example), you can then redirect the user to the main page you want them to view.

But this won’t help with the duplicate-content issue itself, as both pages would still exist in your app. However, with more thought and proper setup, you could avoid ever having the duplicate page and always navigate the user to the one page you want them to view, regardless of where in the app they are or how they might be searching.
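Outside Bubble, the redirect logic described above can be sketched as a small normalization step. This is only an illustration of the idea; the slug map and function name below are hypothetical, not Bubble features:

```python
# Sketch of the "this URL contains" redirect idea: if a requested
# path contains a known slug, send the visitor to the one canonical
# page instead of serving a duplicate. The slugs and targets here
# are made-up examples.

CANONICAL_PAGES = {
    "cd-players": "/cd-players",  # example slug from this thread
}

def resolve_redirect(path: str):
    """Return the canonical path to redirect to, or None if the
    request already matches a canonical page exactly."""
    for slug, canonical in CANONICAL_PAGES.items():
        if slug in path and path != canonical:
            return canonical
    return None

print(resolve_redirect("/cd-players13r30t23151"))  # -> /cd-players
print(resolve_redirect("/cd-players"))             # -> None
```

In Bubble terms, the dictionary lookup corresponds to the “this URL contains” condition and the return value is the page you send the user to.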

Is there a specific use case for wanting rel canonical tags, or are you just searching for SEO drawbacks to Bubble in general?


Hey, thanks for the reply; that’s a nice way of narrowing down on intent to deliver the right page. I’m starting to think the core search function for building the web app may be the way to go here.

I’ve seen that many e-commerce websites will typically use a rel canonical tag pointing back to the page itself to deal with requests that are “not planned for,” or, as in the case you mentioned above, use “this URL contains” for the user’s sake.
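For context, a self-referencing canonical tag is just a link element in the page’s head pointing at the page’s own clean URL, so any request with a junk trailing string still declares the clean URL as canonical (example.com is a placeholder):

```html
<!-- Placed in the <head> of the rendered page. Whatever trailing
     string the page was requested with, the canonical always points
     at the one clean URL (the domain here is a made-up example). -->
<link rel="canonical" href="https://example.com/cd-players" />
```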

In my SEO career I’ve seen many log files where Googlebot will not only request random strings in a URL path, but sometimes index them, causing more technical SEO problems, ones that can really affect topical authority.

I like the power Bubble has; I’m just scared of scaling and regretting it. But the solution you explained should also stop Google, because Googlebot would be issuing the same requests a user would, based on the elements within the UX/UI.

I’ll give that solution you mentioned a whirl and report back in a week to see how initial indexing goes. Thanks!


Okay, so after doing some testing, I figured out a way to keep Googlebot and other bots from making trailing-path requests as described in my initial post.

Also, after some deep thought: something that has plagued my SEO mind for a while is nesting at all. After studying Amazon’s site architecture, I noticed they are able to pull off ranking for terms that are not really close to their domain’s topical authority. There’s a connection here as well: I’ve noticed in the SERPs that other sites following this trend (placing the main paths at the root) rank better than sites nesting content three levels deep from the root.

So I used a testing site to find a sure way to keep Googlebot from making requests for trailing paths like https://amazon/cd-players13r30t23151 ; see the example below…

In the attached image below here, I just looked at Amazon’s robots.txt file and tested out some directives:

  1. Disallow: /gp/dmusic/promotions/PrimeMusic*
  2. Disallow: /gp/dmusic/promotions/PrimeMusic/*
  3. Allow: /gp/dmusic/promotions/PrimeMusic/$
  4. Allow: /gp/dmusic/promotions/PrimeMusic$
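To see how Google-style wildcard matching treats these four directives, here is a minimal Python sketch, assuming Google’s documented matching rules: `*` matches any run of characters, `$` anchors the end of the path, the longest matching rule wins, and a tie goes to Allow. The helper names are my own:

```python
import re

# Minimal sketch of Google-style robots.txt matching for the four
# directives listed above. Not a full robots.txt parser: it only
# handles '*' wildcards, '$' end anchors, longest-rule precedence,
# and the Allow-wins tiebreak.

RULES = [
    ("disallow", "/gp/dmusic/promotions/PrimeMusic*"),
    ("disallow", "/gp/dmusic/promotions/PrimeMusic/*"),
    ("allow",    "/gp/dmusic/promotions/PrimeMusic/$"),
    ("allow",    "/gp/dmusic/promotions/PrimeMusic$"),
]

def _pattern_to_regex(pattern: str):
    """Translate a robots.txt path pattern into a compiled regex."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile(regex + (r"\Z" if anchored else ""))

def is_allowed(path: str) -> bool:
    matches = [(len(p), kind) for kind, p in RULES
               if _pattern_to_regex(p).match(path)]
    if not matches:
        return True  # no rule matched: crawling is allowed
    best_len = max(length for length, _ in matches)
    kinds = {kind for length, kind in matches if length == best_len}
    return "allow" in kinds  # tie between Allow and Disallow goes to Allow

print(is_allowed("/gp/dmusic/promotions/PrimeMusic"))        # True
print(is_allowed("/gp/dmusic/promotions/PrimeMusic/"))       # True
print(is_allowed("/gp/dmusic/promotions/PrimeMusic/abc123")) # False
```

The takeaway is the one described in this post: the exact paths stay crawlable because the `$`-anchored Allow rules tie with the Disallow rules and win, while any trailing junk string only matches the Disallow wildcards and gets blocked.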


I tested this against my own site, following the same patterns, and verified the directives in Google Search Console; they worked perfectly.

As far as the duplicate-content issue of having a page vote for itself: with the correct robots.txt directives, you can avoid having Google request anything that would duplicate itself.
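Applied to the cd-players example from the start of this thread, the same robots.txt pattern might look like this (the path is a made-up illustration, not a directive from my actual site):

```
User-agent: *
Disallow: /cd-players*
Allow: /cd-players$
```

Here the exact path /cd-players stays crawlable (the `$`-anchored Allow ties with the Disallow and wins), while any trailing junk string only matches the Disallow wildcard and is blocked.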

  • It’s better to control the architecture early on. I will still post back in a different post, as this one will close out in 2 months.

Is this the site where you put the directives, and does that site send the directives to Googlebot?

Or does this site create the robots.txt directives for you, and then you need to place them somewhere on your site? If so, where in your Bubble site are you placing them?

This tool will indeed test the live robots.txt file on your site, or really any site. In this example I’m studying Amazon, but it’s best to follow along in the context of an app site to get the most out of your content.

Bubble has the robots.txt field located in the Settings section, under SEO.

The tool can go both ways. It’s a little jittery, but it works; you then paste the directives into Bubble’s robots.txt field. I tested them in Google Search Console and those wildcard expressions work. I found a few websites that go over how these work; Google is tricky, but there’s nothing better than testing live for yourself.

  • On a side note: something I’ve noticed in the last year is that Google seems to be really devaluing websites that silo too deep. I think it’s time to rethink the long tail and go with the root for the categories or search features of these web apps.

  • The biggest SEO growth hack today is AdWords, and this big BERT algorithm seems to quickly figure out all the other related organic long-tail keywords to rank you for. So once again, paid drives all; with these apps being as high-powered as they are, the Content-is-King model will hit eventually!

But I still don’t want Google making up URLs, and that goes for any other bots too, for the sake of server-load performance. I will report back on this and the log files to see what Google tries next. :slight_smile:


I’ve never really observed that before, but it is actually true; I checked it. I never minded learning the process of doing SEO, but programming is a foreign field for me (I know SEO isn’t programming, but sometimes I feel like my computer and I speak different languages). So I chose to buy SEO services for my recently created website. Its design wasn’t the best, and because of that its ranking was also low. Since I needed both a new design and search engine optimization, the only option was to hire the #1 seo malaysia; they helped me with everything I needed, and I’m ready to buy their services again if I need to.

This topic was automatically closed after 70 days. New replies are no longer allowed.