I customized the robots.txt because I had problems with Google Search Console using the default settings. I don’t know why, but Google was telling me something was wrong (without telling me exactly what).
In order to have different results, you have to do different things.
So I decided to customize it because, according to the forums, robots.txt was probably the problem (it was).
The question should be why it failed in the first place. I had invested two days trying to solve this silly thing (I'm not a technical guy, like 75% of Bubble's users).
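For anyone hitting the same wall: a fully permissive robots.txt, which tells every crawler it may fetch everything, is just the two lines below (a generic example, not necessarily character-for-character what I ended up with).

```
# Allow all crawlers to fetch everything; an empty Disallow blocks nothing
User-agent: *
Disallow:
```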
I believe the standard robots.txt (without any customization) reads as below:
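```
# Keep crawlers out of the test version of the app
User-agent: *
Disallow: /version-test
```

(That's from memory, so the exact file Bubble serves may differ slightly, but the key part is the Disallow on /version-test.)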
It makes sense because you don't want Google to index your test version (you only want Google to index your deployed, live version). Are you saying it resulted in a problem?
Yes, that’s the standard robots.txt.
You can check my SEO settings in the first post. I used the standard robots.txt.
And yes, it resulted in a problem. The Google message said: "Google does not recognize this URL."
Don't ask me why, but it was a nightmare. Luckily I solved it.
If Bubble’s standard robots.txt prevents Google from recognizing a Bubble site, then that’s a problem that would be experienced by the majority of people with Bubble sites. (The standard robots.txt would block indexing of the test site, but no one should want their test site indexed anyway.)
To confirm that this was the problem, have you tried reverting the robots.txt back to standard and seeing whether you experience the same issue? Also, could you post a link to the forum entry that said robots.txt was the issue?
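One quick sanity check, independent of Search Console, is to fetch the file directly and see exactly what crawlers get (the domain below is just a placeholder; replace it with your app's own):

```
# Print the robots.txt that crawlers actually receive from the live site
curl https://yourapp.bubbleapps.io/robots.txt
```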