Google Crawler not fetching content on pages

I am working on my SEO and noticed that the Google crawler doesn't actually fetch any of the content of my pages. If you use Google Search Console and run "Fetch as Googlebot" on a page, you'll see what I mean. None of the content (text/links/etc.) is fetched; it just sees all the "other" stuff, like meta descriptions and so on.
You can also see this by just using “view source” in your browser.
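For anyone who wants to check this outside Search Console, here's a rough way to test whether a page's raw HTML (what "view source" shows, and roughly what a non-JS crawler sees) is just an empty client-rendered shell. The heuristic and the 50-character threshold are my own, not anything official:

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Counts characters of visible text in raw HTML,
    ignoring the contents of <script> and <style> tags."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chars = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chars += len(data.strip())

def looks_like_empty_shell(html, threshold=50):
    """Heuristic: a client-rendered shell has almost no text in its raw HTML."""
    parser = VisibleTextCounter()
    parser.feed(html)
    return parser.chars < threshold

# A typical JS-app shell: meta tags present, but no real body text.
shell = ("<html><head><title>My app</title>"
         "<meta name='description' content='My app description'></head>"
         "<body><div id='app'></div>"
         "<script>/* renders everything client-side */</script></body></html>")

# A page whose content is actually in the HTML.
full = ("<html><body><h1>Welcome</h1><p>"
        + "Real page content. " * 10 + "</p></body></html>")
```

Running `looks_like_empty_shell` on the two samples flags the shell but not the full page.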

Since this is not my area of expertise at all, I tried to search a bit, and found this page which I think might relate to the problem:

As Google says in the article: Why is this important? It is important because search results are based in part on the words that the crawler finds on the page. In other words, if the crawler can't find your content, it's not searchable.

Am I missing something basic here, or might this be a big issue for bubble apps currently?


Seems like this is a general issue with JavaScript websites, and there are solutions for it.
Would this be a lot of work to implement @emmanuel ?

We do something like this already, sending the page to search engine crawlers once it's generated.

If you google Bubble you’ll see content from bubble’s pages.
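If I understand this right, it's the usual prerender pattern: detect a crawler from the User-Agent header and serve a saved HTML snapshot instead of the empty JS shell. A minimal sketch of that idea (the signature list and function names are illustrative, not Bubble's actual implementation):

```python
# Substrings commonly found in major crawlers' User-Agent strings.
# This list is illustrative, not exhaustive.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "yandex", "baiduspider", "duckduckbot")

def is_crawler(user_agent):
    """True if the User-Agent string looks like a known search engine bot."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def choose_response(user_agent, spa_shell, snapshot):
    """Serve the prerendered snapshot to bots, the normal JS app to browsers."""
    return snapshot if is_crawler(user_agent) else spa_shell
```

So a request from `Googlebot/2.1` would get the snapshot, while a normal browser gets the regular app.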

Ok, can you explain a bit further? :slight_smile:
In Google Search Console, it seems the content on the pages is not fetched by the crawler. But you're saying that a "static" version of the content is created and sent to the crawler every time we update a page?

When I google Bubble, I see only what might be meta descriptions. :slight_smile:

You don’t, you actually see Bubble’s page content.

Once the page is generated it’s sent to the crawlers. Like what prerender does.

Ok great to hear.
How does it work when a page is updated? Is it sent to the crawler every time we push to live?

We send it every few hours.
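Conceptually that's just a staleness check against a refresh interval. A sketch, assuming a fixed 6-hour interval (an assumption on my part; "every few hours" is all that's stated):

```python
from datetime import datetime, timedelta

# Assumed interval; the actual cadence isn't published.
SNAPSHOT_INTERVAL = timedelta(hours=6)

def snapshot_is_stale(last_snapshot, now, interval=SNAPSHOT_INTERVAL):
    """True if the prerendered snapshot should be regenerated and re-sent.

    last_snapshot: datetime of the last snapshot, or None if never taken.
    """
    return last_snapshot is None or now - last_snapshot >= interval
```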

Ok, sounds good. Thanks for the quick reply.

I seem to be having this same problem. @pnodseth, did the Search Console show your page content after a few hours? Or is it even supposed to?


I have a question on this. I have a few pages for which I define the H1, Title etc. on the basis of the parameters supplied to the page. Some of that logic is written in “Conditional” parts of components, and some in “On page load” workflows.

Will these be taken care of too while sending pre-rendered data to Google?
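For context, here's roughly what I mean by parameter-based titles: logic like the sketch below, which would have to run before (or while) the snapshot is captured for the result to end up in what Google sees. The `product` parameter name and title strings are made up:

```python
from urllib.parse import urlparse, parse_qs

def title_for(url, default="My App"):
    """Derive the page title from a URL parameter, mimicking the
    'Conditional' / 'On page load' logic so the result can be baked
    into the prerendered HTML. 'product' is a hypothetical parameter."""
    params = parse_qs(urlparse(url).query)
    name = params.get("product", [None])[0]
    return f"{name} | My App" if name else default
```

For example, `title_for("https://example.com/page?product=Widget")` yields "Widget | My App", while a URL without the parameter falls back to the default.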

Having the same issue - Google is crawling only some content / part of my page.

Has anyone found a solution to this? We’re knee-deep in this problem… it would benefit the Bubble community immensely!

I know there are solutions, because I’ve seen countless Bubble sites render correctly in Googlebot simulators.

So we just need to share the knowledge. No ‘gatekeeping’, as they say nowadays :grin:

+1 here. I'm using SEMrush and it won't even pick up H1 tags.

Anyone have updates on this?