All pages are crawled but excluded in Google Search Console

Hey, I’ve created my first app, didn’t mess with robots.txt or anything, simply submitted the sitemap to Google Search Console, and a few days later it displays the pages as crawled but excluded (not indexed).

I’ve tried to look up what this means, and I found that pages are usually marked as excluded (and thus not indexed) when the website owner blocks them, either in robots.txt or somewhere else on the site.
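From what I read, that kind of blocking would look something like this (just illustrations of what I mean, not anything I’ve added to my site). A robots.txt rule that stops Google from crawling pages:

```
# robots.txt at the site root: this would block crawling of every page
User-agent: *
Disallow: /
```

Or a noindex tag in a page’s head, which lets Google crawl the page but tells it not to index it:

```html
<!-- in the page's <head>: asks search engines not to index this page -->
<meta name="robots" content="noindex">
```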

Since I haven’t done any of that, do you have any idea why the pages aren’t being indexed? What was your experience? Were your pages indexed right away?

Hi,
Have you managed to solve your problem? I have a similar one. I also have banned pages on GSC.

How long ago did you take your site live and submit your sitemap to Google? And can you clarify what you mean by ‘banned pages’? (A screenshot from GSC is probably the best way to show what you’re seeing.)

My pages were crawled a week ago, but they are excluded (‘excluded’, not ‘banned’ by Google; my English is poor). I did not modify the robots.txt file. I suspect this is a problem with the Bubble setup.

I have the same issue. Lighthouse says my header links can’t be crawled, and GSC has indexed none of my pages. From what I’ve been reading, the suggested culprit is the href attribute, but what I don’t understand is that my links do have an href attribute. I’m lost.
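From what I’ve been reading, Google only follows real anchor tags whose href points at an actual URL, so maybe it matters how the links are generated. A hypothetical comparison (not my actual markup):

```html
<!-- Crawlable: a real <a> element with a resolvable URL in href -->
<a href="/pricing">Pricing</a>

<!-- Not crawlable: no href, navigation handled entirely by JavaScript -->
<span onclick="location.href='/pricing'">Pricing</span>

<!-- Not crawlable: href exists but doesn't contain a real URL -->
<a href="javascript:void(0)">Pricing</a>
```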