Soft 404s hurting SEO

I have a page linked to a “Products” data type. These products are unique: once one is sold, it's gone forever and I delete the thing from my DB. On the page, I redirect to the 404 page if the page's thing is empty.

The problem is that Bubble won't send an HTTP 404 status code; it still sends 200 OK. Google considers this a soft 404, and soft 404s are bad: Google will keep crawling these pages, wasting your crawl budget, and if you accumulate too many of them, it takes that as a signal that your site has quality issues and penalizes you (either in your SEO rankings or by deciding your site isn't worth crawling). Sending hard 404s is the correct way to signal to Google to forget a URL. Here's a link to Google's documentation on soft 404s.
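For comparison, this is trivial outside Bubble: any custom backend or proxy can check the database and set the status itself. A minimal sketch of what a hard 404 means, assuming a hypothetical Express server and a made-up `findProduct` lookup:

```typescript
import express from "express";

const app = express();

// Hypothetical DB lookup: returns null once the product
// has been sold and deleted.
async function findProduct(id: string): Promise<object | null> {
  return null; // placeholder
}

app.get("/product/:id", async (req, res) => {
  const product = await findProduct(req.params.id);
  if (!product) {
    // Hard 404: the status line itself says 404, so Google can
    // drop the URL without guessing from the page content.
    res.status(404).send("This product no longer exists.");
    return;
  }
  res.json(product); // live product: normal 200
});

app.listen(3000);
```

Bubble gives us no hook like this, which is exactly the problem.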

The only thing I can think of is blocking crawling for the page entirely with a Disallow rule in robots.txt, but that's not a real solution.
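For the record, that would look something like this (assuming the dynamic product pages live under /product/), and it blocks crawling of live products too, which is why it's a non-starter:

```
User-agent: *
Disallow: /product/
```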

Anyone have any ideas? Just pinging some names I've seen pop up a lot on SEO topics: @adamhholmes @stuart4 @josh24 @NigelG @ed727 @boston85719

I believe you can do that dynamically, so that the directive only goes into the page header when the product no longer exists, but I'm not sure. Something like the sketch below.
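What I have in mind, using the page's HTML header field in Bubble (the condition itself is pseudologic; I haven't verified the exact expression):

```html
<!-- Only output this when the page's thing is empty -->
<meta name="robots" content="noindex">
```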

I think I recall somebody discussing 404 redirects and 200 OK statuses before, and if my memory serves me right (not likely), the thread had a solution that involved rendering the content only after a check of some kind.

I think you can only specify noindex in the page header, which is different from blocking crawling. Since Google has to download the page to see its header, it's already crawling it at that point. noindex only tells Google not to include the page in SERPs.
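To illustrate: whether it's the meta tag or the equivalent X-Robots-Tag response header, what Google fetches is still a plain 200, so the crawl budget is already spent by the time the directive is read (illustrative response):

```
HTTP/1.1 200 OK
Content-Type: text/html
X-Robots-Tag: noindex
```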

You wouldn't happen to have any clues as to how I can find that thread? I just tried searching for it and didn't come up with anything.