I have a web application that has been live for about 48 hours. It allows users to, among other things, select images from an assorted bunch as part of their order. I woke up this morning to find that, all of a sudden, the hundreds of images I was sure were saving to the database no longer appear. Orders made an hour ago don't seem to store the image, while the ones I am making right now save their images with no issues. My last sync to live was about 5 hours ago, but images are missing from orders both before and after that sync, right up until a couple of minutes ago. I have double-checked my paths and they all seem fine. Super strange. I would appreciate any insight into what could be going on.
Is this to do with your live database and development database being separate? Did you upload the images into the development database?
I am accessing the images (and other pieces of data) from Airtable via an API. Other pieces of data from the same base are displaying fine, but this most critical one is not. The same base has two other tables that are pulling through fine.
Turns out that public URLs from Airtable expire after two days, for security reasons.
Anyone know a workaround for this? Or even a quick way of importing all the images from Airtable into Bubble in a smooth and pain-free way.
Are you saving the Airtable URL to a thing in your Bubble DB already? If so, just use the save to S3 operator to store it in your Bubble DB as well.
If you are pulling the URLs from the API every time, the expiring URL shouldn't be an issue.
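To illustrate why re-pulling from the API sidesteps the expiry: Airtable regenerates attachment URLs on every API call, so each fresh "list records" response carries currently valid links. A minimal sketch of extracting them from a response-shaped payload (the field name "Image" and the sample data are assumptions, not from the thread):

```python
# Sketch: pull the current (short-lived) attachment URLs out of a payload
# shaped like an Airtable "list records" API response. The "Image" field
# name and sample values are illustrative assumptions.
sample_response = {
    "records": [
        {
            "id": "recAAA111",
            "fields": {
                "Order": "1001",
                "Image": [
                    {
                        "id": "attX1",
                        "url": "https://v5.airtableusercontent.com/abc",
                        "filename": "a.png",
                    }
                ],
            },
        },
        {
            "id": "recBBB222",
            "fields": {"Order": "1002"},  # record with no image attached
        },
    ]
}

def attachment_urls(response, field="Image"):
    """Map record id -> list of current attachment URLs for `field`."""
    urls = {}
    for record in response.get("records", []):
        attachments = record.get("fields", {}).get(field, [])
        urls[record["id"]] = [a["url"] for a in attachments]
    return urls

print(attachment_urls(sample_response))
# {'recAAA111': ['https://v5.airtableusercontent.com/abc'], 'recBBB222': []}
```

If you instead copy one of these `url` values into your own DB and serve it later, it will have gone stale by the time a user clicks it, which matches the symptoms above.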
Thanks DjackLowCode. Do you know how this works for data that has already been pulled through into the Bubble DB?
There are hundreds of images hosted by Airtable that users are accessing, with each user only needing to access 1 or 2 images. So it is more likely than not that a given image will not be selected again within the next two hours (the URL's expiry window).
If there's some reference to the parent Airtable data in the Bubble DB thing as well (not the expired URL), you should be able to run a workflow on all these items and apply the saved to S3 operator to each one.
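The workflow described above boils down to three steps per record: get a fresh Airtable URL via the API, download the image while that URL is still valid, and re-upload the bytes to your own permanent storage. A minimal sketch with the transfer steps injected as callables (`fetch_fresh_url`, `download`, and `upload_to_store` are hypothetical stand-ins for the real Airtable-API / HTTP / S3 calls, not actual Bubble or Airtable functions):

```python
def migrate_images(record_ids, fetch_fresh_url, download, upload_to_store):
    """For each record id, grab a fresh (short-lived) Airtable URL,
    download the bytes, and re-upload them to durable storage.
    Returns a mapping of record id -> permanent URL.
    The three callables are hypothetical stand-ins for real API calls."""
    permanent = {}
    for rid in record_ids:
        fresh = fetch_fresh_url(rid)   # short-lived Airtable URL, or None
        if fresh is None:
            continue                   # record has no attachment; skip it
        data = download(fresh)         # fetch bytes before the URL expires
        permanent[rid] = upload_to_store(rid, data)  # durable URL
    return permanent

# Usage with stub callables, just to show the flow end to end:
fresh_urls = {"rec1": "https://example.invalid/tmp1", "rec2": None}
result = migrate_images(
    ["rec1", "rec2"],
    fetch_fresh_url=fresh_urls.get,
    download=lambda url: b"bytes-from-" + url.encode(),
    upload_to_store=lambda rid, data: f"https://my-store.example/{rid}.png",
)
print(result)  # {'rec1': 'https://my-store.example/rec1.png'}
```

Driving this in batches (rather than one record per page load) matters here, since every image needs to be copied before its current URL lapses.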