Wondering if anyone has had any luck with bulk image processing or migration of Bubble's built-in S3 storage files to reduce storage size.
When we started, we simply used the built-in storage without thinking about compression, and it's quickly built up to 100k items with quite large images spanning hundreds of GBs. Now we need to compress them en masse. Has anyone tackled this already? Any advice on the best approach? I'm hoping to stay with the built-in storage, but currently it's very expensive for us, so ideally we'd compress or migrate in place.
In our use case, people applying for a hardship grant need to upload ID documents plus evidence of any state benefits they receive to verify themselves, and these files are attached to their ID record. So we can pull a list of ID records and process each one, or run an action on a list. Some of the available tools can't be used in server-side actions, which is limiting, so we'd likely pass a list to an API workflow that processes each record with its attached file.
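For the compression step itself, a server-side routine that downscales and re-encodes each image might look something like the sketch below. This is a minimal example, assuming the Pillow library; the 1600px longest-side limit and JPEG quality of 75 are illustrative values, not recommendations from this thread (ID documents may need higher settings to stay legible).

```python
from io import BytesIO

from PIL import Image  # third-party: pip install Pillow


def compress_image(data: bytes, max_dim: int = 1600, quality: int = 75) -> bytes:
    """Downscale an image so its longest side is at most max_dim pixels,
    then re-encode it as JPEG at the given quality."""
    img = Image.open(BytesIO(data))
    # JPEG has no alpha channel, so convert anything else to RGB first.
    if img.mode not in ("RGB", "L"):
        img = img.convert("RGB")
    img.thumbnail((max_dim, max_dim))  # shrinks in place, preserves aspect ratio
    out = BytesIO()
    img.save(out, format="JPEG", quality=quality, optimize=True)
    return out.getvalue()
```

A bulk job would then fetch each record's file, run it through a function like this, and re-upload the result, keeping in mind that the new file gets a new URL, so the record's file field has to be updated afterwards.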
For this kind of project I usually use external storage such as Dropbox or Google Drive, which is cheaper; I think that's a better solution than compressing the data.
There are advantages to using Bubble storage, however, such as auto-optimized images via Bubble’s Imgix integration.
Plus, with Upload Buddy, images can be compressed before uploading, which not only conserves Bubble storage, but also results in a better user experience - i.e. less bandwidth and less time to upload (which is especially important on mobile).
That said, the issue in this thread is how to compress already-uploaded images. It'd be nice if they could be downsampled in place, preserving the existing URLs so that none of the references to the images would have to be updated, but I don't think that's possible.
If each image is used in just a single place within the app, then perhaps retrieving the relevant record and [somehow] scaling the image server-side might be possible. I haven’t actually explored that option though.
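To make that option concrete: one way to do the per-record round trip is via Bubble's Data API, assuming it's enabled for the relevant data type. The sketch below builds the JSON body for Bubble's `POST /api/1.1/fileupload` endpoint (which takes base64-encoded contents), with the actual loop shown only in outline; the `ID` type name and `id_document` field are placeholders for your own schema, not names from this thread.

```python
import base64
# requests would be used for the actual HTTP calls: pip install requests


def build_fileupload_payload(filename: str, contents: bytes, private: bool = True) -> dict:
    """JSON body for Bubble's POST /api/1.1/fileupload endpoint,
    which expects file contents base64-encoded."""
    return {
        "filename": filename,
        "private": private,
        "contents": base64.b64encode(contents).decode("ascii"),
    }


# Hedged outline of the per-record loop (untested; type and field names
# are hypothetical placeholders):
#
#   base = "https://yourapp.bubbleapps.io/api/1.1"
#   headers = {"Authorization": f"Bearer {API_KEY}"}
#   things = requests.get(f"{base}/obj/ID", headers=headers).json()["response"]["results"]
#   for thing in things:
#       original = requests.get("https:" + thing["id_document"]).content
#       smaller = compress(original)  # your image-compression routine
#       new_url = requests.post(f"{base}/fileupload", headers=headers,
#                               json=build_fileupload_payload("id.jpg", smaller)).text
#       requests.patch(f"{base}/obj/ID/{thing['_id']}", headers=headers,
#                      json={"id_document": new_url})
```

Since the upload returns a new URL, the final `PATCH` updating the record's file field is what keeps the app pointing at the compressed copy; the old file would still need to be deleted separately to actually reclaim storage.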