Just to let you know that a Cloud2Cloud File Transfer plugin has been released, which may be useful since you are already using Google Cloud Storage as an external cloud storage provider.
Typical use cases are cloud-to-cloud bulk file transfers, or bulk file transfers from public sources to a cloud storage provider, including Google Cloud Storage.
Cloud2Cloud File Transfer Plugin page on Bubble, by wise:able.
Hey - Thanks for this plugin and documentation.
I have added the plugin and succeeded in reproducing all scenarios …
Unfortunately, I still need to list all previously uploaded files (by name only)…
Is it possible to use the same RG that uploads the files to list the previous ones?
Many thanks.
Thanks for the feedback.
I already tried the custom-state approach, but it does not load the previous files…
Below are some snapshots… maybe I am missing something?
Thanks
Below, I have set a state “google_list” of type text (list):
The “see all” button kicks off the workflow and outputs the result in the RG (type text).
@redvivi I am using the plugin and everything works better than expected. When I use the Create Bucket action for Google Cloud Storage, is there any way to set the access control to fine-grained (object-level ACLs enabled)? The plugin action that sets a file to public only works with that access control, but when I use the plugin to create the bucket, the access control is uniform, which does not allow setting files to public access. Using the presigned download URL will not work for my use case.
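For what it's worth, until the plugin supports it, a bucket with fine-grained (object-level) ACLs can be created or switched manually with `gsutil`; the bucket and object names below are placeholders, and this is a workaround outside the plugin, not a plugin feature:

```shell
# Create a new bucket with uniform bucket-level access disabled,
# i.e. fine-grained object-level ACLs enabled (-b off).
gsutil mb -b off gs://YOUR-BUCKET-NAME

# Or switch an existing bucket back to fine-grained ACLs.
gsutil uniformbucketlevelaccess set off gs://YOUR-BUCKET-NAME

# A single object can then be made public via its ACL.
gsutil acl ch -u AllUsers:R gs://YOUR-BUCKET-NAME/path/to/object
```

Note that GCS only allows switching from uniform back to fine-grained within 90 days of enabling uniform access.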
Hello, could the action “Save File to Google Cloud Storage (Backend)” return a response, so we can catch any error and avoid breaking the workflow?
For example, when the file is too heavy I get a workflow error, and nothing after it can run.
Thanks in advance for your answer.
I will have a look.
Meanwhile, you may simply check the file size before the action is triggered.
As the file is stored locally on your Bubble app, the file size may be retrieved from the headers of an API Connector call: GET < your file URL >.
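Outside Bubble, the same header-based check can be sketched as below. The 50 MB limit is a hypothetical example (adjust to whatever your action tolerates), and the function names are illustrative, not part of the plugin:

```python
# Sketch: read a file's size from the Content-Length response header
# before triggering the upload action, instead of downloading the body.
from urllib.request import Request, urlopen

MAX_BYTES = 50 * 1024 * 1024  # hypothetical limit; adjust to your workflow


def size_from_headers(headers) -> int:
    """Return the file size in bytes from response headers (0 if absent)."""
    return int(headers.get("Content-Length", 0))


def is_uploadable(url: str, max_bytes: int = MAX_BYTES) -> bool:
    # A HEAD request fetches only the headers, not the file itself.
    with urlopen(Request(url, method="HEAD")) as resp:
        return 0 < size_from_headers(resp.headers) <= max_bytes
```

In Bubble itself, the equivalent is an API Connector call whose returned `Content-Length` header feeds a conditional on the upload workflow step.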
Yes, thanks, I’m already using a script to check the file size before uploading. However, for other use cases it would be good to have this kind of safeguard built into the workflow.
@redvivi Once a file has been uploaded, is there any way to reset the dropzone’s “total upload progress” exposed state? Even if I reset the dropzone, it remains at 100%.
Wanted to inform you of a major update featuring instant upload starts, plus resumable, parallel, multi-part uploads with support for files up to 5 TB!
Don’t forget to re-run the Google Cloud deployment template so your bucket settings are updated. Alternatively, refer to the documentation for manual setup.
Also, this plugin now supports video and audio compression.
I felt compelled to share my experience—I’ve been a long-time user of this plugin, and it’s fantastic! That said, I recently ran into a snag after the latest update. I hadn’t realized it required changes to my workflows, which caught me off guard. Lesson learned: I need to read update notes more carefully before upgrading.
I ran into some real issues but dropped a note to @redvivi , and the response was prompt, helpful, and thorough. The support was outstanding, and the plugin is now running better than ever—super performant. Overall, bloody marvellous!