@NGBK @jeffrey.j.obrien Would one of you be able to try this with the action that doesn’t create a private file to see if you have the same issue? I’ve made no changes to the plugin recently, so this may be an issue with Bubble’s fileupload endpoint.
Want to drop this here for anyone running into the issue of timeouts on the backend action. A simple walkthrough of how to create your JSON in batches using a recursive workflow.
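For anyone who wants to see the batching idea outside of Bubble’s editor, here is a minimal sketch of the same concept: build the JSON a chunk at a time and have each pass hand off to the next, so no single run is long enough to time out. This is only an illustration (the function name and batch size are my own, not anything from the plugin); in Bubble the “recursion” is the backend workflow scheduling itself again.

```typescript
// Minimal sketch of building a large JSON array in batches.
// In Bubble, each recursive call corresponds to the backend workflow
// scheduling itself again with the partial JSON built so far.
type Row = Record<string, string | number>;

const BATCH_SIZE = 500; // assumed batch size; tune it to stay well under the workflow timeout

function buildJsonInBatches(rows: Row[], startIndex = 0, acc: string[] = []): string {
  // Serialize just one batch of rows on this pass.
  const batch = rows.slice(startIndex, startIndex + BATCH_SIZE);
  acc.push(...batch.map((row) => JSON.stringify(row)));

  const nextIndex = startIndex + BATCH_SIZE;
  if (nextIndex < rows.length) {
    // More rows left: "schedule" the next pass with the accumulated JSON.
    return buildJsonInBatches(rows, nextIndex, acc);
  }

  // Final pass: wrap everything into one valid JSON array.
  return `[${acc.join(",")}]`;
}
```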
What is the largest table (number of rows/records) that can be uploaded?
(I need to upload 9,000 at a time…)
@eli @jeffrey.j.obrien, I’ve tried with the action “CSV Creator - New file from base64”. It works just fine for me.
@eli @NGBK - I checked today and it’s working OK. I didn’t change anything.
No parsing issues. Not sure what caused it.
Thanks for the help.
Hello,
I am trying to use the privacy feature of “CSV Creator - New Private File from Base64”.
I’m not sure what to put in the attach_to field (parameter). So far I have tried the user ID and the company ID (data from my application’s data model), but the file cannot be downloaded (permission problem). Would it be possible to have more information on what to put in the attach_to field?
Also, I tried to go to the documentation (https://csvcreatortest.bubbleapps.io/) but it asks me to log in. Is this normal?
Thank you in advance for your feedback.
Hey @thinus, are you uploading or creating the CSV? See this post: 1T - CSV Uploader import limit? - #4 by eli, where @eli talks about upload limits.
Hello, did you find a solution for this?
Can you do a calculation in the “format as text” box?
Do you have documentation on the next steps of your video about the CSV Creator timeout? As I understand it, I can create a CSV from the FINAL JSON by using the CREATE CSV FROM JSON (SSA) action. Can I just add RESULTS OF STEP 1, then “NEW FILE FROM BASE64”, and then save the file created by “NEW FILE FROM BASE64”?
That should work. Once the JSON is assembled, the rest of the steps work as normal.
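If it helps to picture what those last steps are doing, here is a rough sketch of the idea (my own illustration, not the plugin’s internals): the assembled JSON becomes CSV text, which is then base64-encoded so it can be handed to the “New file from base64” step and saved.

```typescript
// Rough illustration of the final steps: JSON -> CSV text -> base64 string.
// The real work is done by the plugin actions; this only shows the shape of the data.
function jsonToCsv(json: string): string {
  const rows: Record<string, unknown>[] = JSON.parse(json);
  const headers = Object.keys(rows[0] ?? {});
  const escape = (v: unknown) => `"${String(v ?? "").replace(/"/g, '""')}"`;
  const lines = rows.map((row) => headers.map((h) => escape(row[h])).join(","));
  return [headers.map(escape).join(","), ...lines].join("\n");
}

const finalJson = '[{"name":"Ana","total":12},{"name":"Bo","total":7}]';
// Node's Buffer is used here just to show the base64 step.
const csvBase64 = Buffer.from(jsonToCsv(finalJson), "utf8").toString("base64");
// csvBase64 is the kind of value the base64 file-creation step expects.
```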
Dear @eli,
Thanks for this excellent plugin. I have used it on a couple of client apps, but now I need to export more than 50,000 rows and the workflow times out.
I followed your video about it and have already created the download file on the server side, but I am having issues uploading the CSV file on the server.
I would appreciate it if you could help me.
Hi @thinus1
No, I have not completely solved it yet.
It works when I have a few rows. In the last step I replace the " with a space, so it saves the file.
But when I have a lot of rows it does not work.
Can you confirm that the file is in your file manager?
I resolved my issue. It seems that the “Create CSV from JSON (SSA)” action did not receive data to create the CSV. Not sure why “Results of step” did not work. I used a search to make sure the “Create CSV from JSON (SSA)” action got the correct info. In the “Make a change” workflow step, I did the search again with the “find & replace” added.
Hi @thinus1,
When the process takes longer, the files are not in my file manager; it looks like the time expires before the file is finished.
Could you please send screenshots of your solution?
Hello @eli,
I have been working with your plugin for more than 6 months and now I have an error. I have not modified anything, no changes at all.
Do you know what it could be?
action create CSV from JSON
The plugin 1T - CSV Creator / action Create CSV from JSON threw the following error: TypeError: Cannot read properties of null (reading ‘replace’)
at eval (PLUGIN_1582601241392x653181519983018000/1T—CSV-Creator-action–Create-CSV-from-JSON-.js:7:51)
at https://app.soltechpty.com/package/run_debug_js/9b723912685f24b7190c3b47bf2eaf4ebefd09387c516f00472cd93b7e1af5b7/xfalse/x21/run_debug.js:6:2604326 (please report this to the plugin author)
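For what it’s worth, that exact message is standard JavaScript behaviour when .replace is called on a null value instead of a string. I can’t say where it happens inside the plugin, but a plausible (unconfirmed) trigger is a null or empty field reaching the action where text is expected:

```typescript
// Minimal reproduction of the error message, outside the plugin.
const cell: string | null = null; // e.g. an empty field coming out of the data set

// cell.replace(/"/g, '""');      // -> TypeError: Cannot read properties of null (reading 'replace')
const safeCell = (cell ?? "").replace(/"/g, '""'); // a null guard or default value avoids the crash
```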
Hi Ton,
I think you should ensure the quantity of elements in each column (OS, Projeto) is the same.
Currently you have 4 elements in the OS column (0, 1, 1, 1) and 3 elements in the Projeto column (1/23, 2/23, 3/23).
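To make that concrete, here is a tiny sketch of the mismatch (my own illustration; it assumes rows are built by pairing the nth value of each column, which I have not confirmed against the plugin):

```typescript
// Illustration of a 4-element column next to a 3-element column.
const os = ["0", "1", "1", "1"];          // 4 entries
const projeto = ["1/23", "2/23", "3/23"]; // 3 entries, so the 4th row has no partner

const rows = os.map((value, i) => [value, projeto[i] ?? ""].join(","));
// rows[3] === "1,"  -- the missing Projeto value leaves an empty cell in the last row
```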
Hi @edwinbolanosb, generally when it works with a few rows but not with a lot of rows it could mean:
- You have some data that is breaking the JSON in the larger data set. Make sure any field that could contain a double quote, line break, tab, or backslash is run through the “format as json safe” operator (see the escaping sketch after this list). I would save the final JSON to a record in your database and copy it into a JSON checking tool like jsonlint.com to make sure your JSON is valid.
- The backend workflow could be timing out if it’s taking too long to build the JSON. There’s a 4-minute timeout on backend workflows, I believe.
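If you want to check the escaping by hand, here is a rough sketch (my own illustration, not the plugin’s code) of the kind of characters the “format as json safe” step has to handle; JSON.parse also works as a quick local validity check alongside jsonlint.com:

```typescript
// Rough illustration of escaping a raw field value so it can sit safely inside a JSON string.
function jsonSafe(value: string): string {
  return value
    .replace(/\\/g, "\\\\") // backslashes first, so the escapes added below aren't doubled
    .replace(/"/g, '\\"')   // double quotes
    .replace(/\n/g, "\\n")  // line breaks
    .replace(/\r/g, "\\r")
    .replace(/\t/g, "\\t"); // tabs
}

const risky = 'He said "ok"\nnext line\ttabbed \\ slash';
const json = `{"note":"${jsonSafe(risky)}"}`;
JSON.parse(json); // parses cleanly; with the raw value it would throw a SyntaxError
```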