Best way to export CSV files

Hi there!
I'm looking for the best way for my users to download a CSV file. The file can range from 2K to 100K lines, so when I try to generate it on the front end, the app freezes. What would you suggest to handle this better? Thank you very much!

Hey @Alter345, how are you doing?

It sounds like you are downloading all those lines to the frontend. That would explain why it freezes.

Try doing it in the backend using the 1T - CSV Creator plugin, then send the user an email with a link to download the file.


@ademiguel Yes, thank you, but I guess it will consume a lot of WU, right? Is there another solution to reduce this? Thanks!

You can run a test with 10–100 rows, see how much it consumes, and then extrapolate.

I doubt there will be much difference WU-wise between doing it frontend or backend, simply because downloading all those rows to the frontend consumes WUs as well.

But again, you can test with a smaller number of rows and compare the consumption in both cases.
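As a tiny illustration of that test-and-extrapolate approach (all the numbers here are made up; plug in whatever your own test export actually consumes):

```python
# Linear extrapolation from a small test export, as suggested above.
# The measured figures are hypothetical examples, not real Bubble data.
def extrapolate(measured_wu: float, test_rows: int, target_rows: int) -> float:
    """Scale the WU measured on a small export up to the full export size."""
    return measured_wu * target_rows / test_rows

# Suppose a 100-row test export consumed 2.0 WU:
print(extrapolate(2.0, 100, 100_000))  # → 2000.0
```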

Yes, thank you!

In addition to what @ademiguel mentioned, you will run into limitations even in the backend due to the data size. I would suggest checking out this video, which shows how to work around the timeout limitation:

Hope this helps!


Thank you for your help!

A quick-and-dirty calculation of how many WUs you are going to spend:

Frontend execution

WU usage sources

Performing a database search - 0.3
Each thing returned from the database - 0.015
Each character of data returned from the database - 0.000003

Assuming you download all records in one search, you have:
0.3 + 0.015*100k + 0.000003*100k*charsPerRow = 1500.3 + 0.3*charsPerRow
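The frontend estimate can be turned into a small Python helper. The per-operation costs are the ones listed above; the 200-characters-per-row figure in the example is just an assumption for illustration:

```python
# Per-operation WU costs, as listed above.
SEARCH = 0.3          # performing a database search
PER_THING = 0.015     # each thing returned from the database
PER_CHAR = 0.000003   # each character of data returned

def frontend_wu(rows: int, chars_per_row: int) -> float:
    """WU for downloading `rows` records in a single frontend search."""
    return SEARCH + PER_THING * rows + PER_CHAR * rows * chars_per_row

# Example: 100k rows of ~200 characters each (~1560 WU).
print(frontend_wu(100_000, 200))
```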

Backend execution

WU usage sources

Adding a new item to the API workflow scheduler - 0.1
Running a server-side workflow action - 0.6
Each call to a server-side plugin action - 0.2
Performing a database search - 0.3
Each thing returned from the database - 0.015
Each character of data returned from the database - 0.000003

Assuming you get all rows in a single search and process them in a single workflow iteration:
0.3 + 0.015*100k + 0.000003*100k*charsPerRow + 0.1 + 0.6 + 0.2 = 1501.2 + 0.3*charsPerRow

You can see the difference is negligible. Retrieving all those rows from the DB is what dominates the cost.

EDIT: if you have to use recursive workflows, then WU increases considerably in the backend, since you have to add up to (0.1 + 0.6*3 + 0.2) * 100k = 210k WU to the calculation (assuming three workflow actions per iteration) if you retrieve only 1 row per iteration.

You can potentially reduce that by a factor of X by retrieving X rows per iteration.
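A rough sketch of that recursive-workflow overhead (assuming, as in the calculation above, 0.1 WU to schedule each iteration, three workflow actions at 0.6 WU each, and one 0.2 WU plugin call per iteration; adjust these to your actual workflow):

```python
import math

# Per-iteration overhead of a recursive backend workflow:
# schedule (0.1) + three workflow actions (0.6 each) + plugin call (0.2).
PER_ITERATION = 0.1 + 0.6 * 3 + 0.2  # = 2.1 WU

def recursive_overhead(total_rows: int, rows_per_iteration: int) -> float:
    """Extra WU from scheduling one workflow iteration per batch of rows."""
    iterations = math.ceil(total_rows / rows_per_iteration)
    return PER_ITERATION * iterations

print(recursive_overhead(100_000, 1))    # ~210k WU at one row per iteration
print(recursive_overhead(100_000, 100))  # ~2.1k WU at 100 rows per iteration
```

This makes the tradeoff concrete: batching more rows per iteration divides the scheduling overhead by the batch size, which is why retrieving X rows per iteration cuts the cost by a factor of X.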

It's clear, thank you for your time and the explanations!