Hello,
Wondering if anyone using the File Uploader plugin has come up with a way to prevent duplicate entries from propagating to the database?
Imagine I have a list of three names in a .csv file I wish to upload. Let’s say the names are Tom, Dick, and Harry. Now imagine that I add a fourth name to the .csv file which again is “Harry”. So now the .csv file contains Tom, Dick, Harry, and another duplicate instance of “Harry”. I need to prevent this second instance of the name “Harry” from being added to the database on upload.
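In code terms, what I’m after is something like this pre-upload dedup, which keeps only the first occurrence of each name (a rough Python sketch just to state the intent, since Bubble doesn’t expose this step; the `name` column is my example):

```python
import csv
import io

def dedupe_rows(csv_text, key="name"):
    """Keep only the first row for each value of `key` (case-insensitive)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    seen = set()
    unique = []
    for row in reader:
        k = row[key].strip().lower()
        if k not in seen:
            seen.add(k)
            unique.append(row)
    return unique

raw = "name\nTom\nDick\nHarry\nHarry\n"
print([r["name"] for r in dedupe_rows(raw)])  # ['Tom', 'Dick', 'Harry']
```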
You could make a recursive “cleanup” workflow after it uploads. It would loop through and check if the name already exists, then delete its Thing.
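The recursive loop described above would behave roughly like this (a Python sketch, not actual Bubble; the `db` list is just a stand-in for the Person Things, and each recursive call stands in for “Schedule this workflow again”):

```python
def find_duplicate(db):
    """Return the index of the first Thing whose name appeared earlier, or None."""
    seen = set()
    for i, thing in enumerate(db):
        key = thing["name"].strip().lower()
        if key in seen:
            return i
        seen.add(key)
    return None

def recursive_cleanup(db):
    """Mimic a recursive backend workflow: delete one duplicate, then re-run."""
    i = find_duplicate(db)
    if i is None:                      # base case: no duplicates left, stop scheduling
        return db
    del db[i]                          # the 'Delete Thing' step
    return recursive_cleanup(db)       # the 'Schedule this workflow again' step

db = [{"name": "Tom"}, {"name": "Dick"}, {"name": "Harry"}, {"name": "Harry"}]
print([t["name"] for t in recursive_cleanup(db)])  # ['Tom', 'Dick', 'Harry']
```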
I’ll give it a try. Thanks!
Hello Tyler,
Still attempting to create a recursive workflow that looks at the db and removes duplicate instances of a name. I figured out how to prevent duplicate names from being entered manually (see link to page below). However, as users also have the option to upload names in bulk using the File Uploader plugin, I can’t figure out how to set up the recursive workflow. Any chance you can give me a hand with this?
Link to my test page: Application_MAIN | Bubble Editor
If they upload something and it contains people’s names that are already in the database, would you want it to just ignore them and not write any new data?
And I see in the checkduplicates workflow it’s searching for Count > 1, is that related to this cleanup process? How are you setting that count?
So the answer to your first question is yes. As for count, that’s where I’m stuck.
Ideally, preventing duplicate names included in the upload file from ever being written to the db is best.
I’ll take a look at the editor link in a little bit. Unfortunately, when you use the “Upload data as CSV” action, Bubble doesn’t give you any way to validate the data first; it just uploads everything. But I think it should be easy to make a cleanup workflow that runs after the upload.
I kind of made a mock-up of what should work.
Check the front end how it schedules the backend, and check the datatypes tab to see the yes/no field I added to the “Person” datatype.
Also, my app is on the free plan, so the workflow won’t actually run in my app.
If the recursive process is too slow we can optimize further to speed things up but it will use more app capacity.
Thank you, Tyler. I’m studying your suggestion now. I’ll let you know if I can get it working.
Thank you so much!
Oooooh! Been looking this over for nearly 2 hours. Confusing. Is there any way we can connect via Zoom or something?
Sorry I’m not available for any calls for a while, but any question in particular after looking at the editor link?
Ok. Since I have a paid plan, let me replicate your example in my editor and study it some more. Thanks for all your help thus far. I really appreciate it!!
Yea, take a look. Like I said, I didn’t run it myself because I’m on the free plan, but in theory it should work. It might be a little slow, but it can be sped up/optimized.
Ok. Now I think I’ve got it. What was confusing me was the other backend workflows… like Create and Delete in your example. In fact, there are only two WFs that constitute the recursive process… “csv-upload” and “csv - process”. Everything is working perfectly! Thank you so much!!
Great suggestions here. I am working with 1T CSV Uploader, which splits the bigger files into smaller ones and uploads them one by one. There is logic behind it that generates a number of files, but there is no existing-record check.
I am struggling with a similar issue, in the sense that I give my customers the option to connect their environments via API or upload CSVs daily. For the CSVs, we are talking about files with 10Ks of records that need to be uploaded daily, but only new records should be added to the db, while existing ones should be updated with a new status, if any.
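To put my requirement in code terms (just a Python sketch of the intent, not anything Bubble-specific; `name` and `status` are my example fields):

```python
def upsert(db, incoming, key="name"):
    """Add rows whose key is new; update the status of rows that already exist."""
    index = {row[key]: row for row in db}
    for row in incoming:
        existing = index.get(row[key])
        if existing is None:
            db.append(dict(row))                    # brand-new record: insert it
            index[row[key]] = db[-1]
        elif row.get("status") and row["status"] != existing.get("status"):
            existing["status"] = row["status"]      # known record: update status only
    return db

db = [{"name": "Tom", "status": "active"}]
incoming = [{"name": "Tom", "status": "inactive"},
            {"name": "Dick", "status": "active"}]
print(upsert(db, incoming))
```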
I think I got what you are trying to do here but it needs to make sense for me in words first before I can put it into WFs so I have a question:
- Is the backend wf set searching, updating, and deleting records in the uploaded list before pushing it to the database, or does it push to the db first and then search, update, and delete records there?
Also, I fail to understand what this does
I guess, as basic as this sounds, I could use a plain-word explanation of the workflows so I can try to replicate them for my case. I would really appreciate it if somebody could help with this.
Thanks a lot