Creative solution required - CSV Upload + Overwrite data

In need of some help with a challenging use case.

My customers are vehicle lease brokers - they contact other businesses and provide them with business vehicles on a lease basis.

The lease “Offers” are provided to them by various “Finance Providers”.

These “Offers” are provided in a CSV format (causing me much pain and frustration).

An “Offer” is effectively a combination of vehicle + lease terms + price. For example:

Offer A: 2024 Audi Q6 Sportback Black Edition 65kWh 4WD - 1 month + 36 months - £650/month (massively simplified).

Each vehicle has its own “CAP ID” - it’s incredibly granular and it basically covers every possible combination of vehicle you can imagine. Every vehicle (at least in the UK) has a CAP ID, and you can tell EXACTLY what make, model, and variant a vehicle is by looking up the CAP ID.

Here is the challenge:

It is likely that multiple “Finance Providers” will provide “Offers” containing vehicles with the same “CAP ID” but differing pricing/leasing options etc.

A customer uploads a file containing an offer which already exists in the database (meaning it comes from the same Finance Provider and has the same CAP ID). The upload must overwrite the existing offer, but only when both the CAP ID and Finance Provider match.

I’m using the 1T CSV Uploader plugin and running it through a backend workflow as suggested, but I’m struggling to progress from there - whichever way I try it, I end up with either another record or everything being deleted.

Any ideas?

Hey @jamie.robson.89 , could you outline the backend workflow process you’ve got through so far?

On the surface, you could send the CSV data to a backend loop, check each item against your records (do a search for a CAP ID + Finance Provider match), and then create or replace as needed.
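Expressed as plain JavaScript, the create-or-replace logic above would look something like this - a hedged sketch only, where `db`, `capId`, and `financeProvider` are illustrative stand-ins for your Bubble “Search for Offers”, create, and make-changes steps, not a real API:

```javascript
// Upsert each incoming offer: overwrite when CAP ID + Finance Provider match,
// otherwise create a new record. "db" is a plain array standing in for the DB.
function upsertOffers(db, incomingOffers) {
  for (const offer of incomingOffers) {
    // Match on the composite key: CAP ID + Finance Provider
    const existing = db.find(
      (o) => o.capId === offer.capId && o.financeProvider === offer.financeProvider
    );
    if (existing) {
      // Overwrite the existing record's fields instead of creating a duplicate
      Object.assign(existing, offer);
    } else {
      db.push({ ...offer });
    }
  }
  return db;
}
```

The key point is that the match runs per row before any create step, which is exactly what breaks if the create and delete steps fire asynchronously.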

Hey @DjackLowCode - thanks for the reply.

Here’s the process:

Step 1: User uploads file (using 1T CSV Uploader) and maps fields.
Step 2: (screenshot)

Step 3: (screenshot)

Step 4: (screenshot)

Couple of notes:

The Drivebase customer is required in order to ensure that the data being loaded is only visible to the customer (company) loading it.

The data is being otherwise loaded correctly using this process, it’s purely the deletion that I can’t seem to figure out.

Appreciate you taking a look.

Do you have any experience with coding? I find that when dealing with situations like these, sticking to vanilla Bubble overcomplicates things. Making a server-side action which processes the CSV and returns JSON or the variables directly is probably your best bet.
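To illustrate what that server-side action could look like, here’s a minimal sketch: it takes raw CSV text and returns a list of objects. This is a naive parser (no quoted-field handling) for illustration only - a real action would use a library like csv-parse, and the field names are whatever your CSV headers happen to be:

```javascript
// Convert raw CSV text into an array of JSON objects, one per row.
// Naive split-based parsing: assumes no commas inside field values.
function csvToJson(csvText) {
  const [headerLine, ...rows] = csvText.trim().split("\n");
  const headers = headerLine.split(",").map((h) => h.trim());
  return rows.map((row) => {
    const values = row.split(",").map((v) => v.trim());
    // Pair each header with its value for this row
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}
```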


A bit, but this is probably outside of my expertise, and I ideally need a quick solution due to a customer requirement.

Hey @jamie.robson.89 - so the problem with your current workflow is that in step 2, scheduled workflows run ‘asyncronously’, meaning that they will be triggered, and then progress on their own outside of the workflow. So, your telling bubble to create the things from the CSV, but simultaneosly telling it to delete all the offers (the Search for offers:filtered won’t be able to reference changes made by the bulk upload)

For things like this, and as @jonah.deleseleuc has pointed out, I’d recommend using the 1T CSV Uploader to fire off the JSON from the CSVs to an endpoint in your app. Then, depending on your use case, either run a series of workflows based on this list (‘option 1’ - if no duplication/error handling is required; again, async), or run a loop (‘option 2’ - if duplication or other checks need to be made on the data synchronously). This can all be done efficiently with vanilla Bubble + the 1T CSV Uploader (depending on the scale).
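The difference between the two options can be sketched in plain JavaScript (names here are made up - in Bubble these would be “Schedule API Workflow on a list” vs a recursive/looping backend workflow):

```javascript
// Option 1: fire one independent workflow per row.
// Each runs on its own, with no ordering guarantees (async).
function option1(rows, scheduleWorkflow) {
  rows.forEach((row) => scheduleWorkflow(row));
}

// Option 2: process rows one at a time, so each step can see the
// previous one's changes - which is what a duplicate check against
// CAP ID + Finance Provider needs.
async function option2(rows, processRow) {
  for (const row of rows) {
    await processRow(row); // wait before moving to the next row
  }
}
```

For an overwrite-on-match flow, option 2 is the safer shape, because option 1 can race two rows with the same key against each other.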

Good luck!


Thanks @DjackLowCode - I’ll give this a whirl and see how far I get!

Appreciate the help.

Merry Christmas.

I’ve given this a shot but not making any progress. Figured I’d see how far I could get but turns out I don’t know where to start.

Any additional help you could give would be really appreciated!

@jamie.robson.89 - how many fields (columns) of the CSV are required for your updates?

Thanks for the reply - there’s 28 columns.

Hey @jamie.robson.89 , with that number of columns - here’s a high level example using scheduled workflows on a list for an update flow:

1 - the 1T CSV Uploader will give you a list of JSON from your data, something like this:

[
  {
    "id": 1,
    "first_name": "John",
    "last_name": "Doe",
    "email": "john.doe@example.com",
    "ip_address": "192.168.1.1"
  },
  {
    "id": 2,
    "first_name": "Jane",
    "last_name": "Smith",
    "email": "jane.smith@example.com",
    "ip_address": "192.168.1.2"
  }
]

2 - Create an endpoint in your app, and get it ready for initialization:

3 - Initialize this endpoint with Dummy data:

4 - Set up your workflow that will actually use this data (Backend Workflow)

5 - on the original endpoint, fire the info that’s received to this Backend workflow (scheduled on list) - the request data is a list of JSON objects (from the 1TCSV)

6 - on each of these workflows - do what you will with the information

7 - on the front end, once the 1TCSV information has been processed, send this to your app endpoint.

8 - test and validate that everything is working 🙂

That’s an example using scheduled workflows on a list. My preference is recursive workflows, but they’re a bit more complex.
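For what it’s worth, the recursive pattern can be sketched in plain JavaScript like this - each “workflow run” processes one row, then schedules itself with the remaining rows until the list is empty. In Bubble, this would be a backend workflow that takes a list parameter and re-schedules itself; everything here is illustrative:

```javascript
// Recursive processing: handle the head of the list, then "schedule"
// the same workflow again with the rest, stopping on an empty list.
function processRecursively(rows, processRow) {
  if (rows.length === 0) return; // nothing left - stop recursing
  const [head, ...rest] = rows;
  processRow(head);                     // handle the current row
  processRecursively(rest, processRow); // next iteration with the remainder
}
```

The advantage over schedule-on-a-list is that each iteration only starts once the previous one has finished, so duplicate checks always see up-to-date data.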

Good luck with that!

What’s the scale of your CSV uploads? 100s, 1000s, 10,000s?

@DjackLowCode Happy new year - thanks so much for this, and sorry for the late reply.

I haven’t fully tested yet, but I’ve been able to at least start passing the information through, and I think I can work with it from there.

One thing I’m stuck on though - I need to pass through the “customer” that’s doing the upload, so that I can constrain the search in the backend workflow.

For some context, it is possible (and likely) that one of my customers (a company), will be uploading a file that contains the same records as another (they use generic “ratebooks” which contain the same offers a lot of the time).

I tried using headers but it threw everything out, so figured I’d just ask before I mess it all up and have to start over.

Any ideas?

Hey @jamie.robson.89 - happy new year!

Yeah - so with the 1T CSV Uploader at least, you can add a header to each ‘row’ (item in the JSON), which would look something like this:

[
  {
    "customer": "12093219083x12094129084",
    "id": 1,
    "first_name": "John",
    "last_name": "Doe",
    "email": "john.doe@example.com",
    "ip_address": "192.168.1.1"
  }
]

You could also add some custom parameters to the upload so that you don’t need to add the “customer” ID to the 100s or 1000s of rows you’re uploading (not that it really matters at that scale).

I’ve done this before to match the customer, company, etc., or any other data that needs to be consistent but isn’t explicitly added with the CSV - it’s a great plugin.
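If it helps, tagging every row with the uploading customer before sending the list to the endpoint is a one-liner in plain JavaScript. This is a sketch only - `customerId` would come from the Current User in your Bubble app, and the `customer` field name is illustrative:

```javascript
// Prepend a "customer" field to every JSON row so the backend workflow
// can constrain its searches to the uploading company's records.
function tagRowsWithCustomer(rows, customerId) {
  return rows.map((row) => ({ customer: customerId, ...row }));
}
```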

Thanks @DjackLowCode

I’m having a really bizarre issue where some of the values are getting stripped out.

Creating a debug step after this shows that the values are retained: (screenshot)

We then get to the first backend workflow, and a few of the fields have their values stripped out: (screenshot)

What’s even weirder is that the formatting is the same for other fields on the CSV and those are being retained - the data types are correct and the privacy rules are set up correctly.

Clearly something is happening between the JSON being passed from the front end to the first API endpoint call in the backend, but I can’t for the life of me figure out what it is.

Do you have any ideas?

Hey @jamie.robson.89 , take a look at the JSON that’s being returned from your CSVs and see if there’s any missing information there.

Yeah, I added a debug entry after the “CSVUploader generates file” event with the JSON texts, and all the info is in there (checked in the backend logs as well).

Managed to get around this, but I’m wrestling with the next part which is deleting a list of offers.

Example:

In month 1, I upload a CSV that contains 100 offers. In month 3, I upload a CSV that contains 90 offers. The 10 offers that do not appear should be deleted.

An offer is determined to be unique based on a few criteria:

CAP ID (Number)
Annual Mileage (Option set)
Initial Term (Option set)
Contract Length (Option set)
Finance Provider (Option set)
Parent Customer (Data type)

Admittedly, these backend workflows and running on lists spin my head a bit, so I’m grateful for any help on this!
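The “delete what’s missing” step can be sketched in plain JavaScript by reducing each offer to a composite key built from the uniqueness criteria listed above, then keeping only the database records whose key is absent from the new upload. This is a hedged sketch: `existingOffers` stands in for a Search constrained to the uploading customer and finance provider, and all field names are illustrative:

```javascript
// Build one comparable key string from the uniqueness criteria.
function offerKey(o) {
  return [
    o.capId,
    o.annualMileage,
    o.initialTerm,
    o.contractLength,
    o.financeProvider,
    o.parentCustomer,
  ].join("|");
}

// Return the existing offers whose key does NOT appear in the new upload -
// these are the stale records that should be deleted.
function findStaleOffers(existingOffers, uploadedOffers) {
  const uploadedKeys = new Set(uploadedOffers.map(offerKey));
  return existingOffers.filter((o) => !uploadedKeys.has(offerKey(o)));
}
```

In Bubble terms, the result of `findStaleOffers` is the list you’d feed to a “Delete a list of things” step (or a scheduled delete workflow), after the create/overwrite pass has finished.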