Parse JSON to database

Hi guys, I need your help. I'm stuck writing JSON to the database.
The API Connector returns a value something like this:

{
    "date": "2024-02-29T10:19:09",
    "lastChangeDate": "2024-03-01T03:13:37",
    "nmId": 178843482,
    "techSize": "146-152",
    "incomeID": 15913531,
    "isSupply": false,
    "isRealization": true,
    "sticker": "15394342537",
    "gNumber": "3144450735350521411",
    "srid": "8018159578207718045.2.0"
  },
  {
    "date": "2024-02-29T13:13:03",
    "lastChangeDate": "2024-03-01T03:13:37",
    "nmId": 158759695,
    "techSize": "128-134",
    "incomeID": 16930118,
    "isSupply": false,
    "isRealization": true,
    "sticker": "14029674630",
    "gNumber": "3482529735899348088",
    "srid": "33094427589042951.0.0"
  },
  {
    "date": "2024-02-29T17:07:29",
    "lastChangeDate": "2024-03-01T03:13:37",
    "nmId": 145152791,
    "techSize": "146-152",
    "incomeID": 17325557,
    "isSupply": false,
    "isRealization": true,
    "sticker": "16791625543",
    "gNumber": "8364182271309335382",
    "srid": "10738265589479100.0.0"
  }...

How do I write each item into its own row in the db?
Right now I have something like this:

And I need each item stored as its own row in the db.

Bubble doesn't handle JSON particularly well, though I have had a few instances where it was necessary to store the whole response as you are trying to do above.

You can create a field of type "whatever your response datatype is called" in your db, then drop the entire response in there; Bubble will recognise the structure. Bear in mind the huge caveat that you cannot run "Do a search for" on a JSON object or its nested data. This means you end up needing a lot of filtering for complex db requests, and it can get very slow and messy.

The other option is to save the entire response's raw text into a field, then use regex to extract what you need, though again this is pretty messy.

In your case, however, since the JSON structure looks pretty simple, I would suggest using a bulk create action via Bubble's Data API, and restructuring your db to expect a single object per "thing".
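To illustrate the reshaping that suggestion implies, here is a minimal Python sketch. The field subset is taken from the sample response above; the compact one-object-per-line body is what Bubble's bulk-create Data API endpoint (roughly `POST /api/1.1/obj/<typename>/bulk` with a `text/plain` body, per Bubble's docs) expects — the type name and trimmed fields are assumptions for illustration.

```python
import json

# Sample rows from the API response above (fields trimmed for brevity).
orders = [
    {"nmId": 178843482, "techSize": "146-152", "srid": "8018159578207718045.2.0"},
    {"nmId": 158759695, "techSize": "128-134", "srid": "33094427589042951.0.0"},
]

# Bubble's bulk-create endpoint expects a text/plain body with one compact
# JSON object per line (newline-delimited, NOT a comma-separated array).
bulk_body = "\n".join(json.dumps(o, separators=(",", ":")) for o in orders)

print(bulk_body)
```

You would then POST `bulk_body` to your app's bulk endpoint with your API token in the `Authorization` header; the "format as text" + newline-delimiter steps mentioned later in this thread produce the same shape inside Bubble.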

Hope that helps

This depends on what you need to do with the data, but in most cases you will use a backend workflow (recursive or scheduled on a list) to process each item in the list separately.
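The recursive pattern mentioned here can be sketched in plain Python (function names are illustrative, not Bubble actions): each run handles the first item in the list, then re-schedules itself with the remainder until the list is empty.

```python
created = []  # stand-in for rows written to the db

def recursive_workflow(items):
    """Sketch of a recursive backend workflow."""
    if not items:                       # stop condition: list is empty
        return
    created.append(items[0]["srid"])    # stand-in for "Create a new thing"
    recursive_workflow(items[1:])       # stand-in for "Schedule API Workflow" on the rest

recursive_workflow([
    {"srid": "8018159578207718045.2.0"},
    {"srid": "33094427589042951.0.0"},
    {"srid": "10738265589479100.0.0"},
])
```

In Bubble the re-scheduling step would be a "Schedule API Workflow" action passing the list minus its first item, with a condition that stops when the list is empty.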

I need to show data in table cells

Maybe there is an easier way?

Very interesting :thinking:. Can you share where I can find an example of such a workflow?

Thanks! I need any help I can get :+1:

@zzsnowballzz This is your answer basically, though my preference is to use the bulk API to create 100 entries or so at a time.

If you make a bulk create API call, you can pretty much pass the response straight to Bubble's db; the only change required is formatting it as text with newline as the delimiter (as opposed to a comma). Then simply schedule the workflow again only when your list "from entry 101" has more than 0 items.
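The batching logic described here — create 100 entries per run, and only re-schedule while the list "from entry 101" is non-empty — can be sketched as follows (a hypothetical model of the scheduling loop, not Bubble's API):

```python
def plan_bulk_batches(items, chunk_size=100):
    """Split a response list into bulk-create batches: each workflow run
    creates `chunk_size` entries, then re-schedules itself only while the
    list 'from entry chunk_size + 1' still has items."""
    batches = []
    while items:
        batches.append(items[:chunk_size])  # entries 1..100 -> one bulk call
        items = items[chunk_size:]          # "from entry 101" -> next scheduled run
    return batches

# e.g. 250 response items -> three workflow runs of 100, 100, 50 entries
batches = plan_bulk_batches(list(range(250)))
```

The stop condition falls out naturally: when "from entry 101" yields an empty list, no further run is scheduled.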

There's a solid case for scheduling on a list now, however, since Bubble has just launched some upgraded functionality for it, and it is significantly cheaper in workflow units. The downside is that you have no control over each workflow run and don't know when the job has completed.


Absolutely, the bulk API is also an option. @zzsnowballzz if you search the forum, you will find many examples of each approach, including bulk import using the API.


Thanks! Your tips and this video https://www.youtube.com/watch?v=ZaD9AIegH5A are a great combination.

I just have to figure out how to solve this problem.
(screenshot of the bulk-create error)

The above error (if you ignore all the fun encoded quotes) is telling you to create at most 1,000 entries at a time via the bulk creation API. As mentioned above, I normally run 100 entries (sometimes up to 250) depending on other external APIs' rate limits, etc.

Try reducing the creation volume in each instance of the workflow, then iterate over the list recursively in bite-size chunks until the list is empty.

Thanks for the explanation. I solved this issue using a different way of writing data to the database. Here it is.

@ed19 by the way, maybe you know how to set up the workflow so that it doesn't write duplicate data?


I have "srid", which is a unique ID. So when I parse the JSON more than once for the same date, I get a lot of duplicated rows in the db. I really want data to be saved only when the srid is unique.

If you just make sure you only start from the next entry, that would normally suffice. If you want to be extra safe, add a constraint to the search in your :format as text operator that checks for existing entries in the db.
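The "extra safe" constraint amounts to filtering incoming items against the srids already in the db before creating anything. A minimal sketch, assuming the existing srids come from a "Do a search for" on your Order type (the set below is a stand-in for that search result):

```python
# Stand-in for a "Do a search for" returning srids already saved in the db.
existing_srids = {"8018159578207718045.2.0"}

# Incoming items parsed from the API response.
incoming = [
    {"srid": "8018159578207718045.2.0", "nmId": 178843482},  # already saved -> skip
    {"srid": "33094427589042951.0.0", "nmId": 158759695},    # new -> create
]

# Only create items whose unique srid is not yet in the db.
to_create = [o for o in incoming if o["srid"] not in existing_srids]
```

In Bubble this corresponds to a :filtered (or search constraint) on the list before the bulk create, keeping only items whose srid is not in the existing search results.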


This topic was automatically closed after 70 days. New replies are no longer allowed.