Using the Data API returns almost infinite data (error)

I am extracting data from my database using the Data API and a cursor.

My table has 163,000 items in it, which should be 1,630 pages of data. But I keep getting data back as I increment the cursor, all the way up to cursor 49999.

How do I increment through these pages properly?

This is what I see on that last page, which is still weird: a single object, with a `remaining` count that is way higher than it should be on page 49,999. Again, there are only 163,000 things in the DB:

{
    "response": {
        "cursor": 49999,
        "results": [
            { REMOVED THIS OBJECT DATA FOR PRIVACY }
        ],
        "count": 100,
        "remaining": 113040
    }
}

I do this quite a lot … why is your cursor on anything other than a multiple of 100? (Shouldn’t you be grabbing 100 at a time?) :thinking:

Also, it seems to me 49,999 + 113,040 = 163,039 (you have 1,131 "pages" to go), and the Data API connector thinks you have 163,039 items in your database, not 163,000 …

The way the cursor works (sorry if you already get this and I am telling you what you already know :slight_smile: ) is that the cursor "grabs" the next 100 items forward from the cursor position, so you will do 1,630 GET requests of 100 and one last GET request to retrieve the remaining 39:

1 … 100
101 … 200

and so on, so you increment the cursor by 100 every time …
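The loop above can be sketched like this. The `fetch_page` callable is a stand-in for a real HTTP GET against your app's Data API endpoint (the endpoint URL and auth are assumptions, so a stub is used here); the key point is that the cursor is an item index that advances by the page size, not a page number.

```python
# Cursor-based paging sketch: the cursor is the index of the first item to
# return, so it advances by the page size (100) on each request.

def paginate(fetch_page, page_size=100):
    """Yield every item by advancing the cursor in page_size steps."""
    cursor = 0
    while True:
        response = fetch_page(cursor)      # e.g. GET .../api/1.1/obj/thing?cursor=N
        yield from response["results"]
        if response["remaining"] <= 0:     # nothing left past this page
            break
        cursor += page_size                # next page starts 100 items further on


# Stub standing in for the real API, over a 163,039-item table.
TOTAL = 163_039

def fake_fetch(cursor, page_size=100):
    results = list(range(cursor, min(cursor + page_size, TOTAL)))
    return {
        "cursor": cursor,
        "results": results,
        "count": len(results),
        "remaining": max(TOTAL - cursor - len(results), 0),
    }

items = list(paginate(fake_fetch))
print(len(items))  # 163039: 1,630 full pages plus a final page of 39
```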


Ahh! That makes sense. I thought the cursor was a page number.

After cursor 50,000, nothing comes back?

In fact, shouldn’t there still be 100 items on this page? It seems like some kind of bug retrieving items at that number. Can you only retrieve the first 50,000 items via the Data API?


I’ve gone up to 400,000+ before; I don’t remember hitting a ceiling at 50,000.


Thanks, gents. @lindsay_knowcode @exception-rambler

Will try this approach again with the understanding that the cursor is the starting item index, not the page number.

[update]
Cursor 100000 returns no results but says I have 63,000 remaining.


The issue was due to the plan that I am on: I am on Professional, not Dedicated.


Hey, what about deleted items? How do you keep the pagination process aligned? It seems you can copy the entire table but have no way to listen for updates/deletions.

Yes, it’s an interesting problem. There is no concept or method I know of to “lock” a Bubble table against changes (lock in the traditional DB sense), say for example if you started a job to download it.

You can “listen for changes” with a Backend Workflow “trigger on change” event, but I worry about performance; if you have a busy DB, this could be pointless.

The way I thought about aligning the DB after cloning it is to maintain an additional table that keeps track of the changes (updates + deletes).
Later we can query this table using the Data API with a cursor, and run additional API calls or row deletions depending on the type of change.
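The change-log replay described above might look like this. The record shape (`"type"`, `"row_id"`, `"data"`) is a hypothetical layout for that extra changes table, not a real Data API schema; the idea is just to apply each logged change to the local clone in order.

```python
# Sketch of replaying a change-log table onto a local clone of the DB,
# kept here as a simple dict of {row_id: row_data}.

def apply_changes(local_rows, changes):
    """Bring a {row_id: data} clone up to date from a list of change records."""
    for change in changes:
        if change["type"] == "delete":
            local_rows.pop(change["row_id"], None)  # drop deleted rows
        else:  # "update" also covers newly created rows
            local_rows[change["row_id"]] = change["data"]
    return local_rows


clone = {"a": {"name": "old"}, "b": {"name": "keep"}, "c": {"name": "gone"}}
log = [
    {"type": "update", "row_id": "a", "data": {"name": "new"}},
    {"type": "delete", "row_id": "c", "data": None},
]
print(apply_changes(clone, log))
# {'a': {'name': 'new'}, 'b': {'name': 'keep'}}
```

In practice the `changes` list would itself be fetched page by page with the same cursor approach as the main table.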
