Is it possible to trigger an event when data in an external PostgreSQL database changes? The database is connected via the SQL Database Connector. I would like to send notifications to users when a new row appears in the PostgreSQL database. The workflow "trigger events when data changes" only lets me use one type of data -> User. Thank you for any help!
It would take some effort, but here’s an idea that would probably work:
- Set up a trigger in the PostgreSQL database that calls a stored procedure each time a change occurs that should notify a user. You can fire the trigger on INSERT, UPDATE, or DELETE, depending on which changes matter to you.
- Have that trigger create a row in a new table called Notifications that contains all the notification info for your Bubble users. Give it a unique column called NotificationId that identifies each notification, plus a CreateDateTime timestamp column.
- Set up a new data type in Bubble called UserNotifications, used to store the NotificationId when a notification is sent to each User. (Each time you send a notification, create a UserNotification thing to log that you sent it to that User.)
- In Bubble, create a workflow that fires every x seconds. Using the SQL connector, it grabs the most recent Notifications by the CreateDateTime column (you decide how far back to look before a notification becomes stale) and compares them against the UserNotifications things in Bubble to see whether they were already sent. If not, send the notification and create the UserNotification thing so you don't send it again the next time the workflow fires.
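To make the database side more concrete, here's a rough sketch of what the trigger setup could look like. All the table and column names (a source table `orders`, its `user_id` and `id` columns, and the staleness window) are just placeholders for illustration; adapt them to your schema. Note that `EXECUTE FUNCTION` needs PostgreSQL 11+; on older versions use `EXECUTE PROCEDURE`.

```sql
-- Sketch only: "orders", "user_id", etc. are placeholder names.
CREATE TABLE notifications (
    notification_id  BIGSERIAL PRIMARY KEY,       -- unique NotificationId
    user_id          INTEGER NOT NULL,            -- which Bubble user to notify
    message          TEXT NOT NULL,
    create_datetime  TIMESTAMPTZ NOT NULL DEFAULT now()
);

-- Trigger function: log a notification row for every newly inserted order.
CREATE OR REPLACE FUNCTION log_notification() RETURNS trigger AS $$
BEGIN
    INSERT INTO notifications (user_id, message)
    VALUES (NEW.user_id, 'New row inserted: ' || NEW.id);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Fire the function after each insert on the watched table.
CREATE TRIGGER orders_notify
AFTER INSERT ON orders
FOR EACH ROW EXECUTE FUNCTION log_notification();

-- Query the Bubble workflow could run via the SQL connector:
-- fetch recent, non-stale notifications (here: last 30 minutes).
SELECT notification_id, user_id, message
FROM notifications
WHERE create_datetime > now() - interval '30 minutes'
ORDER BY create_datetime DESC;
```

The Bubble workflow would then loop over the returned rows, skip any NotificationId that already has a matching UserNotification thing, and send the rest.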
Thank you for your ideas!! I managed to set up a loop that checks for a new row in the PostgreSQL database every 5 minutes. If there is one, the row is saved to the Bubble database. Setting up a trigger in PostgreSQL might be smoother, but I don't know exactly how to do it. Creating a UserNotification thing is also a good idea for tracking which notifications were sent!