In the light of the Facebook breach, some of the questions we are getting from prospective customers concern who else has access to the data. At the moment the answer is the Bubble team, and we don’t know who or when, so we have no idea who is accessing the app or personal information, or at what time. bubble.is is in a unique position here: they have access not only to our data but also to our customers’ data.
I suspect this issue will now start to filter down through the technology community as the fallout from the Facebook breach starts to influence IT policy.
I understand the company may have employee use policies, but this goes only halfway toward answering the question and providing assurance. I don’t know the answer, but maybe it needs to be set up so that Bubble has to request access to our app data before conducting debugging on our app. Maybe this is already in place, but at the moment it is a fuzzy area.
How do we instil confidence in our customers that the Bubble.is team has a close enough contractual relationship with us that we retain oversight and control over data access?
Hi guys, the Bubble team does have access to application data, which we use only to debug issues. We have internal tools that track administrative access, so we can audit to make sure that people on the team are only using it in response to bug reports. Right now we’re a pretty small team (6 people), so keeping an eye on this isn’t too hard, but as we grow we’re definitely going to put more controls in place, and I’m currently auditing all of our security practices as part of our GDPR compliance work. I do like the idea of explicitly notifying our users, or asking their permission, when the Bubble team needs to access their users’ data, and I’ll keep that in mind as I do my review.
@josh would it also be possible to do something with “Run as”, like limiting that option to the QA environment? I believe that having something like “Run as” in a live system is very dangerous. It’s fine for app owners to have admin access to the live version, of course, but being able to impersonate a user in the live environment could raise some questions and eyebrows. Or maybe make it so that even when running as another user, any data changes are saved under the admin user, not the “Run as” user.
We do log behind the scenes when someone uses “run as…”, and we retain those logs for two weeks so we can audit in an emergency. But I agree it would be good to expose those audit logs to the app owner (or possibly provide the option to disable that feature for your app). I’m adding it to my list.
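For anyone curious what exposing those logs to app owners might look like, here is a minimal sketch of an impersonation audit log with a retention window; the field and class names are illustrative assumptions, not Bubble’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical shape of one "Run as..." event.
@dataclass
class ImpersonationEvent:
    admin_id: str        # who clicked "Run as..."
    target_user_id: str  # which end user was impersonated
    app_id: str
    timestamp: datetime

class AuditLog:
    def __init__(self, retention: timedelta = timedelta(days=14)):
        self.retention = retention  # two weeks, per the current policy
        self._events: list[ImpersonationEvent] = []

    def record(self, event: ImpersonationEvent) -> None:
        self._events.append(event)

    def visible_to_owner(self, app_id: str) -> list[ImpersonationEvent]:
        """Events for one app that are still inside the retention window."""
        cutoff = datetime.now(timezone.utc) - self.retention
        return [e for e in self._events
                if e.app_id == app_id and e.timestamp >= cutoff]

log = AuditLog()
log.record(ImpersonationEvent("admin-1", "user-42", "my-app",
                              datetime.now(timezone.utc)))
print(len(log.visible_to_owner("my-app")))  # 1
```

A per-app `retention` parameter like the one above is also where a longer, paid retention tier would naturally plug in.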
Josh, firstly, thanks for proactively looking into data privacy/security in the first place.
Data privacy is very important, and it can quickly become an out-of-control overhead when improperly applied to a business or, in this case, a service. Bubble applications range from simple to highly complex. Some apps have 10 users, others have thousands. The key for Bubble is to strike the right balance of investment when it comes to feature development.
If I’m a simple app creator whose user data is not in any way sensitive (for example, no trade secrets or highly sensitive information is kept), I don’t expect to be charged for the overhead required to audit and manage it.
However, if I have a simple app with thousands of users, where a strict audit trail and data retention schedule is required, I would of course have some minimum expectations of any service provider or platform that I use.
I feel that in order to serve those who /require/ advanced audit trails, logging, and data retention… these should be available to purchase in addition to the standard monthly service fee.
I recommend this because when you mentioned that “Run as…” logs are retained for only two weeks, I instantly realised that this restriction exists purely for capacity/efficiency/overhead reasons, and it is one which I’m sure certain developers, business owners, and CDOs would be more than happy to pay to have extended accordingly.
Again, I want to reiterate my gratitude to you and the team for already having this on the radar… I just wanted to contribute my two cents’ worth on an important topic.
Thanks @josh for considering it and adding to the list.
On the other side, @universe, I can only agree with you partially. I don’t know if you are aware of the EU GDPR, but its goal is to make companies protect data by design and by default. This means that every company should do its best to restrict access to data as much as possible.
What you are suggesting is not data protection by design and by default, but by premium, which clearly goes against the general ideal of the GDPR.
If I’m a simple app creator, where user data is not in any way sensitive (for example no highly sensitive trade secrets or information being kept) - I don’t expect to be charged for the overheads required to audit and manage it.
You shouldn’t be charged no matter how complex your application is. PII has only one type of complexity, and it is independent of the complexity of your app’s design.
So yeah, you shouldn’t expect to be charged for overhead, no matter what.
After the Facebook scandal, it’s only a matter of time before other countries start copying the GDPR’s approach of data protection by design and by default, and so the overheads you mention need to be part of the default package.
Hi @JonL, I’m fully aware of the GDPR and of the aim to protect data by default. I’m actually really grateful to the EU for beginning the movement. Bubble have already commented that they are using it as the foundation moving forward - which is great news. I was mainly referring to the ‘beyond GDPR’ use cases, such as data retention specifically, where an enterprise wants or needs to provide more than the minimum mandated by law.
The overhead for ‘additional’ data retention, such as “Run as…” logs kept beyond two weeks, comes down to data storage. There’s a cost associated with that storage. At an individual level it may appear small, but multiplied by 115,000 customers it becomes a real cost very quickly, and one which would be unfair to hold over customers who don’t need extended retention of such data.
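To put a rough number on that, here is a back-of-the-envelope sketch where the per-app log volume and the per-GB price are purely illustrative assumptions (and raw storage is only one component; indexing, backups, and retrieval would add to it):

```python
# Back-of-the-envelope: extra storage for extended audit-log retention.
# Every number below is an illustrative assumption, not a real Bubble figure.
apps = 115_000                 # customer count mentioned above
log_mb_per_app_month = 5       # assumed audit-log volume per app per month
extra_months = 12              # retention extended from ~2 weeks to a year

total_gb = apps * log_mb_per_app_month * extra_months / 1024
cost_per_gb_month = 0.02       # assumed cloud object-storage price (USD)
monthly_cost = total_gb * cost_per_gb_month

print(f"{total_gb:,.0f} GB ≈ ${monthly_cost:,.0f}/month")
```

The result scales linearly with the assumed per-app volume, so the real question is what that volume looks like across a large fleet of apps, not what one app generates.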
My recommendation is simply that, while they’re reviewing how they manage data, they consider giving those who require additional protections beyond the legal minimum the option to have them, even if it costs extra.
Hi, I’m considering building something here, but after reading this thread it’s a complete deal breaker. It is unacceptable for the database to be open to your team without explicit permission. This post is many years old and this is still a huge breach of privacy. Why has this not been addressed, and why aren’t other users expressing as much concern as they should about this issue? Let me know when you have fixed this poor security flaw.