How the Tea App Got Blindsided on Data Security
A Women‑First Safety Tool - and a Very Public Breach
Tea is billed as a “women‑only” community where users can swap tips, background‑check potential dates, and set red or green flags. In late July 2025 the app rocketed to No. 1 in Apple’s free‑apps chart, boasting roughly four million users and a 900 k‑person wait‑list.
On 25 July 2025 a post on 4chan revealed that anyone could download the contents of an open Google Firebase Storage bucket holding verification selfies and ID photos. Technology reporters quickly verified the claim and confirmed the bucket had neither authentication nor listing restrictions.
What Was Exposed?
About 72,000 images were taken. Roughly 13,000 were verification selfies and photos of driver's licenses or passports; the rest, about 59,000, were images from posts, comments, and direct messages more than two years old. No phone numbers or email addresses were included, but according to public reports the IDs and face photos are now mirrored on torrent sites.

Tea’s Official Response
On 27 July Tea posted the following notice to its Instagram account:
We discovered unauthorized access to an archived data system. If you signed up for Tea after February 2024, all your data is secure.
This archived system stored about 72,000 user‑submitted images – including approximately 13,000 selfies and selfies that include photo identification submitted during account verification. These photos can in no way be linked to posts within Tea.
Additionally, 59,000 images publicly viewable in the app from posts, comments, and direct messages from over two years ago were accessed. This data was stored to meet law‑enforcement standards around cyberbullying prevention.
We’ve acted fast and we’re working with trusted cyber‑security experts. We’re taking every step to protect this community – now and always.
(Full statement: instagram.com/theteapartygirls)
How Did This Happen?
At the heart of the breach was a single, deceptively simple mistake: the Firebase bucket that stored user images had been left wide open to the internet and even allowed directory‑listing. Whoever set it up apparently assumed that the object paths were obscure enough to stay hidden, but obscurity is never security. Once one curious 4chan user stumbled on the bucket, it took only minutes to write a script that walked the entire directory tree and downloaded everything. The files were zipped, uploaded to torrent trackers, and instantly became impossible to contain. In other words, a configuration left on its insecure default setting turned a women‑safety tool into a privacy disaster.
What Developers and Security Teams Can Learn
For engineering teams, the lesson is straightforward: always start from “private” and add access intentionally. Google Cloud Storage supports Signed URLs and Firebase Auth rules precisely so you can serve content without throwing the doors wide open; using those controls should be the norm, not the exception. Meanwhile, security leaders need to accept that misconfigurations are inevitable and build continuous monitoring around them.
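The core idea behind a signed URL is simple: the server attaches an expiry time and a keyed signature to an object path, so a link grants temporary, tamper-evident access instead of relying on the path staying secret. This is a minimal, illustrative sketch of that principle in pure Python; it is not the actual Google Cloud V4 signing algorithm, and the secret and paths are placeholders.

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # placeholder; keep real keys out of source code

def sign_url(path: str, expires_at: int, key: bytes = SECRET) -> str:
    """Attach an expiry timestamp and an HMAC signature to an object path."""
    msg = f"{path}?expires={expires_at}".encode()
    sig = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires_at}&sig={sig}"

def verify_url(url: str, now: int, key: bytes = SECRET) -> bool:
    """Accept the URL only if the signature matches and it has not expired."""
    base, _, sig = url.rpartition("&sig=")
    _path, _, expires = base.partition("?expires=")
    expected = hmac.new(key, base.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < int(expires)
```

Because the signature covers both the path and the expiry, a leaked link stops working when it expires, and changing the path to reach someone else's file invalidates it.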
Modern Data Security Posture Management (DSPM) platforms watch for sensitive data, such as face photos and ID cards, showing up in publicly readable locations and alert the moment it does. Finally, remember that forgotten backups or “archive” buckets often outlive their creators’ attention; schedule regular audits so yesterday’s quick fix doesn’t become tomorrow’s headline.
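A basic version of such an audit is a pure policy check: walk every bucket's IAM policy and flag any binding that grants access to the public principals. The sketch below assumes policies in the shape returned by the Cloud Storage `buckets.getIamPolicy` API; the bucket names are illustrative.

```python
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def public_bindings(policy: dict) -> list[tuple[str, str]]:
    """Return (role, member) pairs that expose a bucket to the public.

    `policy` mirrors the JSON shape of a Cloud Storage IAM policy:
    {"bindings": [{"role": "...", "members": ["..."]}]}
    """
    hits = []
    for binding in policy.get("bindings", []):
        for member in binding.get("members", []):
            if member in PUBLIC_PRINCIPALS:
                hits.append((binding["role"], member))
    return hits

def audit(buckets: dict[str, dict]) -> list[str]:
    """Flag bucket names whose IAM policy grants any public access."""
    return [name for name, policy in buckets.items() if public_bindings(policy)]
```

Run on a schedule, a check like this catches the "archive bucket someone opened up two years ago" case that manual review misses.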
How Sentra Would Have Caught This
Had Tea’s infrastructure been monitored by a DSPM solution like Sentra, the open bucket would have triggered an alert long before a stranger found it. Sentra continuously inventories every storage location in your cloud accounts, classifies the data inside so it knows those JPEGs contain faces and government IDs, and correlates that sensitivity with each bucket’s exposure. The moment a bucket flips to public‑read - or worse, gains listing permissions - Sentra raises a high‑severity alert or can even automate a rollback of the risky setting. In short, it spots the danger during development or staging, before the first user uploads a selfie, let alone before a leak hits 4chan. And if a breach does occur, perhaps via an inadvertent insider, Sentra monitors data access and movement and can alert when unusual activity occurs.
The Bottom Line
One unchecked permission wiped out the core promise of an app built to keep women safe. This wasn’t some sophisticated breach, it was a default setting left in place, a public bucket no one thought to lock down. A fix that would’ve taken seconds ended up compromising thousands of IDs and faces, now mirrored across the internet.
Security isn’t just about good intentions. Least-privilege storage, signed URLs, automated classification, and regular audits aren’t extras - they’re the baseline. If you’re handling sensitive data and not doing these things, you’re gambling with trust. Eventually, someone will notice. And they won’t be the only ones downloading.
<blogcta-big>