When Good Faith Gets You Lawyered

Yannick Dixken was diving at Cocos Island when he discovered something worse than a shark: a critical vulnerability in a major diving organization's member portal. Default passwords everywhere. Personal data exposed. Including minors.

He did the right thing. Responsible disclosure on April 28, 2025, with a 30-day embargo to give them time to fix it.

Their response? Legal threats under Maltese law and an NDA so restrictive it banned discussing the disclosure process itself.

Not great.

The Reputation Management Playbook

Eight months later, the vuln is fixed. Cool. But here's the thing: Dixken reports no confirmation that affected users were ever notified. The org prioritised reputation over actually telling people their data was exposed.

The NDA they pushed wasn't about protecting users. It was about protecting their image. Classic corporate security theatre: fix the bug quietly, threaten anyone who mentions it, hope nobody notices you never told the people whose data was compromised.

This is the exact behaviour that makes security researchers think twice before reporting vulnerabilities.

The Community Responds

Lobsters and Hacker News are having a field day with this one, and rightfully so. The consensus is clear: Dixken acted responsibly, the org acted like cowards.

Some key takeaways from the discussion:

On legal protection: Multiple commenters suggested going through third-party security teams for protection. If you're disclosing to an org without a proper VDP (Vulnerability Disclosure Program), you're rolling the dice on whether they'll thank you or sue you.

On deterrence: This kind of response doesn't just hurt one researcher. It sends a message to everyone: "Find our bugs at your own legal risk." That's how you end up with vulnerabilities being sold on the dark web instead of responsibly disclosed.

On accountability: Some commenters advocated for going after both the insurance company and its law firm. Make it expensive to behave like this, or it'll keep happening.

One commenter noted the historical precedent: back in the spam-fighting days, these kinds of legal threats were called "Cartooneys." Often sent by spammers, sometimes from fake attorneys, always absurd. Some things never change.

Why This Matters for Developers

If you're building anything that touches user data, you need a proper vulnerability disclosure policy. Not a "contact us and we'll figure it out" page. An actual VDP with:

  • Clear safe harbor provisions
  • Defined response timelines
  • Commitment to user notification
  • No legal threats for good faith disclosure
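
Even before a full VDP, a lightweight first step is publishing a security.txt file so researchers at least know where to report. A minimal sketch following the RFC 9116 format, with placeholder values (all URLs and addresses below are examples, not real endpoints):

```text
# Served at /.well-known/security.txt (RFC 9116)
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security-policy
Preferred-Languages: en
Canonical: https://example.com/.well-known/security.txt
```

The Policy field is where your safe harbor language and response timelines live; security.txt just makes them discoverable.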

GitHub has templates. HackerOne has guides. There's no excuse for not having one in 2025.

And if you're a security researcher? Document everything. Use third-party disclosure platforms when possible. Know the legal landscape of where the company operates.

The Chilling Effect

The real tragedy here isn't one org mishandling disclosure. It's that every developer who reads this story will think twice before reporting the next vulnerability they find.

That diving org's members still don't know their data was exposed. The org got its image protected, sure, but its users got nothing.

Meanwhile, Dixken published the story anyway, because sometimes doing the right thing means accepting the risk.

Props to him for that. The rest of us should take notes.

Written by TheVibeish Editorial