Dev Reports Bug, Gets Threatened With Lawsuit Instead of Thanks

Yannick Dixken was on a diving trip in Costa Rica when he discovered something wild: a major diving organization's member portal used the same default password for every single account. Every. Single. Account.

We're talking personal data, passport info, and yes, information about minors. The kind of vulnerability that makes you wonder how it shipped in the first place.
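The fix for this class of bug is old and boring: never ship a shared default credential. A minimal sketch (hypothetical `provision_account` helper, using Python's standard `secrets` module) of per-account random initial passwords:

```python
import secrets

def provision_account(username: str) -> tuple[str, str]:
    # Each new account gets its own cryptographically random
    # one-time password instead of a shared default.
    temp_password = secrets.token_urlsafe(16)
    # A real system would store only a salted hash and force a
    # reset on first login; plaintext is returned here purely to
    # illustrate per-account uniqueness.
    return username, temp_password

accounts = [provision_account(f"member{i}") for i in range(1000)]
unique_passwords = {pw for _, pw in accounts}
assert len(unique_passwords) == len(accounts)  # no two accounts share one
```

With a shared default, compromising one credential compromises every account; with per-account random secrets, each leak is contained to a single user.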

The Responsible Disclosure That Went Sideways

Dixken did everything right. On April 28, 2025, he reported the vulnerability through proper channels with a 30-day embargo. Standard practice, right? Give them time to fix it before going public.

The organization's response? Legal threats under Maltese law. They accused him of criminal offenses and demanded he sign an NDA that would prevent him from even discussing the disclosure process itself.

Not "thanks for the heads up." Not "we're fixing this immediately." Just lawyers.

Eight Months Later, Users Still Don't Know

Here's the part that really hits different: after eight months, the organization still hasn't properly notified affected users. That's not just bad practice. GDPR Article 34 requires communicating high-risk breaches to affected individuals without undue delay, so this is a compliance problem, not a hypothetical one.

The focus was clearly on reputation management over user protection. Classic corporate playbook: contain the story, threaten the messenger, hope it goes away.

Dixken finally published his account after giving the organization eight months to remediate. The post dropped on Lobste.rs and Hacker News, and the developer community is rightfully pissed.

Why This Matters for Every Dev

This isn't just one bad actor. This is a pattern that's actively deterring ethical security research. When you respond to vulnerability disclosures with legal threats instead of appreciation, you're telling the security community: "Don't help us. Just drop a zero-day and walk away."

The Cyber Resilience Act (Regulation (EU) 2024/2847) is supposed to fix this. It entered into force in December 2024 and mandates vulnerability handling and coordinated disclosure processes, with obligations phasing in through 2026 and 2027. But regulations mean nothing if organizations can still weaponize lawyers against researchers who follow responsible disclosure protocols.

One commenter on the original post said it perfectly: "If you make legal threats the standard response, you're just incentivizing people to sell bugs to the highest bidder instead of reporting them."

The Real Cost of Legal Intimidation

Here's what happens when organizations pull this move:

  1. Researchers stop reporting vulnerabilities to avoid legal liability
  2. Bug bounty programs become meaningless if legal teams can override them
  3. Users stay vulnerable longer because fixes get delayed by legal posturing
  4. The entire security ecosystem gets weaker

Platforms like HackerOne exist specifically to facilitate responsible vulnerability disclosure with clear legal protections. But they only work if organizations actually participate in good faith.

What Should Have Happened

The correct response to Dixken's disclosure:

  1. Acknowledge the report immediately
  2. Fix the vulnerability
  3. Notify affected users (as required by GDPR)
  4. Thank the researcher publicly
  5. Maybe throw some bug bounty money their way

Instead, they chose reputation management through legal intimidation. And now their reputation is taking an even bigger hit because the story's out anyway, just with added context about how they handled it.

The Bigger Picture

This story landed right as the developer community is already stressed about liability under new regulations. The Cyber Resilience Act sets clear requirements for vulnerability disclosure programs and developer responsibilities. Government vulnerability disclosure policies are being standardized.

But none of that infrastructure matters if organizations can still respond to good-faith disclosures with legal threats.

The diving organization's lawyer won in one sense: Dixken didn't name them publicly. Their immediate reputation was preserved. But the broader message to the security research community? That's the loss that'll keep compounding.

What This Means for You

If you're building anything that touches user data:

  • Set up a proper vulnerability disclosure policy now
  • Make it clear that researchers won't face legal action
  • Have an actual plan for handling disclosures that doesn't start with "call the lawyers"
  • Remember that researchers reporting bugs are doing you a favour, not committing crimes
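A concrete first step for that list is publishing a `security.txt` file per RFC 9116, so researchers know exactly where to send reports. A minimal sketch (the domain and URLs are placeholders):

```
# Served at https://example.com/.well-known/security.txt (RFC 9116)
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Policy: https://example.com/security-policy
Preferred-Languages: en
```

The `Policy` URL is where you spell out safe-harbor language, i.e. the promise that good-faith researchers won't be sued.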

And if you find a vulnerability? Document everything. Use established platforms. Set clear timelines. And maybe have a lawyer on speed dial yourself, because apparently that's where we're at now.

The security community is watching how organizations respond to disclosures. Choose wisely.

Written by TheVibeish Editorial