Breach? Jackpot. How the Legal System Profits from Failure

Part 4 of the “Accountability in Cybersecurity is Broken” Series

NOTE: Let me preface this by stating that I am not a lawyer, nor am I qualified to give you legal advice. I am a 30+ year information security veteran who’s tired of seeing people get screwed over by a system that isn’t working (for them). And, to be fair, this is me sharing my opinion and perspective; I’m sure lawyers would have something to say about it (don’t lawyers always have something to say?).

The Legal Gold Rush

Here’s the thing: every time a major data breach hits the headlines, it’s not just the criminals who strike gold. Lawyers do, too. In 2024 alone, some 1,488 class action lawsuits were filed in the U.S. over data breaches, nearly triple the 2022 total. Take the Equifax breach of 2017—147 million people’s personal data exposed, and the settlement? A whopping $700 million, with a chunk going to legal fees while victims got at most $125 in cash or a promise of credit monitoring.

NOTE: The lawyers representing consumers in the Equifax data breach class action lawsuit were awarded $77.5 million in attorney fees by a Georgia federal judge in December 2019.

Or take AT&T’s 2024 breaches, settled for $177 million, where lawyers pocketed millions while affected customers scrambled for up to $7,500—if they could prove their losses.

NOTE: The total attorney fees requested amount to approximately $59 million across both AT&T breach settlement funds. This amount is pending final court approval, scheduled for a hearing on December 3, 2025, and could be subject to adjustments or appeals.

This is the dirty little secret of cybersecurity accountability: the legal system isn’t fixing the problem; it’s profiting from it. And worse, the people driving these lawsuits—lawyers—often don’t understand what a good cybersecurity program even looks like. Meanwhile, the industry lacks clear standards for what’s negligent, leaving companies and victims stuck in a cycle of failure and litigation. Let’s unpack why this is a root cause of broken accountability in cybersecurity.

The Legal System’s Payday

Data breaches are a jackpot for class action law firms. In 2023, data breach suits became the fastest-growing category of class action filings, with more than 2,000 cases clogging federal and state courts. The math is staggering:

  • T-Mobile (2021): $350 million settlement, with millions in legal fees and $150 million earmarked for “data security improvements” that should’ve been in place before the breach.
  • Capital One (2019): $190 million settlement, plus an $80 million fine from regulators, with law firms taking a hefty cut.
  • Meta (2024): A jaw-dropping $1.4 billion for biometric data violations, dwarfing most other settlements.

These numbers sound like accountability, right? Wrong. The victims—whose Social Security numbers, bank details, or call records are now floating on the dark web—often get pennies, credit monitoring they might not need, or nothing at all if they can’t prove “harm.” Law firms, on the other hand, walk away with millions. It’s a system where failure pays, and the biggest winners aren’t the ones securing networks or protecting consumers—they’re the ones filing the paperwork.

Lawyers Aren’t Cybersecurity Experts

Here’s where it gets frustrating for practitioners like me. Most (not all) lawyers handling these cases aren’t cybersecurity experts. They’re not out there assessing risk, implementing controls, or debating the merits of NIST vs. ISO 27001. Yet, they’re the ones arguing in court about what constitutes “reasonable” cybersecurity. How can you judge negligence when you don’t know what a good information security program looks like?

Take the H&R Block breach lawsuit from 2024. The plaintiff claimed the company “failed to implement adequate cybersecurity measures,” exposing Social Security numbers and financial details. Sounds reasonable, but what does “adequate” mean? The lawsuit doesn’t specify whether H&R Block lacked encryption, multi-factor authentication (MFA), or a proper incident response plan. It’s a vague accusation that lets lawyers cast a wide net without needing to understand the technical details. And courts, which often lack cybersecurity expertise themselves, are left to decide based on generic claims of “negligence.”

This disconnect is a root cause of broken accountability. Without technical expertise, lawsuits focus on the aftermath of a breach—damages, settlements, fines—rather than addressing why the breach happened or how to prevent the next one. It’s like suing a doctor for malpractice without understanding what standard medical care entails.

The Murky Definition of Negligence

Let’s talk about negligence, because this is where the rubber meets the road—or rather, where it skids off. As a practitioner, I’d argue it’s negligent to run a business without an asset inventory. How can you secure what you don’t know you have? Yet, there’s no universal standard in the industry that says, “Thou shalt have an asset inventory, or thou art negligent.” Same goes for exposing a login page to the internet with just a username and password—no MFA. To me, that’s asking for trouble in 2025, when MFA is table stakes. But is it legally negligent? The answer is a maddening “it depends.”

The lack of clear negligence standards is a massive gap in cybersecurity accountability. The Federal Trade Commission (FTC) uses “reasonableness” as a benchmark, requiring companies to implement “reasonable” security measures to protect consumer data. But what’s reasonable? The FTC doesn’t provide a checklist, and neither do most state laws. The California Consumer Privacy Act (CCPA) comes closest, allowing private lawsuits over breaches of unencrypted personal information, but even that is limited to California residents.

Courts are all over the map. In some cases, like In re Fortra File Transfer Software Data Sec. Breach Litig. (2024), plaintiffs successfully argued that a company’s failure to configure software properly was negligent. In others, like Beck v. McDonald (2017), courts dismissed claims because the risk of harm wasn’t “substantial” enough. This inconsistency means companies don’t know what’s expected, and lawyers exploit the ambiguity to file lawsuits that prioritize profit over progress.

The Accountability Gap

The legal system’s profit-driven approach to breaches creates a vicious cycle:

  1. A company gets breached, often due to basic oversights (no MFA, outdated systems, no asset inventory).
  2. Lawyers file class actions, citing “negligence” without clear technical grounding.
  3. Settlements are reached, with law firms taking a big cut and companies promising to “improve security” (often vaguely).
  4. Nothing changes systemically—companies patch the bare minimum, and the next breach is just a matter of time.

This cycle doesn’t hold companies accountable for robust cybersecurity; it incentivizes them to budget for lawsuits as a cost of doing business. Meanwhile, the absence of clear negligence standards leaves practitioners like me screaming into the void: “Why aren’t we talking about prevention?”

What’s the Fix?

If we want real accountability, we need to break this cycle. Here are a few ideas:

  • Define Negligence Clearly: The industry needs universal standards for what constitutes negligent cybersecurity. For example, mandate asset inventories, MFA, and regular penetration testing as baseline requirements, with legal consequences for non-compliance.
  • Educate the Legal System: Judges and lawyers need basic cybersecurity literacy to evaluate claims fairly. Partner with organizations like NIST or CISA to create training programs.
  • Shift Incentives: Reward companies for proactive security (e.g., tax breaks for NIST-compliant programs) instead of punishing them after the fact. Make prevention cheaper than litigation.
  • Empower Practitioners: Give cybersecurity professionals a seat at the legal table. Expert testimony from practitioners could bridge the gap between technical reality and legal arguments.

Stop the Jackpot

The legal system’s current approach to data breaches is a (not “the”, but “a”) root cause of broken accountability in cybersecurity. Lawyers profit handsomely, but their lack of expertise and the absence of clear negligence standards mean lawsuits rarely drive meaningful change. Instead, they turn breaches into a perverse jackpot where failure pays—for everyone except the victims and the practitioners trying to prevent the next one.

What do you think? Are we doomed to keep paying lawyers to clean up after breaches, or can we build a system that holds companies accountable before the hackers strike? Let me know in the comments, and stay tuned for Part 5, where we’ll dig into another piece of this broken puzzle.

In the meantime, Matt Goodacre and I will chat about this on the next episode of InfoSec to Insanity. Join the conversation if you’d like.

