Chris Schreiber

Technical defenses alone can no longer protect universities from class-action lawsuits following a data breach. The legal burden has shifted, demanding institutions prove they exercised "reasonable care" through rigorous documentation rather than just technical perfection. Learn how to transform your security program by aligning with standard frameworks and documenting your ongoing risk management journey.
I spent years building technical defenses at major universities. Firewalls, detection systems, incident response playbooks. The whole stack.
Then something shifted. I started seeing a pattern I hadn't expected.
Class-action lawsuits started landing within days of breach notifications. Sometimes before the institution even finished its forensic analysis. The lawyers weren't waiting to see whether data was stolen or if anyone was harmed. They were filing based on the breach announcement alone.
Nearly 4,000 class actions involving data privacy issues were projected to be filed in federal courts by the end of 2024. That number tells you everything about where this is headed.
The technical work I was doing hadn't changed. We were still stopping attacks, containing incidents, and recovering systems. But the legal landscape had shifted underneath us.
Here's what changed.
Courts have started finding that organizations owe consumers an independent legal duty of care to take reasonable measures to safeguard personal information. In the landmark Equifax ruling, the court held that this duty applies when an organization knows of foreseeable risks to its systems, and it allowed negligence claims to proceed where plaintiffs alleged an "increased risk of harm" as a concrete injury, even before any identity theft had occurred.
The burden of proof now falls on the organization. You have to prove you acted reasonably. The plaintiff doesn't have to prove you were negligent.
If you experience a breach and can't provide evidence that you implemented reasonable cybersecurity measures, a court may find you liable under the law of negligence.
Technical excellence alone doesn't protect you anymore. You need documented evidence of reasonable care that exists before the bad day happens.
The medical malpractice parallel is useful here.
Doctors don't promise patients will never get sick. They promise a standard of care. They keep meticulous notes of treatments, consultations, and difficult decisions. If a patient's health declines, these notes prove the doctor was diligent rather than negligent.
Cybersecurity is following the same trajectory.
The standard of care is typically expressed as a duty to maintain reasonable cybersecurity practices. Judge Learned Hand's negligence formula explained it clearly: a duty to take precautions exists if the burden of a safeguard is less than the probability of an accident times the gravity of harm.
In plain terms, if a security measure is cheaper than the expected loss from a breach, failing to implement it can be deemed negligent.
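To make that arithmetic concrete, here is a minimal sketch in Python. The safeguard cost, breach probability, and loss figures are hypothetical, not drawn from any case.

```python
# Hand formula: a precaution is expected when B < P * L, where B is the
# burden (cost) of the safeguard, P the probability of the incident,
# and L the gravity of the harm if it occurs.

def precaution_expected(burden: float, probability: float, loss: float) -> bool:
    """Return True when the safeguard costs less than the expected loss."""
    return burden < probability * loss

# Hypothetical numbers: a $60,000-per-year control versus a 5% annual chance
# of a breach carrying $4,000,000 in response, notification, and legal costs.
expected_loss = 0.05 * 4_000_000        # $200,000 expected annual loss
print(precaution_expected(60_000, 0.05, 4_000_000))  # True: skipping the control is hard to defend
```

The inputs will never be precise, but a court can run this arithmetic after the fact. You want records showing you ran it beforehand.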
This creates a documentation problem many higher education institutions haven't solved.
Most incident response plans focus on technical recovery. Containment, eradication, recovery, notification. The standard playbook.
What's missing is the pre-breach legal strategy.
You need three distinct layers of proof. First, show that you designed a program specifically for your unique risks. You didn't just copy a generic standard. Second, demonstrate that you actually implemented that design. Third, and this is where many institutions fall short, document that you are sustaining it.
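As a rough sketch of what those three layers can look like as evidence, consider a register like the one below. The control names and artifacts are hypothetical examples, not a required structure.

```python
# Hypothetical evidence register: for each control, artifacts that prove it
# was designed for our risks, actually implemented, and still being sustained.
evidence_register = {
    "vulnerability management": {
        "designed":    ["risk assessment citing research-data exposure",
                        "approved patching standard"],
        "implemented": ["scanner deployment records", "baseline scan report"],
        "sustained":   ["monthly patch variance reports",
                        "risk exceptions with expiration dates"],
    },
    "board oversight": {
        "designed":    ["charter assigning cyber risk to the audit committee"],
        "implemented": ["annual program briefing to the board"],
        "sustained":   ["minutes recording resource trade-off decisions"],
    },
}
```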
A thick binder of policies gathering dust on a shelf doesn't count as defensibility.
Real defensibility requires a living record of decision-making. When a regulator or opposing counsel looks for evidence of reasonable care, they want to see the friction of daily operations. If you claim you patch systems, show the variance reports. If you claim you have board oversight, show the minutes where you discussed a hard resource trade-off when prioritizing projects.
Reasonable care is the visible, documented trail of a team actively managing risk.
There's a dangerous instinct to avoid documenting what you didn't fund.
CIOs and CISOs may fear that writing down accepted risks creates ammunition for plaintiffs. The opposite is true.
If you tell the board you have a detection capability, but it's really just software running without staff to maintain it, you are building a trap. When that system fails, the argument won't be that you were unlucky. The argument will be that you claimed protection while knowing the structure was unsound.
It's safer to tell the board directly: "We are not building this yet. We don't have the resources to make it sturdy."
No university has the budget to fortify every perimeter. You have to make hard choices. Documenting those choices turns a security gap into a strategic decision.
When you document what you didn't fund and why you selected other priorities, you prove you analyzed the risk and made a conscious leadership decision to accept it. You are showing your work. This protects you because it proves the breach didn't happen from a lack of attention. It happened because you were operating within known, approved constraints.
That's the difference between negligence and reasonable risk management.
In medicine, the standard of care is defined by what a reasonable physician would do in similar circumstances. In cybersecurity, since we lack a universal certification board, our equivalent of peer consensus is published security frameworks.
Texas S.B. 2610 shields businesses from exemplary damages in data breach lawsuits if they implement and maintain a cybersecurity program that conforms to an industry-recognized framework. The law specifically mentions NIST, FedRAMP, and the ISO 27000 series.
Nineteen states have comprehensive data privacy statutes requiring reasonable security. At least six states have safe harbor laws that reward voluntary adoption of cyber best practices.
This isn't theoretical. Framework alignment has become literal legal armor.
Alignment requires more than reading a PDF and inserting references in your policies. You should translate generic standards into your specific environment and show how your security decisions satisfy multiple regulatory requirements simultaneously.
This is where planning tools become essential. For strategic planning, I built Cyber Heat Map to help show how capabilities support framework alignment and to simplify building a living roadmap.
If you need deeper compliance automation, platforms like Vanta, Drata, or Secureframe map your daily operations back to regulatory standards. For a framework approach you manage internally, the Unified Compliance Framework or Secure Controls Framework provide robust mapping structures.
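If you build the mapping yourself, even a lightweight crosswalk can show one operational control satisfying several obligations at once. The sketch below uses real framework names, but the clause references should be verified against current versions, and the control, owner, and evidence entries are hypothetical.

```python
# Hypothetical control crosswalk: each internal control lists the framework
# requirements it helps satisfy and the evidence that proves it is operating.
control_crosswalk = [
    {
        "control": "Centralized log collection with 12-month retention",
        "owner": "Security Operations",
        "satisfies": {
            "NIST CSF 2.0": "DE.CM (Continuous Monitoring)",
            "ISO/IEC 27002:2022": "8.15 (Logging)",
            "GLBA Safeguards Rule": "monitoring of authorized user activity",
        },
        "evidence": ["log pipeline design document", "quarterly retention check"],
    },
]

for entry in control_crosswalk:
    print(f'{entry["control"]} -> {", ".join(entry["satisfies"])}')
```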
Most incident response plans are static scripts designed to put out a fire. What's missing? The same element that many security programs overlook: a rigorous, documented commitment to post-incident analysis.
Courts and regulators understand systems fail. What they do not forgive is stagnation.
In the incident report templates I've built, the most valuable section is the analysis. We ask the hard, mechanical questions: Did the detection tools actually trip the wire? Did the containment walls hold? Did recovery happen as designed?
This is legal evidence. When you can show you identified a failure, figured out what broke, and fixed the process so it doesn’t happen again, you're showing reasonable care.
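One way to capture those questions is a structured post-incident record that ties each failure to a fix. This is a minimal hypothetical template, not a standard format.

```python
from dataclasses import dataclass, field

@dataclass
class PostIncidentReview:
    """Hypothetical post-incident analysis record linking failures to fixes."""
    incident_id: str
    detection_worked: bool            # did the tools actually trip the wire?
    containment_held: bool            # did the containment walls hold?
    recovery_as_designed: bool        # did recovery happen as designed?
    root_causes: list = field(default_factory=list)
    process_changes: list = field(default_factory=list)       # what we fixed
    backlog_items_raised: list = field(default_factory=list)  # roadmap follow-ups

review = PostIncidentReview(
    incident_id="IR-2025-014",
    detection_worked=False,
    containment_held=True,
    recovery_as_designed=True,
    root_causes=["alert rule disabled during maintenance and never re-enabled"],
    process_changes=["change checklist now requires re-enabling monitoring rules"],
    backlog_items_raised=["automated detection-coverage verification"],
)
```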
Defensibility isn't about being perfect. It's about being responsive.
Connect the incident directly to your project backlog. When something breaks, use the friction to justify adjusting investment priorities. Look at your capabilities and assess what they need. If something on your roadmap would have prevented this specific problem, you have data to justify moving it up the list. Explain how it would have helped.
This turns documentation into portfolio management. It stops being a compliance chore and becomes a tool for continuous, funded improvement.
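A small sketch of that linkage: when a backlog item is tied to a documented incident, its priority and justification update together. The item names and scores are hypothetical.

```python
# Hypothetical backlog re-scoring: promote roadmap items that would have
# prevented or reduced a documented incident.
backlog = [
    {"item": "MFA for legacy VPN", "priority": 3, "linked_incidents": []},
    {"item": "Detection-coverage verification", "priority": 5,
     "linked_incidents": ["IR-2025-014"]},
]

for item in backlog:
    if item["linked_incidents"]:
        item["priority"] = 1   # evidence now exists that this gap has a cost
        item["justification"] = "would have mitigated " + ", ".join(item["linked_incidents"])

for item in sorted(backlog, key=lambda i: i["priority"]):
    print(item["priority"], item["item"], item.get("justification", ""))
```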
Proofpoint's 2024 Voice of the CISO report found that 72% of CISOs would not join an organization without D&O liability insurance coverage. Separately, Fastly research found that 93% of organizations have introduced policy changes over the past 12 months to address rising CISO personal liability risks, including 41% increasing CISO participation in strategic decisions at the board level.
This is the professionalization of cybersecurity liability.
We CISOs wanted a “seat at the table.” Now that we have it, we realize it's a heavy responsibility to translate technical noise into business reality.
One non-negotiable skill for the next generation of information security leaders is the ability to narrate imperfection. The old mindset was the compliance auditor who wanted a dashboard full of green checkboxes. The new mindset is the risk advisor who is comfortable showing the board a messy, constrained reality.
Universities cannot afford to implement every potential cybersecurity solution. You must be skilled enough to explain why you chose Option A over Option B, what the trade-offs were, and why the remaining risk is acceptable.
This demands massive professional range. You need the technical chops to debate architecture with the CTO in the morning, and the strategic fluency to explain to a research dean in the afternoon how a secure data enclave helps them win more grants.
We are no longer just guarding the walls. We are helping the institution build them, knowing exactly where the mortar is strong and where it is weak.
Security work never finishes. That's an operational reality.
On the surface, the litigation threat seems to demand proof of completion. Proof that you did enough.
You can bridge these conflicting truths.
The legal standard does not demand that you “finish” your security program. It demands that you remain active.
Reasonable care means proof of an ongoing, rigorous process. Courts want to see the cycle: assessment, recommendation, discussion, and remediation. They want to see that you are continually communicating the current state and the current risks to the stakeholders who hold the checkbook.
We need to abandon the fantasy that CISOs are superheroes who shield organizations from all harm. That expectation is dangerous.
We are internal risk advisors. Our job is to help IT leaders, administrators, researchers, and the board understand where we stand today, what options we have to get safer, and what it will cost.
An accurate record of that journey is the bridge between operational reality and legal defensibility. You can't prove you reached the destination, because there isn't one. Instead, prove you steered the ship with competence, intention, and transparency.
Documenting that process is essential.
The litigation threat is real. But it's not insurmountable. The institutions that will survive it are the ones that understand that documenting their security program is not a burden. It's the strongest defense they have.

Start Your 30‑Day Diagnostic - $399
Build a data‑informed, board‑ready cybersecurity plan in 30 days.
Includes expert guidance, 30‑day access to the Cyber Heat Map® platform, and weekly group strategy sessions.
No long‑term commitment. Just results.
Secure your seat today.