As of late April 2026, the House Financial Services Committee is actively discussing “additions” to the Gramm-Leach-Bliley Act.
A “GLBA 2.0” moment is arriving because the state‑by‑state privacy patchwork has finally collided with the operational reality of how national lenders use, store, and protect financial data. The current Hill conversations are really about whether Congress can impose a single federal standard for access, deletion, and data minimization without breaking the Safeguards machinery that keeps the financial system running.
The New GLBA Debate in DC
The House Financial Services Committee has spent early 2026 openly examining how to “modernize” Title V of GLBA for a data‑driven, app‑driven financial ecosystem. Members and staff are balancing three goals: more consumer control, clearer obligations for institutions, and relief from the growing state privacy maze.
Recent discussion drafts and Republican‑backed bills would explicitly make Title V the uniform national standard for financial data privacy and security, with broad preemption of stricter state rules in this space. The pitch to lenders is simple: trade a tighter federal privacy regime for freedom from dozens of divergent state frameworks.
The State Patchwork Problem
In 2026, roughly 20 states have comprehensive privacy laws either in effect or set to take effect, with new statutes in Indiana, Kentucky, and Rhode Island joining earlier regimes in places like California, Colorado, Texas, and Connecticut. Each law brings its own spin on access, deletion, opt‑out rights, and definitions—often with different thresholds, exemptions, and timelines.
Most of these laws nominally “exempt” GLBA‑regulated data or GLBA‑regulated entities, but that exemption is far from clean in practice. As lenders expand into fintech‑style services, data aggregation, and ancillary revenue streams, more of their data falls outside the classic GLBA bucket and lands squarely under these state laws, where it must be handled differently from traditional customer information.
For a national lender, this produces three concrete headaches:
- Different rights: California’s deletion rules are not identical to those in Virginia‑style statutes; some states emphasize opt‑outs, others lean into impact assessments or special treatment of financial data.
- Different scopes: “Consumer,” “sale,” and “sensitive data” are defined differently, which changes who is covered and what data falls inside the regime.
- Different timelines and procedures: Cure periods, response times, and required notices vary, forcing lenders into a tangle of overlapping workflows.
At scale, this means multiple versions of privacy logic baked into systems, multiple flavors of consumer rights, and overlapping inventories of where data lives—none of which maps neatly onto a single GLBA compliance program. That is the pressure point pushing Congress toward a GLBA 2.0.
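To make the "multiple versions of privacy logic" point concrete, here is a minimal sketch of how a compliance engineering team might encode per‑state request parameters. The state names are real, but every value below is an illustrative placeholder, not an actual statutory deadline, and the `StateRights` model is a hypothetical simplification: real statutes diverge along many more dimensions than four fields can capture.

```python
from dataclasses import dataclass

# Hypothetical, simplified per-state rights parameters.
# Values are illustrative placeholders, not legal advice.
@dataclass(frozen=True)
class StateRights:
    deletion_right: bool
    opt_out_of_sale: bool
    response_days: int       # deadline to answer a consumer request
    cure_period_days: int    # grace period before enforcement (0 = none)

STATE_RULES = {
    "CA": StateRights(deletion_right=True, opt_out_of_sale=True,
                      response_days=45, cure_period_days=0),
    "VA": StateRights(deletion_right=True, opt_out_of_sale=True,
                      response_days=45, cure_period_days=30),
    "TX": StateRights(deletion_right=True, opt_out_of_sale=True,
                      response_days=45, cure_period_days=30),
}

def response_deadline_days(state: str, default: int = 45) -> int:
    """Return the response window a request workflow must honor."""
    rule = STATE_RULES.get(state)
    return rule.response_days if rule else default
```

Even this toy version shows the problem: every new state statute means another row, another review of the defaults, and another regression test against existing workflows. A single federal rubric would collapse the table to one row.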
Access vs Deletion: What Consumers Want
Every modern privacy law tries to give people two intuitive levers: “Show me what you have” and “Delete what you don’t really need.” Access and deletion rights are easy to explain in public hearings and press releases, and they resonate with consumers who assume their financial data is being monetized in opaque ways.
In a GLBA 2.0 context, “access” will likely expand beyond the classic privacy notice to include:
- A clear list of what categories of financial data a firm holds.
- At least some ability to see specific items, especially when used for profiling, underwriting, or targeted marketing.
Deletion is trickier. Pure consumer‑centric deletion would allow people to say: “Delete everything you don’t absolutely need to keep, and stop using what you keep for marketing or analytics.” But in financial services, that runs into a wall of practical constraints: regulatory retention rules, fraud monitoring, dispute resolution, audit obligations, and basic risk management all depend on historical data.
The debate in Congress is not whether consumers should have these rights, but how far they can go before they undercut the safety and soundness of lending and payments systems.
The Operational Reality: Why “Just Delete It” Doesn’t Work
For lenders and other financial institutions, wholesale deletion on demand is rarely possible without undermining other laws or core controls. At a minimum, financial institutions have to keep certain records to:
- Comply with record‑keeping rules (e.g., anti‑money laundering, fair lending, tax, and audit obligations).
- Investigate fraud, respond to chargebacks or disputes, and defend against litigation.
- Demonstrate compliance to regulators—something that inherently requires historical evidence.
Well‑designed state laws already nod to this by carving out exceptions where data must be retained to comply with legal obligations or to detect security incidents. But when you multiply slightly different exceptions across 20+ states, you get a maze of “delete here, retain there, but only for these purposes, and only in this jurisdiction.”
From a practitioner’s perspective, the frontline friction shows up in three places:
- Systems design: You need data architectures that can honor a deletion request for some uses (say, targeted marketing) while preserving the same records in immutable audit logs.
- Vendor chains: Cloud providers, data brokers, fraud vendors, and core processors each implement rights differently, creating gaps between what your policy promises and what your tech stack can actually do. Data processing agreements help close those gaps on paper, but they add their own layer of negotiation and monitoring overhead.
- Identity proofing: Before you delete or disclose, you must be sure the requester is the right person; in a high‑value financial context, misdirected access or deletion can itself constitute a security incident.
Any GLBA 2.0 regime that ignores this operational layer will either be ignored in practice or will generate a wave of technical debt and enforcement risk.
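The systems‑design friction described above can be sketched as purpose‑scoped deletion: a request removes the purposes a consumer can revoke while leaving legally required purposes intact. This is a minimal illustration under assumed names; the purpose labels, the record model, and the split between "deletable" and "retention‑required" categories are all hypothetical, and a production system would tie those categories to an actual records‑retention schedule.

```python
from dataclasses import dataclass, field

# Hypothetical purpose categories. A real program would derive these
# from its retention schedule and applicable law, not hard-code them.
RETENTION_REQUIRED = {"aml_recordkeeping", "fraud_monitoring", "audit_log"}
DELETABLE_ON_REQUEST = {"targeted_marketing", "analytics", "personalization"}

@dataclass
class CustomerRecord:
    customer_id: str
    purposes: set = field(default_factory=set)

def apply_deletion_request(record: CustomerRecord) -> set:
    """Strip deletable purposes from a record; keep required ones.

    Returns the set of purposes actually removed, so the response to
    the consumer can disclose what was deleted and what was retained.
    """
    removed = record.purposes & DELETABLE_ON_REQUEST
    record.purposes -= removed
    # Anything in RETENTION_REQUIRED survives; the underlying data
    # stays available for those purposes only (e.g., immutable audit
    # storage), not for marketing or analytics.
    return removed
```

The design choice worth noting: deletion here is a change in permitted *use*, not necessarily a physical purge, which is how a firm can honor a request for marketing purposes while preserving the same records for AML and audit obligations.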
Data Minimization vs Safeguards Rule
Data minimization (collect less, keep less, use it only for what is necessary) has become the normative backbone of modern privacy statutes. GLBA, by contrast, grew up around notice and opt‑out, with the FTC’s Safeguards Rule focusing on how data is secured and governed rather than whether it should exist in the first place.
The FTC’s updated Safeguards Rule now requires covered financial institutions to implement a written, risk‑based information security program, with concrete expectations around access controls, encryption, logging, testing, and oversight of service providers. It also adds breach notification obligations for significant incidents, further raising the stakes for poor security. But the Safeguards Rule does not, on its own, embed a strong data‑minimization mandate; it assumes the data set and tells you how to protect it.
Introducing a federal data‑minimization standard into GLBA raises a few tensions:
- Scope conflict: Minimization tells you to avoid collecting or retaining “unnecessary” data, while Safeguards pushes you to inventory, monitor, and protect what you have, often encouraging more detailed logging and telemetry.
- Logging paradox: Security best practices (like detailed security logs and device telemetry) can massively expand the data you hold about an individual; strict minimization could be read to cut against that.
- Retention vs. risk: Minimization will push you toward shorter retention windows, but fraud analytics and fair lending models often depend on long‑term data sets.
A well‑crafted GLBA 2.0 could reconcile this by defining “necessity” in terms of both consumer expectations and regulatory obligations. In other words, you collect and keep what you reasonably need to deliver the product, comply with the law, and prevent harm; then the Safeguards Rule tells you how to secure that trimmed‑down dataset.
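The "necessity" reconciliation described above can be expressed as a simple retention rule: keep a data category only as long as the longest applicable obligation or a documented business need, whichever is greater, then delete. The retention periods below are illustrative assumptions, not actual statutory schedules, and `REGULATORY_RETENTION_YEARS` is a hypothetical lookup invented for this sketch.

```python
# Hypothetical retention obligations by category; periods are
# illustrative, not actual statutory retention schedules.
REGULATORY_RETENTION_YEARS = {
    "aml": 5,
    "tax": 7,
    "fair_lending": 2,
}

def retention_years(obligations: list[str], business_need_years: int = 0) -> int:
    """Compute a minimization-compliant retention window.

    Data is kept for the longest applicable regulatory obligation or
    a documented business need, whichever is greater; after that, the
    minimization principle says it should be deleted.
    """
    regulatory = max(
        (REGULATORY_RETENTION_YEARS.get(o, 0) for o in obligations),
        default=0,
    )
    return max(regulatory, business_need_years)
```

Under this framing, minimization and the Safeguards Rule stop fighting: minimization bounds *how long* the dataset exists, and Safeguards governs *how* it is protected while it does.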
How a Federal Standard Could Actually Help
Despite the friction, a thoughtful federal minimization and rights framework could streamline life for lenders if implemented as a true ceiling rather than a mere floor.
Here is what that might look like in practice:
- One set of rights, one set of exceptions: A single federal rubric for access, deletion, and correction, plus clearly articulated retention exceptions tied to other federal and state financial laws, would allow institutions to design uniform processes and systems nationwide.
- Harmonized definitions: If GLBA 2.0 standardizes key privacy concepts for the financial sector (consumer, sensitive data, sale/share, profiling), lenders can draft a single set of privacy notices, a single playbook for data inventories, and a single core vendor contract template.
- Aligned with Safeguards: Embedding data minimization into GLBA while cross‑referencing the Safeguards Rule could encourage firms to (1) shrink the attack surface by collecting less, and (2) secure what remains to a clear, risk‑based standard.
From a practitioner’s standpoint, the real efficiency gains come if Congress is willing to preempt inconsistent state privacy obligations governing GLBA‑regulated data and GLBA‑regulated entities, including newer data‑rich activities like aggregation and alternative underwriting. Without strong preemption, national lenders will still be forced to run a dual system: the federal baseline plus bespoke state add‑ons.
Listen, I am a compliance attorney; this is how I earn a living. But even I think the complexity is overly burdensome. Sure, your large industry giants can absorb the associated costs, but the regulatory burden can bury small, innovative start-ups. I have seen this in consultations with potential clients. They want to do the right thing, but it can be a gut punch when they see everything that is required of them.
A Practitioner’s Take: “Humble” Laws in a High-Tech World
GLBA was, by design, a humble statute: short, principle‑based, and heavily delegated to regulators for implementation. It assumed a world of branch banking and paper statements, not mobile super‑apps, embedded finance, and constant data sharing with third‑party platforms.
More than twenty‑five years on, that humility has proven to be both the statute’s strength and its liability:
- Strength, because its general principles (limit sharing, protect security, give notice) still map reasonably well onto modern data flows.
- Liability, because the statute does not speak in the vocabulary that now defines privacy debates: minimization, dark patterns, algorithmic profiling, and automated decision‑making.
The current legislative push is best understood as an attempt to retrofit a modest, late‑90s law onto a world of APIs, behavioral scoring, AI, and cross‑platform identities. From my perspective, that will only work if Congress is honest about three realities:
- You cannot offer meaningful deletion rights in finance without clearly acknowledging other laws requiring retention and the risk controls that require historical data.
- You cannot demand aggressive minimization and, at the same time, penalize institutions for failing to detect complex fraud patterns that are only visible in long‑term data.
- You cannot promise uniformity to lenders while preserving a free‑for‑all of state‑level experimentation on the same financial data sets.
GLBA 2.0 does not need to become a sprawling, tech‑specific codebook. A carefully updated, still “humble” statute, anchored by robust rulemaking authority, could establish crisp principles, uniform rights, contextual minimization, and aligned security, while allowing regulators to iterate as technology and threats evolve. The real test will be whether Congress is willing to trade some state flexibility for a genuinely coherent national framework that both consumers and lenders can live with. So we are relying on Congress being able to work collaboratively to develop or update the law in a fairly contentious area with multiple divergent interests. I won’t hold my breath, but I will look with hope and positivity.
If you were advising a national lender today, would you rather see GLBA 2.0 aggressively preempt state laws, or preserve some space for states to experiment at the margins of financial privacy? There is no absolutely right or wrong answer. Discussing this is how we come to a rational decision…which may be the biggest obstacle to overcome.