
Coinbase's $400 Million Wake-Up Call: Why DLP Must Monitor Behavior, Not Just Content


In May 2025, Coinbase disclosed a data breach that exposed nearly 70,000 customer records—not through a sophisticated external attack, but through bribed customer service agents. The cryptocurrency exchange refused a $20 million ransom demand and instead pledged that amount toward catching those responsible. One arrest has been made in India, but the incident highlights a fundamental problem in modern security: your people can become your greatest vulnerability.

The estimated $400 million cost of this breach should concern every security professional. But the real lesson isn't about the financial impact—it's about the blind spots in how we think about data loss prevention.

The Real Problem: Behavior, Not Just Patterns

Most organizations approach data loss prevention by looking for known patterns: PII, PCI, PHI, secrets. This made sense when the primary concern was compliance. But Coinbase's breach exposes a different reality: the most valuable assets leaving your organization are intellectual property and confidential data that don't fit neat regulatory definitions.

Customer service agents at Coinbase had legitimate access to customer information. They could view names, addresses, government IDs, Social Security numbers, and bank details as part of their normal workflow. Traditional DLP solutions would see this as authorized activity. Pattern matching would find nothing wrong.

The problem wasn't what they accessed—it was what they did with it, and where it went next.

Where Data Actually Leaves

Consider how an insider actually exfiltrates data in 2025:

Personal accounts become the exit path. An employee doesn't download a CSV file and email it to a competitor. They paste customer details into a personal Gmail draft. They upload a spreadsheet to their personal Dropbox "just to work on it at home." They copy information into ChatGPT or Claude to "help write better support responses."

Code doesn't leave as files. For organizations protecting source code or proprietary algorithms, the threat isn't someone downloading your entire codebase. It's incremental changes pushed to a personal GitHub repository. It's snippets pasted into AI coding assistants. Git commits don't look like data exfiltration to traditional DLP tools—they look like normal development activity.
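To show what that kind of check might look like in practice, here is a minimal Python sketch that lists git remotes pointing anywhere other than sanctioned hosts. The approved-host list, repo path, and function name are illustrative assumptions, not a description of how any particular DLP product inspects pushes.

```python
import subprocess

# Illustrative only: in a real deployment the approved hosts would come from
# policy, not a hard-coded tuple.
APPROVED_GIT_HOSTS = ("github.com/example-corp/", "git.example-corp.com")

def unapproved_remotes(repo_path: str) -> list[str]:
    """List remotes in a local repo that don't point at a sanctioned host."""
    out = subprocess.run(
        ["git", "-C", repo_path, "remote", "-v"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each line looks like: "origin  https://github.com/user/repo.git (fetch)"
    remotes = {line.split()[1] for line in out.splitlines() if line.strip()}
    return [r for r in remotes if not any(h in r for h in APPROVED_GIT_HOSTS)]
```

The value of a check like this isn't the snippet itself; it's that the signal lives in where the code is going, not in the content of the diff.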

Physical vectors still matter. While we focus on cloud and AI, USB drives and print activities remain effective data transfer methods precisely because they're often unmonitored.

The Coinbase incident involved customer service agents who could view sensitive data as part of their job. The question isn't whether they accessed the data—it's whether their behavior suggested they were doing something inappropriate with it.

Identity Context: The Missing Layer

Here's what would have been visible with the right monitoring in place:

A customer service agent accesses customer records during their shift. Normal. They access the same records from a personal account after hours. Not normal. They're suddenly using multiple browsers. They're accessing customer data, then immediately switching to personal cloud storage or AI tools. They're printing far more customer records than their peers.

None of these individual actions trips a traditional DLP alert. But the pattern tells a story.

This is what identity context provides: the ability to distinguish between corporate and personal accounts, to understand user roles and groups, to establish baselines for normal behavior and spot deviations. When you know who is accessing what, from where, using which tools, behavioral anomalies become visible.
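As a rough illustration of per-user baselining, here is a minimal Python sketch. The event fields, thresholds, and helper names are assumptions made for the example, not Nightfall's actual data model or detection logic.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical event schema for illustration purposes only.
@dataclass
class AccessEvent:
    user: str
    account_type: str   # "corporate" or "personal" -- assumed labels
    tool: str           # e.g. "crm", "gmail_personal", "chatgpt"
    hour: int           # 0-23, local time of the access

def baseline(history: list[AccessEvent]) -> dict:
    """Summarize one user's past behavior: how often they touch personal
    accounts, work after hours, and which tools they normally use."""
    total = max(len(history), 1)
    return {
        "personal_rate": sum(e.account_type == "personal" for e in history) / total,
        "after_hours_rate": sum(e.hour < 8 or e.hour > 18 for e in history) / total,
        "tools": Counter(e.tool for e in history),
    }

def deviations(new_events: list[AccessEvent], base: dict) -> list[str]:
    """Flag events that break the user's own baseline.
    Thresholds are illustrative, not tuned values."""
    flags = []
    for e in new_events:
        if e.account_type == "personal" and base["personal_rate"] < 0.05:
            flags.append(f"personal-account activity via {e.tool}")
        if (e.hour < 8 or e.hour > 18) and base["after_hours_rate"] < 0.05:
            flags.append(f"after-hours access at {e.hour}:00 via {e.tool}")
        if e.tool not in base["tools"]:
            flags.append(f"first-seen tool for this user: {e.tool}")
    return flags
```

The point isn't the specific thresholds; it's that the same record access can be benign or suspicious depending on whose account, which tool, and what hour it happens in.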

Watch this short demo on Nightfall's personal vs. corporate session differentiation for a deeper dive.

How Nightfall Addresses Insider Threats

Preventing breaches like Coinbase's requires visibility into how employees actually work—and where data actually flows. This is what Nightfall is built to provide:

Personal cloud and AI account monitoring. Nightfall monitors when employees use personal Gmail, Dropbox, OneDrive, or AI tools like ChatGPT to handle work data. Not just "did they upload something," but "what did they upload, when, and does it align with their normal behavior?" The platform tracks activity across corporate and personal accounts, making it visible when an employee starts moving sensitive data to unauthorized locations.
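A simplified sketch of the corporate-versus-personal distinction might look like the following. The domains, account check, and labels are placeholders chosen for the example, not Nightfall's detection logic.

```python
from urllib.parse import urlparse

# Illustrative only: real deployments would draw these from identity provider
# data and policy, not hard-coded sets.
CORPORATE_DOMAINS = {"mail.example-corp.com", "drive.example-corp.com"}
PERSONAL_SAAS = {"mail.google.com", "www.dropbox.com", "onedrive.live.com", "chatgpt.com"}

def classify_destination(url: str, logged_in_account: str) -> str:
    """Label where data is headed: a sanctioned corporate tenant, a known
    personal/consumer service, or something unrecognized."""
    host = urlparse(url).netloc.lower()
    if host in CORPORATE_DOMAINS and logged_in_account.endswith("@example-corp.com"):
        return "corporate"
    if host in PERSONAL_SAAS or not logged_in_account.endswith("@example-corp.com"):
        return "personal"
    return "unknown"

# Example: an upload to Dropbox under a personal login gets labeled "personal"
print(classify_destination("https://www.dropbox.com/upload", "jane.doe@gmail.com"))
```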

Multi-browser and AI-native tool visibility. Modern work happens across Chrome, Edge, Safari, Arc, and within AI assistants and copilots. Nightfall monitors across all of these environments, ensuring that security teams can see what employees are accessing and what they're doing in AI-native tools—the primary channels for data movement that traditional DLP solutions miss entirely.

Physical data transfer tracking. Nightfall tracks USB drive usage and printing activity, correlating these physical exfiltration vectors with digital behavior. When a customer service agent suddenly starts printing volumes of customer records or copying data to USB drives, these actions are captured and correlated with their other activities across the environment.

Identity context for behavioral analysis. This is where Nightfall's approach differs fundamentally from traditional DLP. The platform provides deep identity context—distinguishing between corporate versus personal accounts, tracking users across roles and groups, and establishing behavioral baselines. When that Coinbase customer service agent accessed records from a personal account, or switched between corporate and personal browsers while handling customer data, Nightfall would surface these anomalies.

AI-powered investigation with Nyx. The challenge with comprehensive behavioral monitoring is volume—every employee generates thousands of events daily. Nightfall's AI copilot, Nyx, operates directly within the Nightfall console to solve this problem. Rather than overwhelming security teams with alerts, Nyx correlates events across users, tools, and time to surface genuine risks. When suspicious patterns emerge—like an agent accessing customer records, then immediately switching to personal cloud storage, then printing those same records—Nyx identifies the correlation, provides full context about the user's behavior across all monitored channels, and accelerates investigation by surfacing the most relevant information that security teams need to make decisions. Security analysts can query Nyx directly within the console to understand data access patterns, investigate anomalies, and get answers in seconds rather than hours.
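To make the correlation idea concrete, here is a toy Python sketch that raises a single alert when a risky sequence of events occurs within a short window. The channels, sequence, and window length are illustrative assumptions, not how Nyx is implemented.

```python
from datetime import datetime, timedelta

# Toy event stream of (timestamp, user, channel); the channels mirror the
# scenario described above and are assumptions for this example.
events = [
    (datetime(2025, 5, 1, 21, 2), "agent_17", "crm_record_view"),
    (datetime(2025, 5, 1, 21, 4), "agent_17", "personal_cloud_upload"),
    (datetime(2025, 5, 1, 21, 9), "agent_17", "bulk_print"),
]

SEQUENCE = ["crm_record_view", "personal_cloud_upload", "bulk_print"]
WINDOW = timedelta(minutes=15)

def correlate(events, user, sequence=SEQUENCE, window=WINDOW):
    """Return True if the user performed the risky sequence, in order,
    within the correlation window."""
    user_events = sorted((t, c) for t, u, c in events if u == user)
    idx = 0
    start = None
    for t, channel in user_events:
        if start is not None and t - start > window:
            idx, start = 0, None        # window expired: restart matching
        if channel == sequence[idx]:
            if idx == 0:
                start = t               # anchor the window on the first step
            idx += 1
            if idx == len(sequence):
                return True
    return False

# One correlated alert instead of three low-signal events
print(correlate(events, "agent_17"))  # -> True
```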

Building Defense in Depth for Insider Threats

The Coinbase breach is a reminder that insider threats require a different approach than external attacks:

Visibility must extend beyond the corporate perimeter. Your employees work in personal accounts, use AI tools, switch between devices. Your DLP solution needs to see all of it.

Context matters more than content. Pattern matching finds known sensitive data. Behavioral analysis finds suspicious activity with any data, including proprietary information that doesn't match PII patterns.

Physical and digital vectors both need monitoring. USB drives, printing, screen sharing—these aren't legacy concerns. They're active exfiltration methods.

Human analysis needs AI assistance. Security teams are overwhelmed. The solution isn't more alerts—it's better correlation and prioritization that surfaces what actually matters.

Coinbase's $400 million lesson shouldn't be that insider threats are inevitable. It should be that traditional approaches to data loss prevention aren't built for how work actually happens in 2025—or how insiders actually operate when they turn malicious.

The question isn't whether your employees can access sensitive data. It's whether you can see what they're doing with it across every tool, account, and device they use. And whether you can spot the behavioral patterns that indicate a problem before the data walks out the door.

See how Nightfall provides comprehensive visibility across cloud applications, AI tools, and physical data transfer vectors in a personalized demo here.
