When a Heartbeat Becomes Data: What Ghost Murmur Means for Internal Audit

In early April, the CIA confirmed something remarkable.

A classified system called Ghost Murmur detected a downed American airman's heartbeat from roughly 40 miles away in southern Iran. The technology combines quantum magnetometry sensors with artificial intelligence to isolate the electromagnetic signature of a single human heartbeat from environmental noise.

That sounds like a defense story. A rescue story.

But if you're responsible for assessing data privacy risk in your organization, it should make you pause.

Here's why.

The Architecture Already Exists in Your Environment

Ghost Murmur represents a category of technology that collects biometric data without the subject's knowledge, consent, or proximity. The sensor detects a biological signal at a distance measured in miles, processes it through AI to confirm identity and location, and delivers an actionable output.

Now remove the military context.

Your organization already deploys AI-driven monitoring tools that track employee behavior, communications, productivity, and engagement. Wearables collect heart rate, movement, and stress indicators. Session replay tools capture user interactions on websites and applications. AI-enhanced chatbots process conversational data in real time.

The underlying capability is identical: detect a signal the subject doesn't know is being collected, process it algorithmically, and act on it.

The governance gap between what these systems can do and what your organization has documented, disclosed, and controlled is the risk we need to address.

We're Measuring Risk Wrong

Most internal audit risk assessments measure two things: impact and likelihood. That framework worked when the risk landscape moved at a pace governance structures could match.

It doesn't work now.

Risk velocity measures how fast a risk materializes and how quickly your organization feels its effects. The IIA Standards reference risk-based planning under Standard 9.1, but most audit shops still don't operationalize velocity as a distinct metric in their risk scoring.

Consider how fast the data privacy landscape moved in 2026 alone:

CPRA cybersecurity audit mandates took effect January 1, 2026. California now requires annual, independent cybersecurity audits for businesses processing 250,000+ personal information records with gross revenue above $26.625 million. Fines run up to $7,988 per violation. Enforcement is active.

The DOJ Data Security Program went operational. Bulk sensitive data transfers involving countries of concern are now a regulated risk category with affirmative due diligence, access control, and monitoring obligations.

AI governance became enforceable. California's Privacy Protection Agency finalized regulations covering automated decision-making technology, cybersecurity audits, and risk assessments. Organizations deploying AI must demonstrate documented processes, controls, and accountability.

Geopolitical risk jumped from 26% to 45% as a top CAE concern in the IIA Risk in Focus 2026 report. That's the largest single-year increase in the study's history.

Each of these developments reached operational impact within months. Not years.

That's velocity.

Three Questions Internal Audit Teams Should Ask Now

What biometric and behavioral data is your organization collecting, and does governance keep pace with capability?

Many organizations adopted AI-powered monitoring tools during and after the pandemic without updating their data privacy impact assessments. The 2026 IIA Focus on the Future report found that only 25% of internal audit teams actively use AI in their work.

That means 75% have limited firsthand understanding of the tools their organizations are deploying.

You can't audit what you don't understand.

Is your risk assessment methodology capturing velocity, or just impact and likelihood?

Adding velocity to your risk scoring doesn't require a new framework. It requires two additional questions in every risk discussion: "How fast can this happen to us?" and "At what point will we feel it?"

For data privacy risks specifically, the answer to both questions is increasingly "faster than our current controls can respond."
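One lightweight way to operationalize this is to score velocity on the same 1-to-5 scale as impact and likelihood and let it amplify the traditional composite. The sketch below is illustrative only; the weighting formula is an assumption, not an IIA-prescribed methodology:

```python
from dataclasses import dataclass

@dataclass
class RiskScore:
    """Risk rating with velocity as a standing third dimension (all 1-5)."""
    impact: int       # how severe if the risk materializes
    likelihood: int   # how probable over the assessment horizon
    velocity: int     # how fast the organization feels the effects

    def composite(self) -> float:
        # Illustrative weighting: velocity scales the traditional
        # impact x likelihood product, so a velocity of 1 leaves the
        # score unchanged and a velocity of 5 doubles it.
        return (self.impact * self.likelihood) * (1 + (self.velocity - 1) / 4)

# Two risks with identical impact and likelihood: the fast-moving one
# should rank higher on the audit plan.
slow = RiskScore(impact=4, likelihood=3, velocity=1)   # composite 12.0
fast = RiskScore(impact=4, likelihood=3, velocity=5)   # composite 24.0
assert fast.composite() > slow.composite()
```

The point of the formula is not precision; it is that two risks that look identical under impact-times-likelihood now separate cleanly, which is exactly the discussion the two questions above are meant to provoke.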

Are you treating data privacy as a standalone compliance topic or as a horizontal risk theme?

KPMG's 2026 Risk in Focus analysis lists data governance and privacy as the number one key risk area for internal auditors—ahead of AI disruption and cybersecurity.

The firms that treat privacy as a checkbox exercise (one audit, one year, move on) are the firms that will be caught off guard when a regulatory inquiry or breach event materializes at velocity.

What We're Doing About It

The Ghost Murmur story is dramatic. A heartbeat detected from 40 miles away in a desert. A rescue that hinged on sensing a signal the subject couldn't control or conceal.

But the principle scales down to every organization.

Data is being collected at distances and depths that subjects (employees, customers, users) don't fully understand. AI is processing that data faster than governance structures can adapt. Regulatory enforcement is accelerating.

Internal audit's role here isn't to slow technology adoption. It's to make sure your organization's awareness, documentation, and controls keep pace with what the technology actually does.

That starts with three moves:

Add velocity to your risk assessment scoring. Make it a standing dimension alongside impact and likelihood. Score it on a 1-to-5 scale. Update it quarterly.

Conduct a data privacy inventory that reflects actual capability, not just stated policy. Map every tool collecting personal, biometric, or behavioral data. Compare what the tool can do to what the privacy notice discloses.

Embed data privacy into every technology audit, not as a separate engagement. Privacy isn't a silo. It's a horizontal risk that touches AI governance, cybersecurity, vendor management, and regulatory compliance simultaneously.
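The inventory comparison in the second move reduces to a set difference: for each tool, what it can collect minus what the privacy notice discloses is the gap to remediate. The tool names and data categories below are hypothetical, a minimal sketch of the exercise rather than a real inventory:

```python
# Hypothetical inventory: what each tool CAN collect vs. what the
# privacy notice actually discloses for it.
capability = {
    "wearable_platform": {"heart_rate", "movement", "stress", "location"},
    "session_replay":    {"clicks", "keystrokes", "form_input"},
}
disclosed = {
    "wearable_platform": {"heart_rate", "movement"},
    "session_replay":    {"clicks"},
}

def disclosure_gaps(capability: dict, disclosed: dict) -> dict:
    """Return, per tool, the data types it can collect but does not disclose."""
    return {
        tool: sorted(fields - disclosed.get(tool, set()))
        for tool, fields in capability.items()
        if fields - disclosed.get(tool, set())
    }

for tool, gap in disclosure_gaps(capability, disclosed).items():
    print(f"{tool}: undisclosed -> {', '.join(gap)}")
```

Every non-empty entry in the output is a finding: a capability the organization has deployed but has not documented or disclosed, which is precisely the governance gap described above.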

The Pattern You Need to See

Military and intelligence capabilities consistently migrate into commercial applications within 3-7 years of declassification or public acknowledgment.

Quantum sensing, AI-driven biometric detection, and remote signal processing are no longer experimental. They're operational and beginning commercial translation.

Organizations that dismiss these as "defense-only" technologies will face governance gaps when vendors embed similar capabilities into workplace monitoring, customer analytics, or security systems.

Risk doesn't stay still. Technology doesn't wait for your audit plan.

The organizations that treat data privacy as a velocity problem, not a compliance checkbox, are the ones that will be positioned to respond when the next Ghost Murmur equivalent arrives in the commercial sector.

It's not a question of if.

It's a question of how fast.

Mike Levy is CEO and Managing Principal of Cherry Hill Advisory, a global practitioner-built internal audit and risk advisory firm. He is a Big Four alumnus and former CAE.
