Signals & Safeguards Issue 5 • Wednesday, April 15, 2026

A concise weekly scan of surveillance, privacy, cybersecurity, and the safeguards public officials should keep in view.


At a glance

  • Sensitive government data systems are expanding in troubling directions at the same time oversight fights are intensifying.
  • Commercial and government surveillance tools continue to blur lines that once required warrants, audits, or clearer public approval.
  • Federal officials are still pushing major surveillance authorities even as new compliance concerns come into view.

Main stories

Federal personnel agency seeks broad access to workers’ medical records

The Office of Personnel Management is seeking personally identifiable medical and pharmacy claims data for millions of federal workers, retirees, and family members covered by federal health plans. Reporting says the proposal could give OPM access to highly sensitive information about prescriptions, diagnoses, and treatment patterns, and experts have raised serious legal and privacy questions about whether such disclosures are justified or even permissible. For a broader overview of the same issue, see CBS News' reporting.

Why it matters: when government asks for deeper access to health information, the question is not only whether the data might be useful, but whether the access is necessary, proportionate, and secure. That concern is sharper here because OPM itself was at the center of the massive 2015 breach that exposed records tied to roughly 22 million people.

Why it matters for Bend: local officials should treat sensitive health or benefits data as high-risk by default. The safest rule is simple: if a public agency cannot clearly explain why it needs detailed personal data, it should not collect it.

ICE confirms use of Graphite spyware

ICE has acknowledged using Paragon’s Graphite spyware, a tool reportedly capable of accessing communications on targeted devices, including encrypted apps. House Democrats are now demanding answers about the legal basis, safeguards, and procurement history behind its use.

Why it matters: this is not just another surveillance-software headline. It is a reminder that strong encryption can still be bypassed when authorities gain access to the device itself. The policy question is no longer hypothetical. It is whether government agencies are using highly intrusive tools under rules the public can actually see and contest.

Why it matters for Bend: surveillance debates should not focus only on local cameras and sensors. Device exploitation tools can be just as consequential, and officials should ask what oversight exists before trusting assurances that a tool will be used narrowly.

California opens a fusion center audit

California lawmakers approved an audit of several state fusion centers after privacy and civil-liberties concerns, including allegations that information sharing may have reached beyond appropriate bounds. The audit is important because fusion centers are often described in abstract terms even though they can shape how intelligence, law enforcement, and immigration-related data move across institutions.

Why it matters: oversight is most useful where systems are complicated, multi-agency, and hard for the public to see. Fusion centers sit exactly in that category. If data-sharing systems are lawful and well-governed, audits should help show that. If they are not, audits may be one of the few ways the public finds out.

Why it matters for Bend: information-sharing arrangements should never be treated as purely technical back-office systems. They are governance systems, and they deserve policy scrutiny before they become routine.

Section 702 faces renewed pressure despite major compliance concerns

The debate over Section 702 has intensified again as Congress faces a looming deadline and critics warn that serious compliance problems remain unresolved. Senator Ron Wyden says the Foreign Intelligence Surveillance Court found major compliance issues tied to Americans’ constitutional rights, while civil-liberties advocates are warning against a clean extension with no meaningful reforms.

Why it matters: Section 702 is often defended as a foreign intelligence authority, but the recurring fight is about what happens when Americans’ communications are swept in and later queried. The core safeguards question is whether a powerful surveillance authority should continue when the public still lacks a full picture of how serious the compliance failures are.

Why it matters for Bend: federal surveillance rules shape the broader privacy environment local governments operate inside. When higher-level safeguards weaken, local restraint matters more, not less.

Shared pattern: this week’s strongest stories all point the same way, toward more access to sensitive information, more powerful surveillance tools, and more pressure to normalize those powers before the public fully understands the costs.


Signals


These items matter because they suggest where surveillance systems, legal theories, and accountability fights may be heading next. Some of the most important changes do not arrive as one dramatic announcement. They arrive through contracts, court fights, pilot programs, and legal carveouts that make the next expansion easier.

Nevada quietly signs up for Fog location tracking

AP reports that Nevada signed a contract with Fog Data Science allowing police to query location data derived from smartphone apps, with more than 250 queries a month permitted under the arrangement. The tool can reportedly reveal “patterns of life,” including where people sleep, work, travel, and associate.

Why it matters: this is the commercial-data loophole in practical form. The issue is not just whether police can track location. It is whether they can buy access to location data in ways that sidestep the warrant rules people assume still apply.

Webloc shows how ad-tech data becomes mass surveillance

Citizen Lab says a system called Webloc uses advertising-derived location data to monitor hundreds of millions of people and that customers include U.S. government and law-enforcement entities. The report suggests that app and ad ecosystems are continuing to feed a surveillance market far more powerful than many users realize.

Why it matters: the deeper lesson is that surveillance power does not require a visible camera on every corner if commercial data markets already map movement at scale. That makes procurement, data brokers, and app ecosystems part of the same policy conversation.

Data can be copied faster than rights can be restored.

Deleted Signal messages were still recoverable through iPhone notifications

404 Media reported, and The Verge summarized, that the FBI was able to recover incoming Signal message content from an iPhone’s notification database even after the app had been deleted. The important lesson is not that Signal encryption failed. It is that privacy can break at the edges when operating systems store previews and logs users do not expect.

Why it matters: secure tools are only as private as the surrounding defaults. Notification previews, backups, and system-level logs can quietly create a second path to sensitive information.

Richmond’s Flock records show officials thinking about liability and narrative at the same time

Records reported by the Richmond Times-Dispatch indicate city officials were aware of racial-bias and liability concerns around Flock camera deployment while also discussing how to “saturate the public narrative” with success stories. That is a useful signal because it shows surveillance debates are often about public messaging and political management as much as technical capability.

Why it matters: when officials are already thinking about liability, equity concerns, and narrative control, the public should ask whether real safeguards are keeping pace or whether the communications strategy is outrunning the governance strategy.

OpenAI backs bill that would limit liability for catastrophic model harms

Wired reports that OpenAI supported an Illinois bill that would limit when frontier AI firms can be sued for large-scale harms caused by their systems, so long as the companies did not act intentionally or recklessly and produced specified safety and transparency reports.

Why it matters: this is an early warning sign about how the AI industry may try to shape the accountability rules around its own products before courts and lawmakers settle them. Liability is one of the few tools that forces organizations to internalize risk, so proposals to narrow it deserve close attention.

Oakland County’s Flock drone pilot points toward the next surveillance layer

A new Flock-linked drone pilot in Oakland County has already sparked privacy concerns. It fits a broader trend in which police technology is moving from fixed cameras and plate readers toward integrated aerial response, real-time feeds, and larger sensor networks.

Why it matters: pilot programs often become normalized infrastructure faster than communities expect. That makes the pilot stage one of the most important moments for public scrutiny.

Direction of travel: taken together, these signals point toward more location data, more integrated surveillance layers, and more pressure to soften accountability before the public has time to understand the system being built.

A principle worth keeping in view
If a human right is in the way of your innovative technology, the expected solution should be to modify your technology to respect that right, not to reduce the protections for that right.
Technology and innovation must be in service of humanity, not the other way around.
Safeguards are not obstacles to innovation. They are what make innovation fit for public life.


Safeguards


Practical habits that lower risk. This week’s strongest takeaways are about reducing exposure, treating infrastructure seriously, and refusing to let “pilot” become a shortcut around policy. Good safeguards are often less about dramatic technology than about narrowing access, shrinking visibility, and asking harder questions earlier.

Keep industrial control systems off the open internet

CISA warns that Iranian-affiliated actors are targeting internet-connected programmable logic controllers across U.S. critical infrastructure, including sectors like water, energy, and manufacturing. The advisory stresses that weak passwords, direct internet exposure, and loose remote access are doing much of the attacker’s work for them.

Safeguard lesson: critical systems should not be exposed like ordinary web services. Public agencies and utilities should tighten remote access, eliminate default credentials, and treat operational technology as security infrastructure, not set-and-forget equipment.
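To make the exposure point concrete, here is a minimal Python sketch of the kind of external reachability check the advisory implies: from a machine outside the network, test whether TCP ports used by common industrial protocols answer at all. The port numbers are the standard assignments; the helper names are illustrative, and any real check should only be run against systems you are authorized to test.

```python
# Hypothetical sketch: test whether ports used by common OT protocols are
# reachable from outside. If any of these answers from the public internet,
# the system is exposed in exactly the way the advisory warns about.
import socket

OT_PORTS = {
    102: "Siemens S7comm",
    502: "Modbus/TCP",
    44818: "EtherNet/IP",
}

def is_port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def exposed_ot_protocols(host: str) -> list[str]:
    """List the known OT protocol names reachable on the given host."""
    return [name for port, name in sorted(OT_PORTS.items())
            if is_port_reachable(host, port)]
```

A well-segmented facility should see an empty list from any outside vantage point; anything else means the operational network is answering like an ordinary web service.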

Treat home and small-office routers like real security devices

The FBI’s IC3 says Russian GRU actors have been exploiting vulnerable routers worldwide, changing DNS settings and intercepting sensitive traffic tied to military, government, and infrastructure targets. The warning is a reminder that old or neglected routers can quietly become part of somebody else’s espionage chain.

Safeguard lesson: update router firmware, replace unsupported devices, disable unnecessary remote administration, and stop treating network gear as invisible furniture. For many people, the router is part of the security perimeter whether they think of it that way or not.

Treat surveillance pilots as real deployments

EFF argues that “free” or subsidized surveillance technology often bypasses local scrutiny because it arrives as a trial, grant, or donated program rather than a fully debated procurement. But the privacy risks, data-sharing consequences, and normalization effects begin immediately, not after the pilot becomes permanent.

Safeguard lesson: pilots should face real rules on retention, access logs, public notice, audits, sharing, and exit conditions before they begin. A temporary deployment can still create permanent habits.

Hide notification content for sensitive messaging apps

The Signal/iPhone reporting this week offers a simple citizen-facing reminder: if message previews are visible in notifications, they may be stored in places users do not expect. The convenience is real, but so is the privacy cost.

Safeguard lesson: for sensitive messaging, reduce notification content or disable previews entirely. Strong encryption helps, but system defaults still matter.

Bottom line: the best safeguards this week are not flashy. Keep critical systems off the open internet, treat routers as infrastructure, force real scrutiny at the pilot stage, and remember that privacy often fails first through defaults that feel convenient.
