
Bend’s Flock Data Shows Why Surveillance Oversight Must Be Built Into the System

Recent reporting about Bend’s Flock Safety cameras and Oregon’s sanctuary-law lawsuit points to the same lesson: privacy rules only work when the databases enforce them.

The most important surveillance story in Bend this week is not hypothetical. According to The Source Weekly, federal immigration officials made 279 queries into Bend’s Flock Safety data in the first three weeks after the cameras went live. Bend Police reportedly did not authorize those searches, and it remained unclear what data, if any, had been retrieved.

That should concern anyone who cares about local control, public accountability, or civil liberties.

The issue is not simply whether a particular search was improper. The larger problem is governance. When a surveillance system is installed for one stated purpose, it can quickly become useful to other agencies, vendors, or outside users who were not central to the original public debate.

If city officials cannot later answer basic questions — who searched the system, why they searched it, what they saw, whether the search was tied to a case number or documented purpose, and whether any result was shared — then the public is being asked to trust a system that cannot be independently verified.

That is not meaningful oversight. That is a blind spot.

Audit logs are not a technical detail. They are the oversight system.

Surveillance oversight often gets discussed as if it were mainly a policy question: Should the city approve the tool? What is the stated purpose? What does the vendor promise? What does the agency say it intends to do?

Those questions matter, but they are not enough.

Once a system is live, the real questions become operational:

  • Who can access the system?
  • Can outside agencies search it?
  • Can federal agencies search it?
  • Can vendors access camera feeds, search tools, dashboards, support logs, or stored data?
  • Does every search leave a usable record?
  • Can elected officials or independent reviewers verify how the system was used?
  • Are searches tied to a case number, warrant, emergency, or documented purpose?
  • Are improper searches detectable?

Without those controls, the public has no practical way to know whether the system is being used as promised.
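What a “usable record” means in practice can be made concrete. The sketch below is purely illustrative: the field names, the `SearchAuditRecord` type, and the `is_reviewable` rule are assumptions for the sake of the example, not drawn from Flock Safety’s actual systems or any real agency’s schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical audit-record schema; every field name here is illustrative.
@dataclass
class SearchAuditRecord:
    timestamp: datetime          # when the search ran
    user_id: str                 # who searched
    agency: str                  # which agency the searcher belongs to
    query: str                   # what was searched (e.g. a plate number)
    purpose: Optional[str]       # documented reason for the search
    case_number: Optional[str]   # case, warrant, or incident it is tied to
    results_returned: int        # what the searcher saw (count, at minimum)

def is_reviewable(rec: SearchAuditRecord) -> bool:
    """A record supports independent review only if the search is tied
    to a documented purpose or a case number."""
    return bool(rec.purpose) or bool(rec.case_number)
```

A record that answers who, what, why, and what-was-seen lets a reviewer reconstruct use after the fact; a record missing the “why” fields cannot be audited, no matter how complete the rest is.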

This does not require assuming bad faith by local officials. In fact, it shows why good-faith intentions are not enough. A system can be approved for one narrow purpose and later become available to people with very different priorities. That is why access controls, audit logs, outside-agency limits, vendor-access limits, and public reporting must exist before a system goes live.

“You must first enable the government to control the governed; and in the next place oblige it to control itself.”
— James Madison, The Federalist No. 51 (1788)

Oregon’s sanctuary-law lawsuit is also about database access

The Bend Flock story now sits inside a larger Oregon data-sharing dispute.

OPB reported that a lawsuit filed in Multnomah County Circuit Court alleges Oregon State Police allowed federal immigration authorities to access Oregonians’ data through shared law-enforcement databases for years, despite Oregon’s sanctuary laws. Oregon State Police denies wrongdoing.

The Source Weekly also reported that the complaint alleges federal immigration authorities queried state-run data about Oregonians 1.4 million times between February 2025 and February 2026, an average of 3,835 queries each day.

The practical lesson is straightforward: a sanctuary policy is only as strong as the database permissions behind it.

If an agency is legally barred from using data for immigration enforcement, the system should not rely on informal restraint. It should include technical controls that prevent improper access, audit logs that expose improper use, and consequences when policy and practice diverge.

A privacy rule that lives only on paper is fragile. A privacy rule built into permissions, logging, retention limits, and reporting is much harder to ignore.

Why this matters for Bend

Bend residents’ information may pass through city, county, state, vendor, and law-enforcement systems. That information can include license plate scans, police records, driver information, public records, utility records, permits, emergency-service data, and other civic information collected for ordinary public purposes.

The danger is that data collected for one purpose can become useful for another.

A license plate reader approved for local public-safety purposes can become searchable by outside agencies. A state database can become an immigration-enforcement tool. A vendor platform can create access points that city officials did not fully understand when the contract was approved. A policy promise can fail if the technology underneath it does not actually enforce the rule.

That is why surveillance oversight cannot stop at approval. Local officials need to know who can search local systems, what outside agencies can access, what vendors can see, how long data is retained, and whether every search leaves a record that can be reviewed.

The safeguard standard should be simple

Before Bend approves, renews, or expands any surveillance or sensitive data system, the city should be able to answer:

  • Who can access the data?
  • Who can search the system?
  • Can outside agencies access it?
  • Can federal agencies access it?
  • Can vendors access it?
  • Is every search logged?
  • Are searches tied to a documented purpose?
  • How long is data retained?
  • Can improper access be detected?
  • Can elected officials and the public receive meaningful reports about system use?

If those questions cannot be answered clearly, the system is not ready.

This is not about being anti-technology. It is about making sure public technology remains accountable to the public. Tools that collect sensitive information should not depend on trust alone. They should be designed so that misuse is difficult, detectable, and provable.

Bottom line

The Bend Flock story and the Oregon sanctuary-law lawsuit point to the same conclusion: access is the story.

Who can search the data? Who can receive it? Who can share it? Who can buy it? Who can combine it with other systems? Who can turn an ordinary resident’s movement, record, or routine interaction with government into an investigative lead?

Privacy safeguards fail when access is undefined, unlogged, or routed through systems the public cannot see.

The best safeguards are practical and boring by design: collect less data, connect fewer systems, limit outside access, require documented purposes, log every search, review the logs, shorten retention, make vendor access visible, and make misuse provable.
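“Review the logs” is itself a boring, mechanical step, and that is the point. The sketch below shows one minimal form such a review could take; the log entries, agency names, and flagging rule are all illustrative assumptions, not real data or real policy.

```python
from collections import Counter

# Toy log; agency names and entries are illustrative only.
log = [
    {"agency": "bend_pd", "purpose": "case 24-1187"},
    {"agency": "federal_agency_x", "purpose": None},
    {"agency": "federal_agency_x", "purpose": None},
]

def review(entries, authorized=frozenset({"bend_pd"})):
    """Flag any entry from an unauthorized agency or lacking a
    documented purpose, and tally flagged entries by agency."""
    flagged = [e for e in entries
               if e["agency"] not in authorized or not e["purpose"]]
    by_agency = Counter(e["agency"] for e in flagged)
    return flagged, by_agency
```

A review this simple would have surfaced the pattern in the Bend reporting: queries from an agency that was never on the approved list, with no documented purpose attached.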

Surveillance oversight works best when restraint is built into the system before the data becomes too useful to give up.

