

We risk becoming a confederation of involuntary witnesses.

By Matthew A. McIntosh
Public Historian
Brewminate
Prelude: The New Panopticon
Jeremy Bentham’s eighteenth-century prison design, the Panopticon, relied on the psychological weight of perpetual observation. In 2025, the architecture has inverted: the tower is gone, and the watchman’s lantern has been distributed into every thermostat, doorbell, car, and coffee pot around us. Crucially, many of those devices are not even ours. The fact that a stranger’s gadget can still capture, profile, and monetize our behaviour marks a decisive break with every earlier privacy regime of the modern age.
What follows is an exploration—historical, legal, technical, and moral—of how bystander surveillance now unfolds, why it resists traditional consent models, and what redress a democratic society might yet muster.
The Relentless Growth of Third-Party Data Capture
Estimates place more than 20 billion connected devices in circulation worldwide this year, from smart speakers to “silent” Bluetooth beacons embedded in retail ceilings. While owners may click “I Agree,” bystanders never see the prompt.
Recent scholarship confirms the asymmetry. A May 2025 ACM systematic review found that most non-owners are unaware that smart home microphones or motion sensors harvest their presence, let alone that the data may be stored indefinitely. Even sophisticated users misjudge which device classes collect what information.
Case Studies in Ambient Extraction
Doorstep Dragnet – The Ring Paradigm
Amazon’s Ring cameras, long plagued by weak default security, have produced viral footage of package thieves—and of unsuspecting neighbours, children, and passers-by. Hackers have repeatedly commandeered indoor units to harass families, underscoring the ease with which third parties exploit poorly secured IoT optics.
The Sidewalk Spillover
Amazon’s Sidewalk mesh piggybacks on Echo speakers and Ring hubs, sharing a slice of the owner’s bandwidth to blanket nearby streets with connectivity. The network extends well onto public sidewalks and into neighbouring homes, raising questions about incidental device discovery and packet sniffing of phones that never opted in.
Cars That Testify
In January 2025 the U.S. FTC banned General Motors and OnStar from selling precise location and driving-behaviour data for five years after discovering the information had been routed to insurance underwriters without consent. Anyone riding in or even borrowing such a vehicle may therefore leave a trail whose use they never contemplated.
Wearables and “Borrowed Bodies”
Fitness trackers on one gym patron can log their wearer’s heart rhythms alongside the Wi-Fi and Bluetooth signatures of everyone sharing the room. More troublingly, insurers and employers increasingly buy these data streams en masse, creating actuarial portraits of spaces rather than individuals.
Why Traditional Consent Collapses
- Proximity Capture – Mere presence within sensor range can trigger data collection. The legal fiction of “notice and choice” evaporates on a public street.
- Data Fusion – Seemingly innocuous packets—MAC addresses here, gait signatures there—combine into powerful dossiers once aggregated in the cloud.
- Invisible Brokers – Third-party analytics firms, not device makers, often monetize the by-catch, making accountability diffuse.
- Asymmetric Power – A homeowner values security footage; a guest values discretion. Current U.S. law privileges the owner’s property right over the bystander’s personhood.
Regulatory Terrain in the Trump Era
President Trump’s January 2025 return has not produced the sweeping federal privacy statute many advocates hoped for. Instead:
- The FTC continues piecemeal enforcement—e.g., the GM/OnStar decree—but its authority remains grounded in the flexible (and litigable) “unfair or deceptive practices” standard.
- States (California, Colorado) extend protections via sector-agnostic privacy acts, yet carve-outs for device owners’ “legitimate interests” often override bystander claims.
- The EU’s Digital Markets Act and GDPR, by contrast, treat bystanders as data subjects in their own right, forcing firms like Ring to honour opt-out at source for Europeans—a compliance headache that rarely carries over to U.S. versions of the same products.
Technological Counters and Civic Hygiene
| Strategy | Mechanism | Limitations |
| --- | --- | --- |
| MAC-address randomization in smartphones | Masks unique device IDs from Wi-Fi sniffers | Defeated by longer session times or Bluetooth correlation |
| Guest-aware camera modes | Ring’s “Privacy Zones,” Google Nest’s presence-based shutdown | Optional, owner-controlled |
| Router-level firewalling | Blocks outbound traffic from unknown IoT devices | Requires skilled configuration; can break updates |
| Detection apps (IoT snoopers) | Scan local RF spectrum for hidden cams | Defeated by encrypted traffic, new protocols |
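The router-level firewalling strategy above reduces, at bottom, to an allow-list: the owner enumerates trusted hardware addresses, and anything else on the network is flagged. A minimal sketch of that comparison logic in Python — the MAC addresses and the `TRUSTED_MACS` list are illustrative placeholders, not values from any real deployment:

```python
# Sketch: flag unknown devices on a home network by comparing observed
# MAC addresses against an owner-maintained allow-list.
# All addresses below are made-up placeholders for illustration.

TRUSTED_MACS = {
    "a4:5e:60:12:34:56",  # owner's laptop
    "f0:18:98:ab:cd:ef",  # smart thermostat
}

def normalize(mac: str) -> str:
    """Lower-case a MAC address and rewrite it as colon-separated pairs."""
    digits = mac.lower().replace("-", ":").replace(".", "")
    if ":" not in digits:  # e.g. "a45e60123456"
        digits = ":".join(digits[i:i + 2] for i in range(0, 12, 2))
    return digits

def unknown_devices(observed: list[str]) -> list[str]:
    """Return the observed MACs that are not on the allow-list."""
    return [mac for mac in observed if normalize(mac) not in TRUSTED_MACS]

# One trusted device (in a different notation), one stranger's phone:
print(unknown_devices(["A4-5E-60-12-34-56", "22:19:ab:00:11:22"]))
# → ['22:19:ab:00:11:22']
```

The limitation noted in the table applies directly: modern phones randomize their MAC addresses per network, so an allow-list catches naïve devices but not a determined or privacy-conscious one.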
Ultimately, community norms must supplement code. Social contracts—“No recording at dinner parties,” “Disable microphones when guests arrive”—re-embed ethics where statutes lag.
Historical Echoes and Philosophic Stakes
The forum of republican Rome presupposed a degree of public visibility; yet the citizen could retreat behind the household limen, the threshold of the home. Today, that threshold has been perforated. Theologian Reinhold Niebuhr warned that “Man’s capacity for justice makes democracy possible; but man’s inclination to injustice makes democracy necessary.” The IoT crisis flips the aphorism: our capacity for convenience makes connectivity irresistible, yet our inclination to observe makes privacy indispensable.
Policy Prescriptions
- Bystander Opt-Out by Design – Mandate “non-participant masks” for audio/video feeds, similar to the automatic licence-plate blurring Google applies to Street View under European privacy law.
- Duty of Clarity – Statutorily require exterior signage or broadcast beacons announcing active surveillance devices, akin to chemical hazard placards.
- Data Sunset Default – Compel vendors to erase non-owner data after a short, pre-set interval unless affirmative, secondary consent is obtained.
- Collective Redress – Empower municipalities to enact quiet zones (libraries, clinics, voting precincts) where ambient data capture is civilly actionable.
- Algorithmic Privacy Impact Audits – Extend the FTC’s Safeguards Rule to force annual disclosure of how fused IoT datasets could harm bystanders.
Conclusion: Re-asserting the Private Self
Franklin’s quip—“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety”—has been hauled into every surveillance debate since the Patriot Act. Yet the IoT dilemma is stranger. We are not trading liberty for safety but ceding other people’s liberty for our marginal convenience. A republic that forgets the sanctity of bystander privacy risks becoming a confederation of involuntary witnesses. The task before historians, technologists, and citizens is therefore two-fold: to cultivate devices worthy of a free society, and to cultivate a society willing to refuse the devices that are not.