
Is San Francisco Tracking You? The Mass Surveillance Lawsuit
By Darius Spearman (africanelements)
Support African Elements at patreon.com/africanelements and hear recent news in a single playlist. Additionally, you can gain early access to ad-free video content.
A new legal battle has emerged in the heart of the tech world. In late December 2025, a retired teacher named Michael Moore filed a class-action lawsuit against the City and County of San Francisco. This lawsuit claims that the city has built a massive surveillance dragnet. The legal filing alleges that the police department uses Automated License Plate Readers (ALPRs) to track citizens without warrants. Furthermore, it claims the city shares this private information with outside agencies (cbsnews.com).
The case represents a major turning point for privacy rights in California. For years, San Francisco stood as a leader in protecting citizens from digital monitoring. However, recent changes in local laws have allowed for a rapid expansion of police technology. The lawsuit suggests that this expansion violates both the state constitution and specific privacy laws (aclunc.org). This struggle highlights a deep tension between the desire for public safety and the fundamental right to move freely without being watched by the government.
The Evolution of Surveillance in the City
The use of license plate readers in San Francisco is not new. The police department first began using mobile units nearly twenty years ago. These early cameras were mounted on the roofs of patrol cars. The city marketed them as digital assistants to help officers find stolen vehicles. During this time, the technology was limited to where a patrol car happened to drive (sanfranciscopolice.org). It was not yet a permanent part of the city landscape.
By 2009, the technology began to show its flaws. One specific incident changed the way the legal system viewed these cameras. An African American city worker was stopped at gunpoint because a camera misread her license plate. This event led to a major court ruling in 2014. The court decided that a computer “hit” is not enough reason to stop a driver. Officers must verify the information with their own eyes first (uscourts.gov). This history shows that the technology has always carried a risk of human trauma.
In 2019, San Francisco took a bold step to limit surveillance. The city passed the Stop Secret Surveillance Ordinance. This law required city departments to get permission before using any new tracking tools. It also banned the use of facial recognition technology. At that time, many people believed the city had found a balance. However, the political climate began to shift as concerns about crime increased (eff.org). The protections of 2019 would soon face a massive challenge from new legislation.
The struggle for control over technology mirrors the larger Black Power struggles throughout history. Just as communities fought for justice in the past, today they fight for the right to privacy. The move from mobile units to fixed cameras at every intersection represents a permanent change. It is no longer a tool for specific tasks. Instead, it has become a system that records everyone, all the time (aclu.org). This transformation is at the heart of the current lawsuit.
The Trauma of a Computer Error
The story of Denise Green is a powerful reminder of the human cost of surveillance. In March 2009, she was driving her burgundy Lexus when police officers surrounded her. They pointed shotguns at her and forced her to her knees. The reason for this high-risk stop was an error by an ALPR unit. The camera had confused a “3” for a “7” on her license plate. This small mistake led to a terrifying experience for a woman with no criminal record (uscourts.gov).
The officers did not check the details of the “hit” before acting. The system reported a stolen gray GMC truck, but Green was driving a Lexus sedan. Despite this obvious difference, the police went ahead with the high-risk stop. This case proved that relying too heavily on automated systems can lead to dangerous situations. The Ninth Circuit Court of Appeals eventually ruled that the stop violated her rights. San Francisco had to pay a settlement of nearly five hundred thousand dollars (thenewspaper.com).
Cases like this illustrate how surveillance technology can harm marginalized people. Historically, African American families have faced over-policing and extra scrutiny. When technology is added to this mix, it can amplify existing biases. Even if the technology is meant to be neutral, the way it is used by law enforcement can lead to unfair treatment. The Denise Green case remains a foundational moment in the history of San Francisco’s policing (aclunc.org).
Today, critics argue that the new network of cameras increases the chance of similar errors. With five hundred cameras taking three million photos a day, the potential for mistakes is high. Each mistake represents a person who might be stopped at gunpoint. The lawsuit by Michael Moore aims to prevent these kinds of constitutional injuries from happening again. It seeks to hold the city accountable for the way it manages this powerful data (cbsnews.com).
Shifting the Rules of Engagement
The current expansion of surveillance began in earnest in 2024. San Francisco voters passed a ballot measure called Proposition E. This measure significantly changed the rules for the police department. It allowed officers to use new technology for a full year before needing approval from the Board of Supervisors. This effectively bypassed the privacy protections established in 2019 (sf.gov). The goal was to give police more tools to fight retail theft and violent crime.
Following the passage of Proposition E, Mayor London Breed announced a major rollout of new cameras. The city used a grant of over seventeen million dollars to fund the project. They planned to install four hundred fixed cameras at one hundred strategic intersections. These cameras, provided by a company called Flock Safety, are designed to capture more than just license plates. They can identify the make, model, and even specific features like roof racks or stickers (texaslawreview.org).
Much of this rapid deployment has unfolded amid a shift in the national political climate. President Donald Trump has often encouraged local police to take a more aggressive stance on crime. That environment has created a sense of urgency for cities to adopt high-tech solutions. However, the speed of the rollout has raised concerns about transparency. Many citizens feel that the city is moving too fast without considering the long-term impact on civil liberties (eff.org).
The shift in policy also represents a change in the balance of power between local and state governments. While California has state laws to protect privacy, local ballot measures like Proposition E can create loopholes. This creates a confusing landscape where rights can vary from one city to the next. The lawsuit argues that no matter what local voters decide, the city must still follow the state constitution and basic privacy requirements (aclu.org).
Out of 42,000 “hits,” only 140 resulted in arrests.
Data Without Borders and Sanctuary Laws
One of the most serious claims in the lawsuit involves the sharing of data. California has a law called SB 54, also known as the Sanctuary State law. This law forbids local police from helping federal immigration authorities with deportations. It is meant to ensure that immigrant communities feel safe interacting with local government. However, investigations show that the ALPR system may be violating this principle (aclusocal.org).
Records indicate that federal agencies, including ICE, have searched the San Francisco license plate database. At least nineteen searches were specifically marked for federal immigration enforcement. This suggests that the automated system allows federal agents to bypass local sanctuary protections. For many residents, this is a betrayal of the city’s promise to be a safe haven (aclunc.org). It creates a digital trail that can be used to track and deport vulnerable people.
Additionally, the city is accused of sharing data with states that have strict anti-abortion laws. Another state law, SB 34, prohibits sharing ALPR data with out-of-state agencies without strict authorization. Yet, millions of searches have been conducted by agencies from states like Texas and Georgia. Advocates worry that this data could be used to track people seeking reproductive healthcare in California (aclu.org). The “30-day map” of a person’s movements could reveal visits to specialized clinics.
This issue highlights the intersection of economic justice and privacy. People who cannot afford private transportation are often the most monitored. The lawsuit argues that the city’s data-sharing practices put these individuals at risk. By allowing outside agencies to access local data, San Francisco may be indirectly supporting legal actions that run counter to California’s own values. This lack of control over data is a central pillar of the legal challenge (eff.org).
The Mosaic Theory and the Fourth Amendment
The legal argument against mass surveillance often relies on the “Mosaic Theory.” This theory suggests that while one photo of a car is not a search, thousands of photos over time create a complete picture of a person’s life. Like small tiles in a mosaic, these data points reveal private habits. They can show where a person goes to church, which doctors they see, and who their friends are (texaslawreview.org). This level of detail is something the founders of the country never imagined.
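The concern behind the Mosaic Theory can be made concrete with a small, purely hypothetical sketch. Each camera read below is a single, innocuous data point, yet simply counting them reveals a pattern of life; all of the locations and timestamps are invented for illustration, not drawn from any real ALPR data:

```python
# A minimal, hypothetical sketch of the "Mosaic Theory" concern: each
# plate read is one innocuous data point, but aggregating reads over
# time reveals a pattern of life. All data below is invented.
from collections import Counter

# (day, hour, camera location) reads for one hypothetical plate
reads = [
    ("Mon", 8, "clinic district"),
    ("Mon", 18, "home intersection"),
    ("Wed", 8, "clinic district"),
    ("Wed", 19, "home intersection"),
    ("Sun", 10, "church corner"),
    ("Sun", 12, "home intersection"),
]

# Count how often the plate appears near each location.
visits = Counter(location for _day, _hour, location in reads)
for place, count in visits.most_common():
    print(f"{place}: seen {count} time(s)")
```

Even this toy aggregation surfaces where the driver likely lives, worships, and seeks medical care, which is exactly the kind of inference the mosaic argument says should require a warrant.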
The Fourth Amendment protects citizens from unreasonable searches. In the past, the Supreme Court has ruled that long-term GPS tracking requires a warrant. Critics argue that the ALPR network is essentially a “virtual GPS” that tracks everyone. Because the cameras are fixed at major intersections, they can reconstruct a person’s movements without ever placing a physical tracker on their car (eff.org). This creates a constant state of surveillance that feels unconstitutional to many legal experts.
Civil rights groups argue that this technology creates a “dragnet” that treats everyone as a suspect. Instead of following a specific lead, the police collect data on millions of law-abiding citizens. They hope to find a small number of criminals within that mountain of information. The success rate for this method is very low. Statistics show that less than one percent of “hits” lead to an arrest. This means that the privacy of thousands of people is compromised for very little gain (sanfranciscopolice.org).
The lawsuit seeks to force the city to get a warrant before accessing this historical data. It argues that the government should not have a “time machine” to look back at where people have been. Without judicial oversight, the potential for abuse is high. The mosaic of a person’s life should remain private unless there is a specific reason to believe they have committed a crime. This fight is about maintaining the boundary between public space and private life (aclu.org).
The Private Power of Flock Safety
The cameras used in San Francisco are not owned by the city. They are part of a private network run by a company called Flock Safety. This company provides what it calls a “public safety operating system.” Their network connects cameras from over five thousand communities across forty-nine states (texaslawreview.org). This private database creates a level of surveillance that traditional police departments could never achieve on their own.
Flock Safety’s business model involves bridging public and private data. Their system includes cameras from Homeowners Associations and private businesses. Police can often access this private data without a warrant. This creates a massive, privatized surveillance web that lacks the transparency of government systems. Because Flock is a private corporation, it is not subject to the same public record laws as a police department (texaslawreview.org). This makes it harder for the public to know how their data is being used.
The company has also partnered with the FBI to monitor vehicles on federal “wanted” lists. This deep connection between private industry and federal law enforcement is a major concern for privacy advocates. It allows for a level of tracking that spans the entire country. If a car is spotted in San Francisco, that information can be instantly available to an officer in another state (texaslawreview.org). This “outsourced” surveillance model is a central part of the new class-action suit.
The shift toward private surveillance companies reflects a broader trend in policing. It allows cities to deploy technology quickly without the burden of maintaining it. However, it also means that a private company holds the keys to the movements of millions of people. The lawsuit questions whether it is legal for a city to hand over such significant power to a venture-backed corporation. The profit motive of a private company may not always align with the constitutional rights of the people (aclu.org).
Mapping Bias in the City
The placement of surveillance cameras is rarely random. In San Francisco, the one hundred intersections chosen for the new cameras are concentrated in specific areas. Many of these cameras are located in neighborhoods like the Mission and Bayview districts. These areas have large populations of African American and Latino residents. For many who live there, the cameras feel like a new form of over-policing (aclunc.org).
Historically, marginalized communities have been the primary targets of government monitoring. The placement of cameras in these neighborhoods can lead to a cycle of “adaptive criminalization.” This happens when police focus all their tools on one area, which naturally leads to more arrests there. These arrests are then used to justify even more surveillance. This process can reinforce racial biases and create a sense of being constantly under watch (eff.org).
The city justifies the placement by pointing to crime rates. However, advocates note that more affluent neighborhoods do not face the same level of scrutiny. Even when crimes occur in wealthy areas, the response is often different. By focusing on “high incidence” areas, the surveillance network effectively maps out the city’s racial and economic divides. This contributes to a feeling of two different versions of San Francisco (aclunc.org).
The life of educators like Mary McLeod Bethune teaches the importance of fighting for equal treatment. In the digital age, this means fighting for equal privacy. The lawsuit by Michael Moore argues that the city must consider the impact of its technology on all communities. Surveillance should not be a burden that falls only on the shoulders of the poor and people of color. True safety requires justice and fairness for everyone (aclu.org).
Balancing Safety and Liberty
The debate over ALPRs often comes down to one question: Do they work? The city administration points to a thirty-three percent drop in property crime in 2024. They argue that the cameras act as a deterrent and help catch dangerous criminals. High-profile cases, such as arrests for carjacking and sexual assault, are often cited as proof of the system’s value (cbsnews.com). For many residents who have been victims of crime, these results are very important.
However, experts caution that crime is dropping across the country. Peer cities that did not install massive camera networks also saw significant decreases in crime. In San Francisco, car break-ins dropped by over fifty percent before the cameras were even fully operational. This suggests that other factors, like increased police patrols, may be the real reason for the improvement (cbsnews.com). It is difficult to isolate the impact of the cameras from other police work.
Critics also point to the high number of “false hits.” When a camera misreads a plate, it can lead to dangerous interactions between police and citizens. With a success rate of only zero point three percent, the vast majority of camera alerts do not lead to an arrest. This means that for every criminal caught, thousands of innocent people are being tracked and cataloged (sanfranciscopolice.org). This high price for a small gain is what the lawsuit aims to address.
Ultimately, the struggle in San Francisco is about what kind of city residents want to live in. Is a slight reduction in crime worth the loss of anonymous movement? The lawsuit argues that the current system has gone too far. It has moved beyond a simple tool for police and into a permanent dragnet. The outcome of this case will set a precedent for cities across the nation. It will determine if the “history behind the headlines” is one of progress or one of increasing control (aclu.org).
Conclusion
The class-action lawsuit against San Francisco is a landmark event. It challenges the idea that technology should be allowed to expand without limits. By looking at the history of these cameras, from the Denise Green case to the passage of Proposition E, we can see a clear pattern. The city has moved from targeted tools to a citywide system of monitoring. This change has profound implications for privacy and civil rights (cbsnews.com).
The legal battle will likely continue for years. It will force the courts to decide how the Fourth Amendment applies to the digital age. As more cities adopt systems like Flock Safety, the questions raised in San Francisco will become even more urgent. Citizens must remain informed and engaged to ensure that their rights are protected. The future of privacy depends on our ability to hold the government and private corporations accountable for the data they collect (eff.org).
San Francisco has always been a place of innovation and activism. This lawsuit is the latest chapter in that history. It is a reminder that even in a high-tech world, the basic rights of the individual must come first. Whether through the courts or at the ballot box, the people of the city will continue to define the boundaries of surveillance. The world will be watching to see how the “City by the Bay” handles this digital crossroads (aclu.org).
About the Author
Darius Spearman is a professor of Black Studies at San Diego City College, where he has been teaching for over 20 years. He is the founder of African Elements, a media platform dedicated to providing educational resources on the history and culture of the African diaspora. Through his work, Spearman aims to empower and educate by bringing historical context to contemporary issues affecting the Black community.