COLUMBUS, Ohio — Drive through Chillicothe and you might not notice them. They blend into the streetscape — mounted on poles, angled toward traffic, quietly scanning every vehicle that rolls past. Upload a blurry screenshot from a convenience store camera to the right database, and within seconds, you may have a name, a face, and a social media profile attached to it.

Two surveillance technologies have taken root across Ohio with remarkable speed and remarkably little debate. One scans faces; the other tracks cars. Together, they represent a transformation in how law enforcement investigates crime in this state — and both operate in a legal vacuum that has left Ohio residents with almost no protection and almost no recourse.

The Technology That Identifies You

Clearview AI built its business on a simple but unsettling premise: that the billions of photographs people post publicly online could be harvested, indexed, and turned into the world’s largest facial recognition database. The company scrapes images from social media platforms, news sites, mugshot databases, and anywhere else photos appear publicly — without asking permission from the people in them — and packages that database as a law enforcement tool.

The pitch to police agencies is straightforward. Upload a photograph from a crime scene, a surveillance camera, or a social media account. The software searches its database and returns potential matches, along with the websites where those images were found, the names attached to those accounts, and other identifying details. Investigations that once stalled on an unidentifiable face now have a place to start.

What Clearview does not promise — and is careful to disclaim in its own terms of service — is that those matches are accurate. Results are described as “investigative leads, not identifications.” The technology is not admissible as evidence in court. The company’s own literature cautions that its output should be a starting point, not a conclusion.

That caveat has not always made it into the field.

A Murder, A Match, And A Courtroom Reckoning

Valentine’s Day 2024 began with a killing on a Cleveland street. A man was robbed from behind, shot, and left to make his way home on foot. He did not survive the night. Security cameras in the area captured the attack, but the footage was too degraded to identify the person responsible. For five days, investigators had nothing.

On the sixth day, police observed a man walking from an apartment building near the crime scene to a nearby convenience store. He looked, they said, like the figure in the video. They had no other connection between him and the murder. That observation — one officer's visual assessment of a grainy image — was enough to set the next steps in motion.

The video went to the Northeast Ohio Regional Fusion Center, a multi-agency intelligence unit, which fed it into Clearview AI. The software returned eight photographs. Two of them were pictures of Qeyeon Tolbert, a 23-year-old Black man who lived in the area. Six depicted other people entirely.

What happened next is what turned the case into a national legal flashpoint. When detectives applied for a warrant to search Tolbert’s apartment, the affidavit they submitted to the judge described the Fusion Center as having identified Tolbert as a suspect. It made no mention of Clearview AI. It did not disclose that the search had also returned photos of six other individuals. The judge, working from an incomplete picture, signed the warrant.

Inside Tolbert’s apartment, officers reportedly found a handgun. Murder charges followed.

When Tolbert’s defense attorneys discovered how the identification had actually been made, they moved to suppress everything. Their argument was pointed: a facial recognition result, from a technology the manufacturer itself says cannot be used as a definitive identification, is no different in legal terms than a tip from an anonymous informant. Courts have long held that an anonymous informant tip, standing alone, is not enough to establish probable cause. The same logic should apply here.

The trial court agreed. The warrant was quashed. The evidence was thrown out.

Prosecutors fought back, appealing to the Eighth District Court of Appeals, which ordered the trial court to make more specific findings about whether the affidavit had been deliberately misleading. The trial court made those findings — and reached the same conclusion. On January 6, 2026, the judge voided the warrant a second time, ruling that Tolbert had demonstrated the affidavit contained statements that were either intentionally false or showed reckless disregard for the truth. Strip those statements out, the court found, and nothing credible remained to justify the search.

The state filed yet another appeal nine days later. The case, backed by the ACLU and the National Association of Criminal Defense Lawyers, continues to wind through the courts. Tolbert has waived his speedy trial rights through July 1, 2026.

Statewide, Unregulated, And Expanding

The Cleveland case is extraordinary in that it surfaced at all. Most uses of Clearview AI by law enforcement never reach a courtroom challenge. The technology operates in the background of investigations — generating leads that officers pursue through other means.

What is clear is that its use across Ohio is widespread and growing. The Northeast Ohio Regional Fusion Center and Cleveland police continue to use it. The Ohio Attorney General’s Bureau of Criminal Investigation has deployed it. Federal agencies operating in Ohio — including Immigration and Customs Enforcement and Customs and Border Protection — hold active contracts with the company. In May 2026, Clearview AI is scheduled to appear at the Ohio Association of Chiefs of Police Annual Exhibit Showcase, marketing its services to law enforcement leaders from across the state.

Ohio has enacted no law restricting how facial recognition can be used by police. No statute requires officers to disclose to judges when a facial recognition result influenced their warrant application. No public database tracks how often the technology is used, in what types of cases, or against which populations.

Civil liberties organizations have pushed for legislative guardrails. The statehouse has not responded.

A Problem Built Into The Technology

Beyond the constitutional questions, there is the simpler matter of whether the technology works — and for whom.

Federal researchers at the National Institute of Standards and Technology have studied facial recognition algorithms extensively and found that most of them produce significantly higher rates of false matches when applied to Black women and people with darker skin. The pattern appears consistently across commercial platforms and has been replicated by independent researchers. An algorithm that performs well in controlled testing can behave very differently when fed low-quality surveillance images of real people in real situations.

The consequences have not been hypothetical. Through early 2026, documented cases of wrongful arrests tied to false facial recognition matches have continued to accumulate nationwide. In each case, the sequence was similar: a false positive, a warrant, an arrest, and eventually the discovery that the technology had been wrong. The damage done in the interim — to the arrested person’s life, employment, family, and reputation — is not easily undone.

Part Two: The Cameras On The Streets

6,358 Searches. Eleven Cameras. One City.

While facial recognition operates largely out of public view, a different kind of surveillance network has been building in plain sight on Ohio’s streets — and in Chillicothe, the Scioto Valley Guardian obtained records that put specific numbers to how aggressively it is being used.

Audit logs from the Chillicothe Police Department show officers queried the city’s automated license plate reader network 6,358 times over the past 12 months. The department operates at least 11 Flock Safety cameras positioned at various points throughout the city — devices that capture the license plate, make, model, and color of every vehicle that passes within their field of view, around the clock, without any requirement that the driver be suspected of anything.

The Flock cameras are mounted on new or existing poles and run on solar power. The data they capture is transmitted to police departments in real time.

The audit data breaks down why officers accessed the system. Burglary investigations drove the largest share of searches at roughly 2,550 queries. Drug trafficking investigations generated approximately 2,050 more. Interdiction operations — proactive efforts to intercept vehicles linked to criminal activity — accounted for about 925 additional searches. The logs identify when the system was accessed, the stated reason for the query, and in some cases, the specific officer who ran the search.

[Map: Flock camera locations in the city of Chillicothe]

How Flock Safety Works

Flock Safety, founded in Atlanta in 2017, markets automated license plate reader technology to law enforcement agencies and private communities. Its flagship Falcon camera uses machine learning to read plates in real time and cross-reference them against law enforcement databases, triggering alerts when a vehicle linked to a crime, a stolen car report, or a wanted person is detected.

The financial commitment is relatively modest by government technology standards. Each Falcon unit runs approximately $2,500 annually — a fee that includes cloud storage — plus a one-time installation cost between $250 and $650. For a department operating 11 cameras, the annual expense is roughly $27,500 — a fraction of a single officer's salary — making the technology accessible even to smaller agencies.

What makes the system powerful is also what makes it concerning. The cameras do not wait for suspicion. They record every vehicle, every time, building a searchable history of movement through public spaces that can be queried retroactively. A detective investigating a burglary that happened three weeks ago can search the system for every vehicle that passed a given location on a given night — including vehicles belonging to people who had nothing to do with the crime and no reason to expect they were being documented.

It Is Spreading

Chillicothe is not alone. In Wheelersburg, at least two Flock cameras are in operation. In Hillsboro, two cameras are positioned near the Lowe’s on North High Street — their locations confirmed through satellite imagery and the public tracking database at deflock.org, which maps Flock camera deployments nationwide.

Nationwide, the company's client list runs into the thousands of agencies. In Ohio, adoption has accelerated steadily with no state framework governing how the data collected can be used, how long it can be retained, who can access it, or under what circumstances it can be shared with federal agencies.

What Residents Are Saying

Not everyone has accepted the cameras without question. Residents in communities where Flock cameras have been installed have raised concerns on social media and in public forums, with some describing the passive, warrant-free recording of vehicle movements as a form of creeping mass surveillance. One commenter’s description of the technology as “Big Brother in Training” has circulated widely in local discussions.

Law enforcement agencies that operate the cameras have generally defended them as a targeted crime-fighting tool that helps recover stolen vehicles, locate missing persons, and build cases against serious offenders. The audit data from Chillicothe supports the argument that the cameras are being used primarily for serious investigations — but critics argue the issue is not only how the data is being used today, but what could be done with it tomorrow, and by whom.

The Larger Question

Clearview AI and Flock Safety are different technologies designed for different purposes — but they share a common trajectory in Ohio. Both arrived before any legal framework was in place to govern them. Both expanded into widespread use without public debate or legislative action. Both now sit in a regulatory void that leaves residents without clear answers about their rights, without transparency about how often the tools are deployed, and without meaningful recourse when something goes wrong.

A Cleveland courtroom has given the clearest signal yet of what that void can cost. A man was charged with murder based in part on a facial recognition result that a judge has twice ruled was concealed from the court that authorized the search of his home. Two years later, the case is still in litigation, the technology is still in use, and Ohio still has no law requiring that any of it be disclosed.

The cameras keep running. The database keeps growing. And the rules governing all of it remain unwritten.

The Scioto Valley Guardian will continue to report on surveillance technology and law enforcement practices in Ohio. If you have information about the use of facial recognition or license plate readers in your community, contact our newsroom.

Jason Salley is a Certified Human Rights Consultant, investigative journalist, and former News Editor for the Scioto Valley Guardian. His investigative reporting spans true crime, environmental justice,...