A few years after a racist texting scandal sparked outcry over biased policing, San Francisco leaders said they figured out how to catch cops who send hateful messages.
But a new audit from the city’s police watchdog suggests otherwise.
The system created in 2019 is overwhelmed by too much data, too few people to review it and countless false starts, according to an audit from the Department of Police Accountability (DPA), which investigates allegations of wrongdoing by the city’s main law enforcement agency.
That review of the department’s system for catching officers who send biased messages found major faults, including that officers are forced to manually review thousands of potentially biased emails because the words computers are meant to flag turn up as fragments of longer, innocuous words.
The audit offers a first-of-its-kind look at how the San Francisco Police Department screened for bias from 2019 to 2021.
And at least one member of the Police Commission, which oversees the city’s police force, says it should serve as a wake-up call.
“DPA’s report raises important questions about whether our system of bias monitoring is working as intended,” said commission Vice President Max Carter-Oberstone. He told The Standard that the commission must ensure the department meets its obligations to monitor all devices used by police, and that any bias it uncovers is dealt with appropriately and reported to the commission.
Carter-Oberstone wants fellow commissioners to focus on solutions for the myriad problems raised by auditors.
The racist texts emerged from the federal prosecution of a group of corrupt officers. Court filings first reported by KQED in 2015 revealed the shockingly racist language of the messages—like the following from a thread involving ex-Officer Ian Furminger:
(Editor's note: Be advised that the following transcription contains offensive language.)
“We stole California from the Mexicans too! would have had Baha too but felt it wasn't worth it.”
“The Indians never had shit Columbus thought he landed where he was headed India So He named them Indian They never had a name of their own And the n re is evidence the moors niggers were here first.”
“Gunther Furminger was a famous slave auctioneer.”
“I can’t imagine working At costco and hanging out with filthy flips. hate to sound racist but that group is disgusting.”
Those were followed in 2016 by a similar scandal involving another group of officers—all of which amplified calls for systemic reform.
So, San Francisco sought federal guidance. That led to the U.S. Department of Justice handing down a long list of recommendations that included creating a system to flag biased messages. While the department says it met almost all of its benchmarks for reining in bias, the new audit raises questions about how deep those changes go.
Specifically, the audit points to problems like the system getting it wrong virtually every time it flags a message for bias. The false positives stem from potentially problematic terms that are really just fragments of larger words—such as alerts for “fun” flagging every email that mentions a “funeral.”
The rampant false positives create tedious work for investigators, who are described in the audit as having to review the emails one at a time. Over the three-year period examined in the report, the system flagged 3,819 emails for problematic language—but SFPD confirmed just 10 as actually biased. The report did not include details of those cases.
A separate review of internal communication about the Jan. 6, 2021, attack on the U.S. Capitol turned up 800,000 hits. The overwhelming false positives were due to the system flagging messages even if they just included links to articles shared among the rank and file.
The Department of Police Accountability review warned how such a massive volume of false alarms could lead to “alert fatigue” that desensitizes investigators.
Auditors said the system is stymied as much by information overload as by its huge blind spots. One of those blind spots: department-issued cell phones.
The report said the phone company had no contractual obligation to register the devices for monitoring, so it would only do so when specifically asked to by the department.
The department also failed to provide the Police Commission with details about any discipline resulting from the bias reviews, and it has not updated the watchlist of words used in searches of police devices since 2020.
The implications of falling short of accountability goals extend well beyond the Department of Police Accountability, its chief Paul Henderson said.
“I’m always concerned about the practical ability of our systems to work, and for them to work, they need to be effective,” he said. “We need to have best practices in place and reliable systems […] in order to protect our most vulnerable.”
SFPD didn’t respond to The Standard’s request for comment.
Jonah Owen Lamb can be reached at [email protected]