We don’t just follow stories; we follow up. In May, Coda Story’s Brett Bachman wrote about the tech-surveillance company Banjo after government agencies in Utah suspended use of its multimillion-dollar surveillance system, following reports that its CEO once had ties to the Ku Klux Klan.

Banjo’s contract with the state gave the company live access to an unprecedented number of government data streams, including 911 calls and traffic and CCTV camera feeds. The system used artificial intelligence to sift this real-time data and alert first responders across Utah to crimes and other public safety threats as they happened.

Banjo CEO Damien Patton resigned days after the revelations surfaced, and the Utah Attorney General’s office is moving forward with an audit of the company’s software.

Patton’s departure has reignited the debate over bias in law enforcement and artificial intelligence. Coda Story has written about some of these concerns before.

Mutale Nkonde, an AI policy advisor and a fellow at the Berkman Klein Center for Internet & Society at Harvard University, warns that the kind of surveillance technology used by companies like Banjo could harm minority communities by targeting already vulnerable groups.

“The way that I think algorithmic decision making can be made less biased is by having human decision makers audit the decisions that those algorithms make,” Nkonde said over Zoom. “So by all means, use the technology, but understand that that technology is drawing from a particular social and political moment.”