[Image: A screenshot of the ShotSpotter website.]

AI Evidence Withdrawn after Attorneys for Accused Request Review

With all the major advancements produced by the use of AI, it is rare for AI to come up short on a problem. We found a serious shortfall involving AI, police, and an innocent man who spent 11 months in jail. We first read about this case in an AI report at theregister.com.

Katyanna Quach wrote about how a Chicago man, Michael Williams, was charged with murder after an AI algorithm pointed the finger at him.

A key piece of evidence against him came from ShotSpotter, a company that operates networks of microphones across U.S. cities, including Chicago, which, with the aid of machine-learning algorithms, detect and classify gunshot sounds so police can be alerted immediately.

Prosecutors said ShotSpotter picked up a gunshot sound at the spot where Williams was seen on surveillance camera footage in his car, and put this forward as proof that Williams shot the victim, Safarian Herring, right there and then. Police did not cite a motive, had no eyewitnesses, and never found the gun used in the attack. Williams did have a criminal record, having served time for attempted murder, robbery, and discharging a firearm when he was younger; however, he said he had since turned his life around significantly. He was grilled by detectives and booked.

Later, when defense attorneys pressed the judge to further review the AI program, the prosecution withdrew the AI evidence and requested that the charges be dropped. It is a telling snapshot of how much AI is being relied upon in the criminal justice system, for better and sometimes for worse.

Using AI To Design Smut Backfires

Startup Kapwing, which built a web application that uses computer-vision algorithms to generate pictures for people, is disappointed netizens used the code to produce dirty pictures.

The software employs a combination of VQGAN and CLIP – made by researchers at the University of Heidelberg and OpenAI, respectively – to turn text prompts into images. This approach was popularised by artist Katherine Crowson in a Google Colab notebook; a Twitter account is dedicated to showing off this type of computer art.
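At its core, the VQGAN+CLIP approach is an optimization loop: a latent code is repeatedly nudged so that the decoded image scores higher on CLIP's image–text similarity for the given prompt. The sketch below illustrates only that loop structure, with toy stand-ins: a fixed linear map plays the role of VQGAN's decoder, and cosine similarity against a random vector stands in for CLIP's prompt embedding. None of this is Kapwing's or Crowson's actual code – it is a minimal schematic of the technique.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions): in the real pipeline, `decode` is VQGAN's
# decoder network and `clip_score` is CLIP's image/text cosine similarity.
G = rng.normal(size=(64, 16))      # fixed linear "decoder": latent -> image features
text_embed = rng.normal(size=64)   # pretend CLIP embedding of the text prompt

def decode(z):
    return G @ z                   # stand-in for VQGAN decoding

def clip_score(img, txt):
    # Cosine similarity, the same score CLIP-guided methods maximize
    return img @ txt / (np.linalg.norm(img) * np.linalg.norm(txt))

def score_grad(z, txt):
    """Analytic gradient of the cosine score with respect to the latent z."""
    x = decode(z)
    nx, nt = np.linalg.norm(x), np.linalg.norm(txt)
    s = x @ txt / (nx * nt)
    dscore_dx = txt / (nx * nt) - s * x / nx**2
    return G.T @ dscore_dx         # chain rule through the linear decoder

z = rng.normal(size=16)            # random starting latent
before = clip_score(decode(z), text_embed)
for _ in range(200):               # gradient ascent on the latent code
    z += 0.1 * score_grad(z, text_embed)
after = clip_score(decode(z), text_embed)
print(f"score before: {before:.3f}, after: {after:.3f}")
```

In the real system the decoder and scorer are deep networks and the gradient comes from automatic differentiation, but the shape of the loop – decode, score against the prompt, step the latent uphill – is the same.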

Kapwing had hoped its implementation of VQGAN and CLIP on the web would be used to make art from users’ requests; instead, we’re told, it was used to make filth.

Without giving away everything that users did with this AI tool, let's just say it was mostly NSFW stuff. Evidently, what naughty users wanted was not exactly what the tool was designed for.

“Since I work at Kapwing, an online video editor, making an AI art and video generator seemed like a project that would be right up our alley,” Eric Lu, co-founder and CTO at Kapwing, said.

It turns out users had other ideas. The program had other ideas as well.

read more at theregister.com