Sony’s Gran Turismo Sophy, an AI-based virtual driving program, has learned how to beat human drivers at virtual races. (Source: Sony AI)

Sony’s Gran Turismo Sophy Leaves Top-Rated Human Drivers in the Virtual Dust

By training with human drivers starting in 2020, the program Gran Turismo Sophy learned a range of strategies for competing on the virtual track. It now regularly beats players at the top of the virtual racing field, such as Emily Jones, a top sim-racing driver based in Melbourne, Australia.

Sony created Gran Turismo, a video game known for its highly realistic simulations of real vehicles and tracks. In a series of events held behind closed doors last year, Sony pitted the program against the best humans on the professional sim-racing circuit. What the team discovered about how the program learns could help shape the future of machines that work alongside humans, or that drive them on real roads.

The AI racer initially failed in competition, so Sony rebuilt its neural network to be more robust and retrained it. Previously, the program was at times too aggressive, racking up penalties for reckless driving, and at other times too timid, giving rivals more room than necessary. Peter Wurman, head of Sony AI America, says the AI was trained on “etiquette”: balancing its aggression and timidity so it chooses the behavior appropriate to the situation on the track.
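Sony has not published GT Sophy’s reward function, but the idea of balancing aggression against timidity can be sketched as reward shaping. Everything below, from the function name to the weights, is a hypothetical illustration, not Sony’s actual design:

    def etiquette_reward(progress_m, collision, off_track, yielded_unnecessarily):
        # Hypothetical reward shaping: all terms and weights are illustrative.
        reward = progress_m          # base reward: distance gained along the track
        if collision:
            reward -= 10.0           # heavy penalty for contact (too aggressive)
        if off_track:
            reward -= 5.0            # penalty for reckless driving off the racing surface
        if yielded_unnecessarily:
            reward -= 1.0            # mild penalty for giving rivals needless room (too timid)
        return reward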

“I don’t think we’ve learned general principles yet about how to deal with human norms that you have to respect,” says Wurman. “But it’s a start and hopefully gives us some insight into this problem in general.”

Instead of reading pixels off a screen, the program receives updates on its car’s position on the track and on the positions of the cars around it, along with information about the virtual physical forces acting on the vehicle. In response, GT Sophy tells the car to turn or brake. This back-and-forth between GT Sophy and the game happens 10 times a second, which Wurman and his colleagues say matches the reaction time of human players.
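As a rough illustration, that exchange might look something like the Python sketch below. The game interface (get_observation, send_action, finished) and the policy are hypothetical stand-ins; Sony has not made GT Sophy’s actual interface public.

    import time

    TICK = 0.1  # one decision every 100 ms, i.e. 10 times a second

    def race(policy, game):
        # game and policy are hypothetical placeholders for illustration
        while not game.finished():
            start = time.monotonic()
            obs = game.get_observation()    # own position, nearby cars, forces on the car
            steer, brake = policy(obs)      # map the observation to control commands
            game.send_action(steer, brake)  # tell the car to turn or brake
            # wait out the remainder of the 100 ms tick before the next exchange
            time.sleep(max(0.0, TICK - (time.monotonic() - start)))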

According to Sony, with reinforcement learning it took only nine days of training before GT Sophy stopped improving, no longer cutting fractions of a second off its lap times. By then it was faster than any human.
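In broad strokes, reinforcement learning of this kind alternates between driving and updating the policy based on the rewards collected. The sketch below is a generic training loop, not GT Sophy’s specific algorithm; agent and env are hypothetical placeholders:

    def train(agent, env, episodes=100_000):
        # Generic reinforcement-learning loop, shown for illustration only.
        for _ in range(episodes):
            obs = env.reset()                 # start a fresh lap or race
            done = False
            while not done:
                action = agent.act(obs)                      # choose steering/braking
                next_obs, reward, done = env.step(action)    # drive one tick, observe reward
                agent.update(obs, action, reward, next_obs)  # adjust the policy from the outcome
                obs = next_obs
        # in practice, training continues until lap times stop improving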

read more at technologyreview.com