Biologist, AI Experts Work to Break Down Whale Language in Massive Study
David Gruber, a professor of biology at the City University of New York, is working toward becoming a modern-day Dr. Dolittle, at least when it comes to communicating with sperm whales.
A story on newyorker.com describes in great detail how Gruber developed his obsession with talking to whales and understanding their language, and how he formed an alliance with a “pod” of AI experts who want to use the latest technologies to make his wacky dream come true.
One day, Gruber was sitting in his office at the Radcliffe Institute, listening to a tape of sperm whales chatting, when another fellow at the institute, Shafi Goldwasser, happened by. Goldwasser, a Turing Award-winning computer scientist, was intrigued. At the time, she was organizing a seminar on machine learning, which was advancing in ways that would eventually lead to ChatGPT. Perhaps, Goldwasser mused, machine learning could be used to discover the meaning of the whales’ exchanges.
“It was not exactly a joke, but almost like a pipe dream,” Goldwasser recollected. “But David really got into it.”
A third Radcliffe fellow joined the project: Michael Bronstein, a computer scientist who is now the DeepMind Professor of AI at Oxford.
“This sounded like probably the most crazy project that I had ever heard about,” Bronstein told The New Yorker. “But David has this kind of power, this ability to convince and drag people along. I thought that it would be nice to try.”
The team set up shop at CETI headquarters on Dominica, a volcanic island in the Lesser Antilles. They plan to use a network of underwater microphones to capture the codas of passing whales. They’ll also put recording devices on the whales themselves, and the resulting data will be used to “train” machine-learning algorithms.
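The article doesn’t spell out what that “training” will involve, but a common starting point with sperm-whale codas is to describe each coda by the timing between its clicks and then look for recurring rhythm patterns. The sketch below is a minimal, hypothetical illustration of that idea in Python; the toy click times, the fixed-length inter-click-interval features, and the use of k-means clustering are assumptions made for illustration, not CETI’s actual pipeline.

```python
# Hypothetical sketch: grouping sperm-whale codas by rhythm.
# A coda is a short, stereotyped sequence of clicks; here each coda is
# represented by its inter-click intervals (ICIs), padded to a fixed length.
# None of this reflects CETI's actual data or models.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

MAX_CLICKS = 10  # assumed upper bound on clicks per coda

def coda_features(click_times: list[float]) -> np.ndarray:
    """Turn a coda's click timestamps (seconds) into a fixed-length ICI vector."""
    icis = np.diff(np.asarray(click_times, dtype=float))
    padded = np.zeros(MAX_CLICKS - 1)
    padded[: len(icis)] = icis[: MAX_CLICKS - 1]
    return padded

# Toy stand-in for recordings: each inner list is one coda's click times.
codas = [
    [0.00, 0.15, 0.30, 0.45, 0.62],  # evenly spaced clicks
    [0.00, 0.14, 0.29, 0.44, 0.60],
    [0.00, 0.10, 0.20, 0.55, 0.65],  # long pause before the final clicks
    [0.00, 0.11, 0.21, 0.57, 0.66],
]

X = StandardScaler().fit_transform(np.array([coda_features(c) for c in codas]))

# Cluster codas with similar rhythms together; two clusters is an arbitrary choice.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # e.g. [0 0 1 1]: the two rhythm types land in separate clusters
```

With real recordings, the interesting question is whether such clusters line up with anything the whales are doing, which is where the behavioral data from the tags comes in.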
https://www.newyorker.com/video/watch/can-we-decipher-a-whales-first-sounds
“In addition to bugging individual whales, CETI is planning to tether a series of three ‘listening stations’ to the floor of the Caribbean Sea. The stations should be able to capture the codas of whales chatting up to twelve miles from shore. (Though inaudible above the waves, sperm-whale clicks can register up to two hundred and thirty decibels, which is louder than a gunshot or a rock concert.) The information gathered by the stations will be less detailed than what the tags can provide, but it should be much more plentiful.”
Daniela Rus, a roboticist at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), said the challenge will be to correlate whale codas with behavior. An extensive library of sperm-whale recordings already exists, but the new data will also capture what the whales are doing as they vocalize.
“The question I’ve been asking myself is: Suppose that we set up experiments where we engage the whales in physical mimicry,” Rus said. “Can we then get them to vocalize while doing a motion? So, can we get them to say, ‘I’m going up’? Or can we get them to say, ‘I’m hovering’? I think that, if we were to find a few snippets of vocalizations that we could associate with some meaning, that would help us get deeper into their conversational structure.”
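Rus’s mimicry experiments would effectively produce a labeled dataset: coda snippets paired with whatever the whale was doing when it produced them (“going up,” “hovering,” and so on). The sketch below shows how that pairing could be framed as a routine supervised-learning problem; the random features, the behavior labels, and the random-forest classifier are stand-ins chosen for illustration, not anything CETI or Rus has described.

```python
# Hypothetical sketch: associating coda snippets with concurrent behavior,
# along the lines Rus describes ("I'm going up", "I'm hovering").
# The features, labels, and classifier here are illustrative assumptions only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend tag data: each row is a coda's inter-click-interval vector (seconds),
# and each label is the motion state logged by the tag at the same moment.
X = rng.normal(loc=0.2, scale=0.05, size=(200, 9))
y = rng.choice(["ascending", "descending", "hovering"], size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# With real (non-random) data, accuracy above chance would suggest that some
# coda rhythms carry information about what the whale is doing.
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On this random toy data the classifier can do no better than chance; the point of CETI’s paired recordings is to find out whether real codas do.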
read more at newyorker.com