NEW WORLDS COME INTO FOCUS AS HUMANS EXTEND THEIR SENSORY APPARATUS
Eavesdropping Renaissance
Our entire world is awash with information-rich signals that exist beyond our human senses — sounds our ears cannot hear, magnetic fields unnoticed by our skin, and scents our noses do not perceive.
The breath you just exhaled can tell a story to a doctor armed with the NASA E-Nose, a non-invasive diagnostic device that attaches to a phone and uses olfaction-based AI. By scanning the molecules in your exhale for biomarkers, it can determine, almost instantly, whether you have COVID or another disease.
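For readers who want a feel for how olfaction-based screening works under the hood, here is a minimal sketch in Python: a simple classifier trained on readings from a breath-sensor array. The sensor channels, data, and probabilities are invented for illustration and are not NASA's actual model.

```python
# Minimal sketch of olfaction-based screening: a classifier over readings from a
# hypothetical chemical sensor array. All channels and data are made up; a real
# device like NASA's E-Nose uses its own sensors and trained models.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each breath sample yields 8 sensor-channel readings (arbitrary units).
healthy = rng.normal(loc=1.0, scale=0.2, size=(50, 8))
infected = rng.normal(loc=1.4, scale=0.2, size=(50, 8))   # shifted biomarker signature

X = np.vstack([healthy, infected])
y = np.array([0] * 50 + [1] * 50)                          # 0 = healthy, 1 = infected

model = LogisticRegression().fit(X, y)

# Screen a new breath sample: the model returns a probability, not a diagnosis.
new_sample = rng.normal(loc=1.35, scale=0.2, size=(1, 8))
print(f"Probability of infection signature: {model.predict_proba(new_sample)[0, 1]:.2f}")
```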
And it is not just signals hidden in an exhale. We are experiencing an eavesdropping renaissance.
A Planetary Scale Hearing Aid
For decades, researcher Karen Bakker explains, “Western scientists just assumed turtles were mute and deaf. But, as it turns out,” she tells us, “it was humans who couldn’t hear.” Much of the zoological world tends to converse in frequencies below or above our hearing range, in the infrasound and ultrasound spectra. Bakker says that modern “digital bioacoustics functions as a planetary-scale hearing aid,” and though it may take us 20 years, we will eventually create a botanical and “zoological version of Google Translate.”
A Brotherly Call Across the Largest Habitat on Earth
In the global effort to gather audio and contextual data on sperm whales, marine biologist Shane Gero and his team witnessed a remarkable conversation between two sperm whale siblings, one in Norway and the other off the Caribbean island of Dominica, more than 6,000 miles apart. The two whales were communicating in codas, a whale language that sounds like a cross between a fax machine and the clicking taps of a Morse code key.
“We could tell the whales were related,” said Gero in a lecture at the American Museum of Natural History, “because of the coda dialect they used.”
This brotherly call across the largest habitat on Earth occurred in a particular ocean depth called the SOFAR (Sound Fixing and Ranging) channel. It lies roughly a thousand meters below the surface, and because of the distinct temperature and pressure of the water layers above and below it, sound is refracted back into the channel and maintains its integrity over far greater distances.
Attempting to translate an ancient, non-human language this complex is unprecedented. Sperm whales with the characteristics they have today are believed to have started communicating in the early Miocene, about 20 million years ago. Humans, by contrast, are estimated to have begun talking about 300,000 years ago. One AI analysis found that whales on the receiving end replicate extended patches of these messages moments, days, and even years later.
Perhaps this is why a whale coda, compared to human language, appears to carry an exponentially denser load of information. Crammed into a typical coda of 2 to 40 broadband, omnidirectional pings are approximately 1,600 precise micro clicks of information. Another complicating factor, says Gero, is that “it’s not rude in sperm whale society to talk at the same time and overlap one another,” so translating these dense recordings will be a colossal feat.
The Cetacean Translation Initiative, CETI, was created to understand the language of whales. It has an international footprint, 15 institutions in 8 countries, with about 30 experts in computer science, marine biology, engineering, satellite imaging, and acoustics. The team designed a wearable recording device called a DTAG, with soft suction cups inspired by octopus tentacles that allow it to attach firmly to the head of a 50-foot sperm whale moving 1,000 feet below the surface in high-pressure darkness. To date, using spectrographic visualization of whale audio and activity data, researchers believe they have isolated a few whale-language vowels.
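A rough sense of that spectrographic step can be sketched with standard signal-processing tools. The click train below is synthetic and merely stands in for a real DTAG recording; it shows how broadband clicks pop out of a spectrogram.

```python
# Sketch of the spectrographic view researchers use to inspect coda clicks.
# The click train below is synthetic; a real analysis would load DTAG audio.
import numpy as np
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

fs = 96_000                                   # sample rate in Hz
t = np.arange(0, 2.0, 1 / fs)                 # two seconds of "recording"
audio = 0.01 * np.random.randn(t.size)        # background ocean noise

# Place a handful of short broadband clicks with roughly coda-like spacing.
for click_time in [0.20, 0.45, 0.65, 0.80, 1.10, 1.55]:
    idx = int(click_time * fs)
    audio[idx:idx + 200] += np.hanning(200) * np.sin(2 * np.pi * 12_000 * t[:200])

freqs, times, power = spectrogram(audio, fs=fs, nperseg=1024)

plt.pcolormesh(times, freqs / 1000, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (kHz)")
plt.title("Synthetic coda-like click train")
plt.show()
```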
Other organizations use hydrophones, low-orbit satellites, and signals from an animal’s wearable device to support goals far outside the realm of language. They power an app called WhaleSafe, which lets tanker captains know the likely trajectory of a whale near their ship so they can alter course and avoid vessel strikes, a leading cause of death for large whales. The app is voluntary, but to date about 60% of container-ship captains subscribe and actively use the alerts.
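WhaleSafe’s internals are not described here, but the underlying idea can be sketched as a toy alert: extrapolate a whale’s recent detections and warn any ship whose course passes too close. The positions, speeds, and safety radius below are hypothetical.

```python
# Toy sketch of a vessel-strike alert: extrapolate a whale's recent track and
# warn if a ship's planned position falls within a safety radius. The positions
# and threshold below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Position:
    x_km: float   # east-west offset from a reference point
    y_km: float   # north-south offset

def extrapolate(track: list, hours_ahead: float) -> Position:
    """Linear extrapolation from the last two detections (hydrophone or tag)."""
    last, prev = track[-1], track[-2]
    return Position(last.x_km + (last.x_km - prev.x_km) * hours_ahead,
                    last.y_km + (last.y_km - prev.y_km) * hours_ahead)

def needs_alert(whale: Position, ship: Position, safety_km: float = 5.0) -> bool:
    distance = ((whale.x_km - ship.x_km) ** 2 + (whale.y_km - ship.y_km) ** 2) ** 0.5
    return distance < safety_km

whale_track = [Position(0.0, 0.0), Position(1.5, 2.0)]      # detections one hour apart
ship_position_in_one_hour = Position(3.5, 3.5)

predicted = extrapolate(whale_track, hours_ahead=1.0)
if needs_alert(predicted, ship_position_in_one_hour):
    print("Alert: recommend course change to avoid a possible strike.")
```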
WhaleSafe is one Eco Band-Aid currently being rolled out; Pop-Up Wetlands is another.
Pop-Up Wetlands
Shorebirds can only fly for so long. With the increasing density of farms and development in California’s Central Valley, the aerial nomads have had trouble finding a safe rest stop during their seasonal migrations. Ninety percent of the marshlands that shorebirds used to frequent in the US are now farms.
When numerous shorebirds showed up on the endangered species list, The Nature Conservancy, Cornell’s Ornithology Lab, and NASA worked together on a creative short-term solution. They use AI to filter real-time satellite imagery for migrating flocks and alert the conservancy team when birds are approaching. The conservationists then put out offers online to pay farmers, via a reverse auction, to flood their fields and create a resting oasis for the tired birds.
The farmers welcomed the opportunity to create Pop-Up Wetlands, and it worked so well for the avian travelers that this year California announced the largest shorebird population ever recorded in the state.
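For the curious, the reverse-auction step can be sketched in a few lines: farmers submit bids to flood their fields, and the program accepts the cheapest bids until enough temporary habitat is secured. The bids and acreage target below are invented.

```python
# Sketch of the reverse-auction idea: accept the lowest-priced bids per acre
# until the target amount of temporary wetland is secured. All numbers invented.

bids = [
    {"farm": "A", "acres": 120, "price_per_acre": 45},
    {"farm": "B", "acres": 300, "price_per_acre": 30},
    {"farm": "C", "acres": 80,  "price_per_acre": 60},
    {"farm": "D", "acres": 200, "price_per_acre": 35},
]

target_acres = 500
accepted, secured, cost = [], 0, 0

for bid in sorted(bids, key=lambda b: b["price_per_acre"]):
    if secured >= target_acres:
        break
    accepted.append(bid["farm"])
    secured += bid["acres"]
    cost += bid["acres"] * bid["price_per_acre"]

print(f"Accepted farms {accepted}: {secured} acres flooded for ${cost:,}")
```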
Listening In On A Garden
The lab of biologist Lilach Hadany is focused on the Umwelt, or sensory world, of plants. AI compiles ultrasonic sounds and the electric missives of insects alongside data on a plant’s nectar production. The researchers discovered that when a leaf is nibbled by a caterpillar, or when a plant grows parched, it cries out at frequencies far above human hearing, and these ultrasonic distress calls evoke responses from nearby plants and insects.
The communication also goes both ways: a primrose, upon hearing the distinct buzz of a bee, boosts its nectar production by 20% to attract the pollinator.
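A toy version of the listening setup might look like this: high-pass a recording above human hearing (about 20 kHz) and flag bursts of energy. The synthetic signal and threshold are stand-ins; they are not the lab's actual recordings or methods.

```python
# Sketch of how ultrasonic plant emissions might be flagged in a recording:
# filter out everything below ~20 kHz and mark energy spikes. The synthetic
# signal and threshold below are invented for illustration.
import numpy as np
from scipy.signal import butter, sosfilt

fs = 250_000                                    # sample rate in Hz
t = np.arange(0, 1.0, 1 / fs)
recording = 0.01 * np.random.randn(t.size)      # background noise

# Inject two brief 50 kHz "pops" like those reported from stressed plants.
for start in (0.3, 0.7):
    idx = int(start * fs)
    recording[idx:idx + 500] += 0.2 * np.sin(2 * np.pi * 50_000 * t[:500])

sos = butter(4, 20_000, btype="highpass", fs=fs, output="sos")
ultrasound = sosfilt(sos, recording)

# Flag 10 ms windows whose energy stands far above the median.
window = int(0.01 * fs)
energy = np.array([np.sum(ultrasound[i:i + window] ** 2)
                   for i in range(0, ultrasound.size - window, window)])
events = np.where(energy > 10 * np.median(energy))[0] * 0.01
print("Possible ultrasonic events at (s):", np.round(events, 2))
```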
Without extensive sensory aids and AI as the universal translator, this nuanced interplay of acoustic and electrical signals between flora and fauna would have remained hidden. So would the invisible olfactory signals behind another Eco Band-Aid used by the farming industry, one that lures moths away from apple and other fruit orchards by releasing mating pheromones in an adjacent field.
By letting an AI synthesize multiple data streams, researchers found another silent pattern in the flow of blood inside our own heads. It turns out that, once trained, a GPT language model can come remarkably close to translating our innermost thoughts.
A Penny For Your Thoughts
A stroke occurs about every 40 seconds in the United States. Strokes are a leading cause of long-term disability, and scientists are striving to help people with varying degrees of locked-in syndrome regain their ability to communicate with the world.
Last year, four researchers at the University of Texas at Austin, Jerry Tang, Amanda LeBel, Shailee Jain, and Alexander Huth, published their mind-reading results in Nature Neuroscience. They call the technique semantic decoding because, unlike implanted brain-computer interfaces, it is non-invasive. They connected an early GPT language model to real-time fMRI images of three subjects’ brain activity, along with a data stream of the entertainment those subjects were hearing, reading, and seeing. The subjects listened to stories from the radio show The Moth, read poems aloud, and watched captioned movies. After 16 hours of training, the researchers tested the AI to see whether it could understand, in real time, what the participants were thinking solely from fMRI readings of blood-flow activity in three regions of each subject’s brain.
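To give a flavor of the decoding idea, here is a toy sketch, not the authors' actual pipeline: a language model proposes candidate sentences, an "encoding model" predicts the brain response each would evoke, and the candidate whose prediction best matches the observed fMRI signal wins. Everything below, from the pseudo-embeddings to the voxels, is synthetic.

```python
# Toy sketch of semantic decoding (not the authors' exact pipeline): score
# candidate sentences by how well their predicted brain responses match an
# observed fMRI signal, and keep the best match. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)

def embed(text: str) -> np.ndarray:
    """Stand-in for language-model features: a hash-seeded pseudo-embedding."""
    state = np.random.default_rng(abs(hash(text)) % (2**32))
    return state.normal(size=16)

# Hypothetical encoding model mapping text features to 8 "voxel" responses.
encoding_weights = rng.normal(size=(16, 8))
def predict_bold(text: str) -> np.ndarray:
    return embed(text) @ encoding_weights

# Pretend the participant heard this phrase; the scanner sees a noisy response.
heard = "i have not learned to drive yet"
observed_bold = predict_bold(heard) + 0.1 * rng.normal(size=8)

candidates = [
    "she has not started to learn to drive",
    heard,
    "we talked for hours about the weather",
]

def score(candidate: str) -> float:
    """Correlation between the predicted response and the observed one."""
    return float(np.corrcoef(predict_bold(candidate), observed_bold)[0, 1])

print("Decoded guess:", max(candidates, key=score))
```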
One participant listened to the sentence, “I don’t have my driver’s license yet.”
The AI translated her thoughts to, “She has not even started to learn to drive yet.”
Another participant heard the following sentences: “I didn’t know whether to scream, cry or run away. Instead, I said, ‘Leave me alone!’”
The AI translated that into “Started to scream and cry, and then she just said, ‘I told you to leave me alone.’”
Another participant heard, “We start to trade stories about our lives. We’re both from up north.”
The AI decoded, “We started talking about our experiences in the area he was born. I was from the north.”
Uncannily, the AI was able to accurately glean the gist of their thoughts. The study has been peer-reviewed, and multiple labs have confirmed that semantic decoding with GPT-style language models, trained on an individual’s blood-flow activity in the brain, can reveal some of that person’s thoughts.
Our Near Future Will Make The Bar Scene In Star Wars Look Tame
Mind reading and inter-species communication are only the beginning of this eavesdropping renaissance. Biologist Michael Levin, whose lab focuses on translating the bioelectric language cells use to collaborate when building an eye, believes we’ve hardly scraped the surface of what we will soon understand. He says our near future is going to make the bar scene in Star Wars look tame.
Future Intelligence columns will delve into the emerging worlds brought into focus by our planetary hearing aids and advancing AI tools. We will ask whether AI’s power to reveal exceeds our ability to wield it judiciously, and we will watch how the shifting boundary of the self has both expanded and collapsed our individual agency.
The evolutionary biologist Stephen Jay Gould believed that the survival of our species depends on experiences of awe that bring humans closer to the natural world. “We cannot win this battle to save species and environments,” he wrote, “without forging an emotional bond between ourselves and nature,” for “we will not fight to save what we do not love.”
The intelligence discoveries are ceaseless, the new possibilities for connection are endless, and it is my hope that the short-term fixes we call Eco Band-Aids will buy us a bit more time on this beautiful earth.
_____________________
Petra Franklin is a steward of the Seattle tech community, a prolific venture capitalist, and the cofounder of Dwehl. She is a pro-science public policy thinker dedicated to the ethical issues of neuroscience, biotech, and emerging technologies. She was a founding member of the Women’s Bioethics Project, chair of the National Science Foundation’s advisory board for the Material Science Center, and worked for Sun Microsystems, PBS, and CBS. She is a regular participant in the communities around the global startup accelerator Techstars and the news site GeekWire.