
4-minute read

How music led Daniel DeLeon to study marine AI

Daniel didn’t know what engineering was when he started community college. Now he’s making breakthroughs, using AI to track endangered whales.

When Daniel DeLeon's mother, Betty, first met his father, Narciso, on a church trip in San Blas, Mexico, she didn't speak Spanish and he didn't speak English, so they communicated in a different way: with music.

It's no surprise Daniel, now a 24-year-old student at California Polytechnic State University, grew up with an ear for music himself. His parents formed a traditional Mexican trio band called Trio Guadalupano, and their weekly rehearsals and performances at local quinceañeras, baptisms, and parties formed the soundtrack of his childhood.

In physics class I learned how sound waves propagate into our ear. It was mind-blowing. Those waves create emotions that make us happy, that get us pumped up.

Daniel DeLeon

It was during a community college physics class that Daniel’s love for music would evolve into a fascination with the science of sound. His musical background paired with a newfound understanding of acoustics landed him a highly competitive internship at the Monterey Bay Aquarium Research Institute (MBARI), where he helped scientists Danelle Cline and John Ryan study the ocean by listening to whale communication calls.

The whales are using acoustics to understand each other, just like my parents did when they first met. It made me think about their music and how important it was.

Daniel DeLeon

Images: Daniel DeLeon listening to whale calls through headphones; a waveform of a whale call; Daniel using machine learning software to identify whale calls from an audio feed of the ocean; Daniel at Cabrillo College with his physics professor; Daniel turning up the volume of a whale call.

By tracking the calls of endangered blue and fin whales and their changing migration patterns, scientists can learn a lot about the broader implications of human influence on marine life. Given Daniel's passion for music, John and Danelle figured he wouldn't mind spending a summer listening to ocean sounds from the institute's hydrophone, an underwater microphone that sits 900 meters beneath the ocean's surface. But the task ahead of Daniel was a little more complicated than just listening.

The ocean covers 70 percent of the earth's surface and is very deep. But if you dive down 23 meters, 99 percent of light is gone. In contrast, sound travels thousands of miles. So marine mammals use sound for all of their essential life activities. Just by listening, we learn a lot about their lives.

John Ryan, Biological Oceanographer

In recording sound around the clock, the hydrophone presents scientists with a dilemma: too much data. It would take a dozen lifetimes to thoroughly analyze every bit of recorded audio. So Daniel’s job was to use TensorFlow, Google’s open-source AI tool, to do the tedious work of parsing through the audio files and identifying whale communications in a matter of days, instead of years.

Daniel arrived at his internship having never used TensorFlow, but he was good at math, and at its core, that’s all AI is: a series of algorithms that analyze data and learn to recognize patterns.

Blue and fin whales are some of the loudest animals on earth. Their low-frequency calls can travel across long stretches of ocean, making them excellent candidates for study. MBARI’s hydrophone can hear whales up to 500 kilometers away.

The sound waves recorded by the hydrophone first have to be converted into visual data in the form of a spectrogram, a map of sound frequency and intensity over time. Daniel feeds those spectrograms into the TensorFlow model to teach it what blue and fin whale communication calls look like. Like a puppy being house-trained, AI models learn by repetition: the more examples Daniel feeds in, the more accurate the model becomes. All told, Daniel trained the TensorFlow model on more than 18,000 examples of isolated whale calls.
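For readers curious what that workflow looks like in code, here is a minimal sketch in TensorFlow. The spectrogram size, label set, and model architecture are illustrative assumptions, not MBARI's actual pipeline; the point is simply that audio becomes a spectrogram, and labeled spectrograms train a classifier.

import tensorflow as tf

def audio_to_spectrogram(waveform, frame_length=1024, frame_step=256):
    """Turn a 1-D audio waveform into a log-magnitude spectrogram:
    a picture of sound energy across frequency and time."""
    stft = tf.signal.stft(waveform, frame_length=frame_length, frame_step=frame_step)
    spectrogram = tf.abs(stft)
    # Log scaling tames the huge dynamic range of ocean audio.
    return tf.math.log(spectrogram + 1e-6)

# A small convolutional classifier: a fixed-size spectrogram goes in, and one of
# three assumed labels comes out (blue whale call, fin whale call, background noise).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training is the repetition: each labeled spectrogram nudges the model's weights.
# train_spectrograms would be an array shaped (num_examples, 128, 128, 1) built from
# hand-isolated calls like the 18,000-plus examples mentioned above, and train_labels
# would hold the matching class indices.
# model.fit(train_spectrograms, train_labels, epochs=10, validation_split=0.2)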

AI is just getting the computer to be able to pick up patterns.

Daniel DeLeon

Over time, Daniel was able to teach TensorFlow to identify whale calls with an accuracy of 98.05 percent. The model can differentiate between a blue whale call and a fin whale call, and it can help confirm what time of day each call happened, how loud it was, and how long it lasted.
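Applying a trained model to a continuous hydrophone recording might look something like the sketch below: slide a window across the audio, classify each window, and log the detections. The window length, label names, and loudness measure are assumptions made for illustration, not the team's published method.

import numpy as np
import tensorflow as tf

def detect_calls(waveform, sample_rate, model, window_seconds=15.0,
                 labels=("background", "blue whale", "fin whale")):
    """Slide a fixed window across a hydrophone recording, classify each window,
    and log when a call happened, which species made it, and how loud it was."""
    window = int(window_seconds * sample_rate)
    detections = []
    for start in range(0, len(waveform) - window + 1, window):
        chunk = waveform[start:start + window]
        # Reuse the spectrogram helper from the earlier sketch, then match the
        # classifier's expected input shape.
        spec = audio_to_spectrogram(tf.constant(chunk, dtype=tf.float32))
        spec = tf.image.resize(spec[..., tf.newaxis], (128, 128))
        probs = model.predict(spec[tf.newaxis, ...], verbose=0)[0]
        species = labels[int(np.argmax(probs))]
        if species != "background":
            detections.append({
                "time_s": start / sample_rate,                              # when the call happened
                "species": species,                                         # blue whale vs. fin whale
                "loudness_rms": float(np.sqrt(np.mean(np.square(chunk)))),  # how loud it was
                "duration_s": window_seconds,                               # coarse length of the detection window
            })
    return detections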

Audio samples: a blue whale call and a fin whale call.

We’re at a very pivotal point in ocean science. It’s a really interesting time for AI too, because we are finally starting to solve problems that we weren’t able to solve even five years ago.

Danelle Cline, Senior Software Engineer

Daniel’s research with marine AI has helped John and Danelle build a foundation for automating the detection and classification of whale calls. They’re now able to spend more time on the big questions: how the eons-old migration patterns of these massive creatures are changing, and what that teaches us about the impact that human activity above the water, from noise pollution to climate change, is having on the marine life below.

Images: Daniel DeLeon at the Monterey Bay Aquarium Research Institute looking out at the ocean; two fin whales at the surface.

I never thought about being a scientist. I didn’t think I was capable of doing it. My curiosity for the world, for the universe in general, fed the fire [in me] to give it a shot.

Daniel DeLeon

Watch Daniel’s journey unfold in the film below.


Related Stories

Meet the team using machine learning to help save the world's bees

A tribe uses cell phones and TensorFlow to fight illegal logging in the Amazon

How six young women are using technology to tackle unsafe drinking water

How Traktivist Highlights Asian American Artists

How AI is helping provide advance warnings in emergency situations

Identifying wildlife with AI and motion-triggered cameras
