What do a decapitated whale, an x-ray scanner for solid-fuel rockets, and noise pollution have in common? This is not the start of a really weird joke–they are all major elements in a new paper by Ted Cranford and Petr Krysl, published this week in PLOS ONE. In short, the first went through the second to help understand the impact of the third.
This was, believe it or not, a very logical plan for a research project. The modern ocean is increasingly filled with man-made noise, and some of that noise may be affecting marine mammals. Blue whales, for instance, have lowered the pitch of their songs over the past few decades, possibly in reaction to the rising volume of ship noise in the ocean. However, we don’t actually know how well many of the largest whales hear, because there has been no way to measure it.
We can do experiments with whales small enough to be held in captivity (these are all toothed whales, or odontocetes–dolphins, orcas, etc.). Give them a fish whenever they react to some noise, then turn down the volume until they stop responding. But large baleen whales cannot be kept in captivity, and similar experiments in the wild are much more difficult, if not impossible. So instead of an experiment, Cranford and Krysl used a finite-element computer model to simulate sound waves traveling through the head of a whale. In order to do this, however, they first needed a detailed, 3D model of a whale’s head and skull.
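The paper’s model is a full 3D finite-element simulation built from CT data, which is far beyond a blog post. But the basic idea–propagating a pressure wave through materials with different sound speeds–can be sketched in a few lines. Here is a toy one-dimensional version in Python (plain finite differences rather than finite elements, with rough textbook material values, not the paper’s), just to show a pulse crossing from “soft tissue” into “bone”:

```python
import numpy as np

# Toy 1D acoustic wave simulation (plain finite differences, NOT the paper's
# 3D finite-element model). Material values are rough textbook numbers chosen
# only to illustrate a pulse crossing from soft tissue into bone.

length = 0.5                   # domain length (m)
n = 500                        # grid points
dx = length / n
c = np.full(n, 1500.0)         # sound speed in soft tissue (~water), m/s
c[n // 2:] = 3500.0            # right half of the domain stands in for bone

dt = 0.4 * dx / c.max()        # time step satisfying the CFL stability limit
p_prev = np.zeros(n)           # pressure field at time t - dt
p = np.zeros(n)                # pressure field at time t
p[n // 10] = 1.0               # small initial pressure pulse near the left end

for _ in range(1500):
    lap = np.zeros(n)
    lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx**2   # spatial second derivative
    p_next = 2 * p - p_prev + (c * dt) ** 2 * lap        # leapfrog update of the wave equation
    p_prev, p = p, p_next

print("peak |pressure| in the 'bone' half:", np.abs(p[n // 2:]).max())
```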
Which is where the dead whale comes in. On November 20, 2003, a newborn fin whale washed up on the beach in Orange County, CA, where an unsuccessful attempt was made to save his life (the necropsy suggested he may have been born prematurely). At this point, the whale was decapitated, and the head was frozen inside a four-foot-diameter cardboard tube. It was then shipped to an Air Force base in Utah. The Air Force possesses CT scanners that are used not for medical purposes, but to inspect rocket engines for engineering flaws. And apparently, if you show up with the appropriate paperwork from the Navy and NOAA, they will let you put a whale head through one. Who knew?
With the 3D internal structure of the whale’s head and skull in hand, Cranford and Krysl got down to modeling. They found that the skull transmitted sound to the ears much more effectively than the direct path through the soft tissue over the ears. In the paper, they refer to these two alternatives as the “bone conduction mechanism” and the “pressure mechanism.” Low-frequency sounds actually set the whale’s whole skull vibrating, as shown (with the vibrations greatly slowed down and exaggerated in magnitude) in the memorable animation below:
The two egg-shaped bones doing the shimmy at left are the tympanic bullae, and they are part of the internal ear.
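Why should low frequencies move the whole skull at once? A back-of-envelope check (my own illustration, not a calculation from the paper, and the head length is a made-up placeholder) shows that at these frequencies the wavelength of sound in water and soft tissue is far longer than the animal’s head, so the entire head gets pushed nearly in unison:

```python
# Back-of-envelope check (not from the paper): at low frequencies the wavelength
# of sound in seawater/soft tissue dwarfs the head, so the skull moves as a unit.
c_water = 1500.0        # approximate speed of sound in seawater and soft tissue, m/s
head_length = 1.5       # placeholder head length for the 5.5 m juvenile, m (assumed)

for freq_hz in (10, 30, 100, 300, 1000):
    wavelength = c_water / freq_hz
    print(f"{freq_hz:5d} Hz: wavelength {wavelength:7.1f} m "
          f"= {wavelength / head_length:6.1f} head lengths")
```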
More practically, the authors used their simulations to predict how sensitive the whale’s hearing is across a range of frequencies, usually displayed on a plot called an audiogram. Their fin whale audiogram does involve elements of speculation: their model is based on a “small” whale, only 5.5 meters long (adults grow up to 27 meters), and they had to base the absolute hearing sensitivity on measurements from another species. But with an audiogram in hand, we can predict what kinds of sounds and levels of exposure whales might be able to detect, and what might interfere with their behavior and communication. And if we want to reduce our impact on the ocean’s sonic environment, this is crucial information.
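If you have never seen an audiogram, the sketch below shows the general form of one: hearing threshold (in dB) plotted against frequency, with lower thresholds meaning better hearing. The numbers here are arbitrary placeholders for illustration, not the paper’s predicted fin whale values:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative audiogram with arbitrary placeholder values -- NOT the paper's
# predicted fin whale thresholds. Lower threshold = more sensitive hearing.
freqs_hz = np.array([10, 20, 50, 100, 200, 500, 1000, 2000])
threshold_db = np.array([80, 65, 55, 50, 52, 60, 75, 95])   # made-up thresholds

plt.semilogx(freqs_hz, threshold_db, marker="o")
plt.gca().invert_yaxis()                 # plot better hearing toward the top
plt.xlabel("Frequency (Hz)")
plt.ylabel("Threshold (dB re 1 µPa)")
plt.title("Illustrative audiogram (placeholder values)")
plt.show()
```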
Cranford, T., & Krysl, P. (2015). Fin Whale Sound Reception Mechanisms: Skull Vibration Enables Low-Frequency Hearing PLOS ONE, 10 (1) DOI: 10.1371/journal.pone.0116222