Newswise — Meek and mighty animal sounds are all around us. In a few weeks, some of the most interesting among them will be discussed and heard at the largest meeting ever devoted to acoustical science, the Acoustics '08 Paris meeting, to be held Monday, June 30 through Friday, July 4 at the Palais des Congrès in Paris, France.

Animal bioacoustics is just one dimension of Acoustics '08 Paris. It is the field within acoustical science devoted to analyzing the sounds of animals near and far -- the laughing, whooping, and groaning calls of hyenas, for instance, or the chirps of songbirds. Another active area within animal bioacoustics is how sonar and other human-made sounds affect marine and land animals.

Some of the most interesting animal sounds are those we cannot hear -- like the croaks made by a rare species of frog that communicates ultrasonically or the high-pitched noises bats use to find their way through the dark. Acoustical research helps us better understand the wide world of nature, and it is also guiding the development of new technologies that will improve many people's lives, such as a new sonar device to help the blind navigate their environment.

This news release highlights just a few of the animal bioacoustics talks at Acoustics '08 Paris. More details on the other 3,500 presentations at the meeting, along with instructions for journalists who wish to cover the meeting, can be found at the bottom of this release.

HIGHLIGHTS IN THIS RELEASE
1) How Noise Affects Marine Mammals
2) Birds Changing Their Tune
3) Hyena Giggles and Groans
4) Chinese Frogs Go Ultrasonic
5) Better Recording of Animals in the Wild
6) Bats Can Direct Their "Gaze"
7) How Bats Compensate for Ranging Errors
8) A Sonar System for the Blind
9) Taking Aural Cues from Flipper
10) Acoustic Techniques for Monitoring Bird Migration
11) Conservation and the Tiger's Roar

1) HOW NOISE AFFECTS MARINE MAMMALS. The effects of sonar and other human-made ocean noise on marine mammals have traditionally been classified as either injury or disruption of behavior. The earliest concern was that elevated noise could reduce the range of communication by masking faint signals. Few studies have documented this effect, according to Peter Tyack of the Woods Hole Oceanographic Institution, but recent work emphasizes the various mechanisms animals use to compensate for elevated noise.

Tyack will present initial results from a study of the behavioral responses of beaked whales and other whales to sonar and other sounds. The study was conducted at the Atlantic Undersea Test and Evaluation Center (AUTEC) range near Andros Island in the Bahamas, where beaked whales can regularly be detected using passive acoustic monitoring of their echolocation clicks. A tagged beaked whale responded to both sonar and killer whale sounds by prematurely ceasing to click during foraging dives and by making an unusually slow, long ascent. [Papers 1aID1 and 1pAB3]

Several other scientists who study the effects of noise and other human activities on marine mammals are also presenting papers at the Paris meeting. James Finneran of the U.S. Navy Marine Mammal Program will report on recent data on temporary threshold shifts (TTS) -- a temporary elevation of the hearing threshold, in species such as bottlenose dolphins and belugas, that persists for some time after a noise has ended [Paper 1pAB7]. TTS depends on the frequency of the noise exposure, as well as its sound pressure, duration, and temporal pattern, according to studies comparing hearing thresholds before and after subjects are exposed to intense sounds. Carmen Bazua of the National Autonomous University of Mexico (UNAM) will report on her studies of the effects of vessels and swimmers on the behavior of spinner dolphins [Paper 1pAB2]. And NOAA's Marla Holt will present new results from her investigation of noise effects on the call amplitude of endangered killer whales [Paper 1pAB6].
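In quantitative terms, a threshold shift is simply the difference, in decibels, between the hearing threshold measured shortly after an intense exposure and the baseline threshold measured beforehand. A TTS of 6 dB, for example, means a sound must be 6 dB more intense than before to be just barely detectable, until hearing recovers.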

- Paper 1aID1, "How sound from human activities affects marine mammals" will be presented at 10:30 a.m. on Monday, June 30 in room 342B.
- Paper 1pAB3, "Effects of sound on the behavior of toothed whales" will be presented at 1:40 p.m. on Monday, June 30 in room 342B.
- Paper 1pAB7, "Effects of noise on hearing in odontocetes" will be presented at 3:00 p.m. on Monday, June 30 in room 342B.
- Paper 1pAB2, "Effects of vessels and swimmers on the spinner dolphins (Stenella longirostris) off the Big Island of Hawaii" will be presented at 1:20 p.m. on Monday, June 30 in room 342B.
- Paper 1pAB6, "Investigating noise effects on the call amplitude of endangered Southern Resident killer whales (Orcinus orca)" will be presented at 2:40 p.m. on Monday, June 30 in room 342B.

2) BIRDS CHANGING THEIR TUNE. With urban noise on the rise as human activity encroaches further and further into natural habitats, how flexible a species is in its acoustic adaptations could determine which birds remain common in urban environments in the future and which retreat to quieter habitats. Hans Slabbekoorn of Leiden University in the Netherlands studies variation in birdsong; birds rely on these vocalizations to defend their territories and attract mates, but more and more avian breeding areas are being affected by human activity. Some birds have developed counterstrategies involving changes in the loudness, pitch, and timing of their songs. Observational data collected to date have revealed numerous interesting birdsong patterns, evincing remarkable flexibility in species such as great tits, song sparrows, house finches, and European blackbirds.

At the meeting, Slabbekoorn will present data on a new species able to shift the frequency of its song upward to adapt to urban noise: the chiffchaff (Phylloscopus collybita), recorded along a highway in the Netherlands. "The field data reveal for the first time that this species is able to adjust immediately to exposure to highway noise via artificial playback in quiet territories," he says. "Acoustic flexibility may be key to efficient use of the 'left-over acoustic space,' and may determine whether individual birds can maintain their territory and breed successfully." At risk are species unable to shift their song frequencies in response to increased traffic noise; their chances of mating successfully are reduced, and their populations may be declining in urban areas because of this interference with communication.

Paper 1pAB4, "Acoustic flexibility in singing birds under noisy urban conditions" will be presented at 2:00 p.m. on Monday, June 30 in room 342B.

3) HYENA GIGGLES AND GROANS. A hyena's laugh might be less telling than its groan, according to researchers at the University of California, Berkeley, who have studied the acoustic properties of various vocalizations of the spotted hyena (Crocuta crocuta), also known as the "laughing hyena." The laugh -- dubbed a "giggle" by those who study the animal -- is used most often in competitive situations, such as squabbling over prey, while loud whooping calls are used for long-distance communication. Groans, however, are the most common vocalization for short-range communication in this highly social species. Groans vary from growling noises to more tonal sounds, according to Berkeley's Frederic Theunissen, who will report new results from recent field studies of hyenas.

To decipher the meaning of these vocal signals, Theunissen and his colleagues Suzanne Page and Steve Glickman presented adult hyenas with three objects: unfamiliar spotted hyena cubs, meaty bones, and the empty transport cage used to hold the bones or cubs in other trials. The cubs elicited more groans, from more hyenas, than the other objects did, and the groans elicited by the other objects were less tonal, with a lower fundamental frequency. The researchers conclude that hyena groans can be classified into different groups based on their acoustic characteristics, and that hyenas modulate the sounds they produce in different behavioral contexts.

However, the exact meaning of specific types of groans remains unclear. A groan directed at a cub might be friendly if produced by a mother toward her own cub, or it might be a warning signal to an unrelated cub. The number of recorded interactions is still too small to draw definitive conclusions. The Berkeley group would like to record the vocalizations of hyena cubs in future studies to better establish the meaning of these communications.

Paper 4pABd1, "Vocalizations of the Spotted Hyena (Crocuta crocuta): Eliciting Acoustic Variation in Groans" will be presented at 5:20 p.m. on Thursday, July 3 in room 343.

4) CHINESE FROGS GO ULTRASONIC. Among vertebrates, only a few species are known to produce and detect ultrasonic frequencies for communication and echolocation -- bats, dolphins and whales, and some rodents -- which suggested that this ability might be limited to mammals. However, UCLA's Peter Narins and his colleagues recently uncovered the first evidence of ultrasonic communication in an amphibian: the concave-eared torrent frog (Amolops tormotus), found at Huangshan Hot Springs, China. The males of the species produce diverse, birdlike melodic calls with strong ultrasonic components, and the females respond with similar calls, enabling the males to locate them with extraordinary accuracy.

Narins will present recent acoustic recordings of the frogs in their natural habitat. He hypothesizes that the extension of the frog's hearing sensitivity into the ultrasonic range may have evolved in response to the intense, predominantly low-frequency ambient noise of the local streams in that region of China.

Paper 5aABa21, "Ultrasonic production and reception in frogs: Lessons from Asia" will be presented at 5:20 p.m. on Friday, July 4 in room AMPHI BLEU.

5) BETTER RECORDING OF ANIMALS IN THE WILD. It can be quite challenging for researchers to record and analyze animal sounds in the field because of obstructions, irregular sound-propagation patterns, the diversity of bioacoustic sources, and ambient noise. To address this problem, a team of researchers led by MIT's Lewis Girod and Samuel Madden, along with UCLA's Daniel Blumstein, has developed VoxNet, a hardware and software platform for distributed acoustic monitoring applications. The hardware must be robust enough to survive deployment in the field, and it must operate wirelessly: wired connections are logistically difficult in the field, where cords become tangled, connectors fail, and wildlife may chew on the wires.

Furthermore, the network and the sensors must be able to configure and calibrate themselves for orientation, rather than relying on GPS, which is often unavailable in heavily shaded environments. Using many nodes complicates matters further, since each must be maintained individually and kept in sync. Girod's objective is to move beyond systems that must be programmed up front to collect data for later analysis in the lab. "Our vision is to make VoxNet an interactive tool for use in the field," says Girod. "A scientist should be able to note something that just happened a second ago, and immediately 'drill down' on that event with an interactive signal-processing tool box."

Each VoxNet node is a portable, self-contained processor with a small four-channel acoustic array. Using a distributed set of nodes, a forested habitat can be monitored and the behavior of animals recorded and analyzed acoustically. The team deployed VoxNet in a recent bioacoustic census at the Chajul Biological Field Station in Chiapas, Mexico, located in a region of dense rain forest that is home to Mexico's most diverse ecosystem. VoxNet performed well in this harsh environment, despite a few audio glitches caused by the high humidity. So far, the team has recorded a set of raw data as part of an ongoing project to obtain local census estimates based on observations of bird calls.
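To give a sense of the kind of on-node processing a distributed acoustic monitoring platform depends on, the sketch below flags loud events in a multichannel recording so that only candidate detections need to be streamed or stored. It is a minimal illustration under our own assumptions (the function name, frame length, and threshold are invented for this example); it is not VoxNet's actual software.

```python
# Minimal sketch of an energy-based event detector of the kind a field
# recording node might run. Illustrative only; not VoxNet's software.
import numpy as np

def detect_events(audio, sample_rate, frame_ms=50, threshold_db=10.0):
    """Return (start_s, end_s) spans where short-term energy exceeds the
    median background level by threshold_db decibels.
    audio: 2-D array of shape (samples, channels), e.g. four channels."""
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = audio.shape[0] // frame_len
    frames = audio[:n_frames * frame_len].reshape(n_frames, frame_len, -1)
    # Energy per frame, summed across all microphone channels.
    energy = (frames ** 2).sum(axis=(1, 2))
    energy_db = 10 * np.log10(energy + 1e-12)
    background = np.median(energy_db)
    active = energy_db > background + threshold_db
    events, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append((start * frame_ms / 1000, i * frame_ms / 1000))
            start = None
    if start is not None:
        events.append((start * frame_ms / 1000, n_frames * frame_ms / 1000))
    return events
```

In an interactive system of the kind Girod describes, a detector like this would run continuously on each node, and a scientist could then "drill down" on any flagged span with further signal-processing tools.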

Paper 2aAB15, "Experience with VoxNet: a rapidly-deployable acoustic monitoring system for bio-acoustic studies" will be presented at 1:00 p.m. on Tuesday, July 1 in room 342B.

6) BATS CAN DIRECT THEIR "GAZE." The echolocating bat controls the direction and distance of its acoustic "gaze," according to Cynthia Moss of the University of Maryland. The bat produces ultrasonic vocalizations and uses information contained in the returning echoes to build a 3D auditory "image" of its surroundings. The timing, bandwidth, and duration of its echolocation signals directly affect the information available to this acoustic imaging system. In turn, the bat's auditory representation of space guides its actions: ear movements, head aim, flight path, and the features of subsequent vocalizations. Moss's latest research indicates that bats encountering a complex environment shift the direction and distance of their sonar gaze to inspect closely spaced objects sequentially.

At the meeting, Moss will summarize the bat's adaptive vocal behavior as it engages in complex spatial tasks. Her team recorded the bat's 3D flight path with high-speed stereo infrared video and recorded its sonar signals with a microphone array that allowed them to reconstruct the emitted beam. They found that the big brown bat (Eptesicus fuscus) "points" its sonar beam in different directions to inspect objects in its environment. The bat's beam is wide enough (a 60- to 90-degree cone) that it could, in principle, gather information about closely spaced objects in a complex environment simply by aiming in their general direction. Instead, the bat points the center of its beam at these objects sequentially.

According to Moss, this suggests that the bat analyzes acoustic information from closely spaced objects separately. The bat also adjusts the duration of its calls to avoid overlap between its vocalizations and the returning echoes: when objects are close, it produces shorter sonar calls than when they are farther away. In Moss's latest experiments, the bat encountered an obstacle closer to it than an edible food reward, and the adjustments it made to the duration of its calls indicated to the researchers whether it was paying attention to the near or the distant object.
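The timing constraint behind this strategy is straightforward to work out: an echo from an object at distance d returns after a round trip of 2d at the speed of sound, roughly 343 m/s in air. An object 1 m away therefore sends its echo back after about 5.8 milliseconds, while an object 5 m away allows calls of up to roughly 29 milliseconds before call and echo would overlap.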

Paper 2pABa1, "The echolocating bat controls the direction and distance of its acoustic gaze" will be presented at 2:20 p.m. on Tuesday, July 1 in room 342B.

7) HOW BATS COMPENSATE FOR RANGING ERRORS. Echolocating bats obtain 3D images of their surroundings in complete darkness by emitting sonar signals and evaluating the returning echoes. When flying close to objects, bats risk collision and therefore depend on the accuracy of those auditory images -- particularly the perceived distance of obstacles and targets, which is coded by the time delay between call and echo. Yet during flight this accuracy is perturbed, first because bats emit calls and receive echoes at different positions, and second because echoes are modified by Doppler shifts, according to Marc Holderied of the University of Bristol, who studies how flying bats compensate for these ranging errors.

Holderied has found that combining the two types of error -- both of which depend on flight speed -- produces an interesting spatial distribution of the total ranging error. Objects at one particular distance from the bat have zero ranging error, while the error increases for closer or more distant objects. This distance of zero error depends on signal design, in particular the sweep rate, according to Holderied. By adjusting signal design, flying bats could shift this distance adaptively toward their target of interest. Because this resembles focusing -- accommodation -- in vision, the distance is called the distance of focus.
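A back-of-the-envelope sketch (our own illustration of the general idea, not Holderied's formulation) shows how two speed-dependent errors can cancel at a single range: the bat's motion between call and echo shortens the apparent range by an amount that grows with distance, while a Doppler shift mis-registers the delay of a frequency sweep by an amount that is roughly independent of distance, so the two cancel at a range set by the call's frequency and sweep rate rather than by flight speed.

```python
# Back-of-the-envelope sketch of how two speed-dependent ranging errors can
# cancel at a single distance. This is an illustration of the general idea,
# not Holderied's formulation, and the example numbers are arbitrary.
C = 343.0  # speed of sound in air (m/s)

def motion_error(d, v):
    """Apparent range error (m) from the bat moving between call and echo.
    It shortens the apparent range and grows with the true range d (m)
    for flight speed v (m/s)."""
    return -(v / C) * d

def doppler_error(v, f, sweep_rate):
    """Apparent range error (m) from a Doppler shift mis-registering a linear
    FM sweep of rate sweep_rate (Hz/s) around call frequency f (Hz). For a
    downward sweep it lengthens the apparent range and is roughly independent
    of the true range."""
    delta_f = (2 * v / C) * f               # Doppler shift of the returning echo
    return (C / 2) * (delta_f / sweep_rate)

def distance_of_focus(f, sweep_rate):
    """Range (m) at which the two errors are equal and opposite; flight speed
    cancels out, so this distance is set by signal design alone."""
    return C * f / sweep_rate

# Example: a 50 kHz call swept at 5 MHz/s gives a distance of focus near 3.4 m,
# whatever the flight speed.
d_focus = distance_of_focus(50_000, 5_000_000)
print(d_focus)                                # ~3.43 m
print(motion_error(d_focus, 5.0))             # ~ -0.05 m
print(doppler_error(5.0, 50_000, 5_000_000))  # ~ +0.05 m
```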

Holderied will report on new results from his studies of how 18 different species of bat control the spatial distribution of ranging errors by signal sweep rate. "The challenging task is to relate their echolocation behavior to their flight behavior, particularly with respect to obstacles and prey," he says. To that end, he employs 3D tracking techniques combined with 3D laser scans of bat habitats to study their adaptive calling behavior. At the meeting, he will present examples of actual distances of focus for different bat species in different behavioral contexts, such as search flight, obstacle avoidance, and target approach.

Paper 2pABa9, "Acoustic focussing: how flying bats control spatial distribution of Doppler-ranging errors by signal sweep rate" will be presented at 6:20 p.m. on Tuesday, July 1 in room 342B.

8) A SONAR SYSTEM FOR THE BLIND. Echolocation is a method of perceiving the world by emitting noises and listening to the reflections of those noises off objects in the environment. Animals use echolocation for hunting and navigation, and some visually impaired people also employ it as part of their orienting repertoire as they navigate the world. A few rare individuals can echolocate very well without assistance; to enhance the auditory cues available to everyone else, researchers at Boston University have developed a prototype navigation device. The device emits an ultrasonic click, inaudible to humans, several times per second, and each click reflects off objects in the environment. The reflections are detected by special head-mounted microphones, and computer processing converts the ultrasonic signals into audible ones, which the user hears over custom open-ear earphones.

The end result is an "auditory image" in which objects in the environment seem to emit sounds, with objects of different shapes and textures emitting subtly different sounds, so that the user can distinguish between them. According to BU researcher Cameron Morland, the distinct acoustic characteristics of the reflections enable the user to judge the location, size, and surface properties of objects. For instance, sounds reflected by an object to the left arrive at the left ear slightly sooner and slightly louder (an interaural time difference and an interaural level difference).

Furthermore, sweeping the device across a surface while remaining at the same distance from it produces a reflection whose pitch does not change if the surface is flat, since nothing moves toward or away from the user. If the surface is tilted so that the reflecting area moves closer, the reflection sounds higher in pitch; tilted the other way, it sounds lower in pitch (a Doppler shift). A roughly textured surface has some regions that are closer and others that are farther away, and users can readily learn to recognize the resulting pattern of rising and falling pitch. "Venetian blinds sound quite different than a flat surface, or a bookshelf packed with different-sized books," says Morland.
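As a simple illustration of how ultrasonic reflections can be made audible while keeping these cues intact, the sketch below uses time stretching: slowing the captured signal down divides every frequency in it by the same factor while preserving interaural level differences and scaling interaural time differences identically for both ears. This shows one standard approach under our own assumptions (sample rates, factor, and function name are invented for the example); it is not necessarily the BU team's actual processing.

```python
# Make ultrasonic reflections audible by capturing at a high sample rate and
# playing back more slowly. Illustrative sketch only; not the BU device's code.
import numpy as np

def downshift_by_time_stretch(left, right, capture_rate=192_000, factor=8):
    """Return (left, right, playback_rate). Playing the unchanged samples back
    at capture_rate / factor divides every frequency by `factor` (40 kHz maps
    to 5 kHz with factor=8). Both ears are treated identically, so interaural
    level differences are preserved and interaural time differences are simply
    scaled by the same factor."""
    playback_rate = capture_rate // factor
    return np.asarray(left), np.asarray(right), playback_rate

# Example: a synthetic 40 kHz echo that arrives 0.5 ms earlier and louder at
# the left ear, as it would from an object on the user's left.
capture_rate = 192_000
t = np.arange(0, 0.01, 1 / capture_rate)
echo = np.sin(2 * np.pi * 40_000 * t)
delay = np.zeros(96)                      # 96 samples at 192 kHz = 0.5 ms
left = np.concatenate([echo, delay])
right = 0.7 * np.concatenate([delay, echo])
l, r, playback_rate = downshift_by_time_stretch(left, right, capture_rate)
# Writing l and r to a sound file at `playback_rate` (24 kHz) yields an audible
# 5 kHz echo that still sounds as if it comes from the user's left.
```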

The BU team has built a prototype capable of simple detection of objects and open spaces, and preliminary tests show that most people can echolocate a little using the device, and improve quickly with practice. They are now refining their prototype to function in more complex, real-world environments. Morland believes that given enough practice, people should be able to echolocate very well using the device - perhaps better than they could unassisted, since higher frequencies outside the normal range of human hearing are more useful for echolocation. (Movies of the device can be found at http://cns.bu.edu/~cjmorlan/research)

Paper 2pUWa6, "What it is like to be a bat: A sonar system for humans" will be presented at 5:20 p.m. on Tuesday, July 1 in room AMPHI BORDEAUX.

9) TAKING AURAL CUES FROM FLIPPER. Dolphins have a very keen sonar system that can make fine distinctions between complex targets such as buried mines. But what cues do these animals use for fine target discrimination? Whitlow Au of the University of Hawaii will present the latest results from a series of human listening experiments using echoes from real targets and a simulated broadband dolphin echo-ranger. The echoes are stretched in time to shift them into the lower frequencies of the human auditory range. He finds that human performance in object discrimination is usually as accurate as that of dolphins. Participants are then asked to identify which aural cues -- among them click pitch, echo duration, time-separation pitch, and timbre -- were most important in making those determinations. "Unfortunately, we can't ask a dolphin the same question," Au admits, and humans might not rely on the same cues as dolphins, given the differences in their auditory systems. However, "human listening studies can quickly identify salient combinations of echo features that permit object discrimination and also help refine dolphin experiments," he says.
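Time stretching works because slowing a recording by a factor of k divides every frequency in it by k: a 100 kHz component of a dolphin-like echo played back 50 times more slowly, for example, appears at 2 kHz, comfortably within human hearing. (The specific factor here is only an illustration, not the value used in Au's experiments.)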

Paper 2pUWa2, "Insights to dolphin sonar discrimination capabilities with human listening experiments" will be presented at 2:20 p.m. on Tuesday, July 1 in room AMPHI BORDEAUX.

10) ACOUSTIC TECHNIQUES FOR MONITORING BIRD MIGRATION. Most bird migration occurs under the cover of darkness, and at present there are almost no reliable, robust techniques for identifying which species are passing overhead as they migrate. By recording the distinctive flight calls of birds as they fly at night, researchers can develop migration maps for each species that depict the routes and timing of migration. Knowing these migration patterns is crucial for bird conservation, because any plan to conserve birds' stopover habitats requires detailed knowledge of the timing and location of their passage.

Andrew Farnsworth of Cornell University's Laboratory of Ornithology has been using acoustical techniques to study bird migration patterns since 1991. For the past three years, a grant from the U.S. Department of Defense Legacy Resource Management Program has enabled his team to make great strides in the field.

Farnsworth will discuss the methods and scope of his work, which include deploying a recording device that captures entire nights of migration for periods of up to 70 days, and specially designed software that automatically extracts the sounds of interest from these recordings in order to map bird calls as a function of time and location. His presentation will show that results from acoustic monitoring of vocal nocturnal migrants correspond with those from more traditional methods such as bird banding and ground observation, and that acoustic monitoring can be an effective and complementary way of tracking bird migration. His results will also show, however, that certain groups of elusive and rare species are much better represented in the acoustic data.

The presentation will also feature audio playback of bird recordings that highlight the ephemeral and distinctive qualities of these calls.

Paper 2aAB2, "The value of acoustic technologies for monitoring bird migration" will be presented at 8:20 a.m. on Tuesday, July 1, 2008 in Room 342B.

11) CONSERVATION AND THE TIGER'S ROAR. In Paris, Edward Walsh of Boys Town National Research Hospital, Douglas L. Armstrong of the Henry Doorly Zoo, and their colleagues will present the most recent findings of the Omaha Tiger Project, one of the first and most detailed analyses of the acoustic properties of tiger calls and of tiger hearing. The research has confirmed earlier findings that the dominant frequency of at least some tiger calls lies in the low-frequency range of 200 Hz to 300 Hz. A question of interest to biologists is whether tigers also produce calls in the infrasonic range -- frequencies below 20 Hz, which humans cannot hear. The research has shown that although tigers are capable of producing such calls, and may use them when communicating over long distances, infrasonic energy is not a common feature of most calls studied so far.

The greater goal of the Omaha Tiger Project is to contribute to ongoing efforts to conserve free-ranging tigers, all of which are seriously endangered. Because effective conservation strategies require accurate knowledge of the number of individuals living in a given territory, and because existing census numbers for tigers are notably inaccurate, the group plans to develop methods for identifying individual animals in the wild on the basis of the acoustic properties of their calls. The presentation will feature audio recordings of tiger calls.

Paper 4aABa7, "Acoustic communication in Panthera tigris: A study of tiger vocalization and auditory receptivity revisited" will be presented at 10:00 a.m. on Thursday, July 3 in Room 342B.

***********************
MORE INFORMATION ABOUT ACOUSTICS '08 PARIS. The science of acoustics is a cross-section of diverse disciplines, including architecture, speech science, oceanography, meteorology, psychology, noise control, physics, marine biology, medicine, and music. Acoustics '08 Paris is the world's largest meeting devoted to this range of topics. It incorporates the 155th Meeting of the Acoustical Society of America (ASA), the 5th Forum Acusticum of the European Acoustics Association (EAA), and the 9th Congrès Français d'Acoustique of the French Acoustical Society (SFA), integrating the 7th European Conference on Noise Control (Euronoise) and the 9th European Conference on Underwater Acoustics (ECUA) and marking the 60th anniversary of the SFA.

ABOUT THE ACOUSTICAL SOCIETY OF AMERICA. The Acoustical Society of America is the premier international scientific society devoted to the science and technology of sound. Its 7,500 members worldwide represent a broad spectrum of the study of acoustics. ASA publications include The Journal of the Acoustical Society of America (the world's leading journal on acoustics), the magazine Acoustics Today, and books and standards on acoustics. The Society also holds two major scientific meetings each year. For more information about the Society, visit its Web site, http://asa.aip.org.
