Revealing cicada song patterns through audiolisation


Our brains are incredibly powerful pattern-detecting devices. Scientists have long relied on our ability to detect visual patterns when trying to convey structure in their data: graphs, maps, images from under microscopes or telescopes. Sometimes, however, other sensory modalities are better suited to certain types of data.

A team of biologists at Uppsala University has turned to audiolisation to represent their data. The team recorded the mating song of Henicopsaltria eydouxii cicadas across a half-square-kilometre patch of Australian forest for the first 50 minutes of the day, following sunrise. They substituted the cicada sound in each of four areas of the forest with a different chord, allowing us to hear the interactions between these areas.
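The mapping the team describes – one chord per forest area, scaled by the sound level recorded there – can be sketched in a few lines. The chord choices and area names below are purely illustrative assumptions; the article does not specify which chords the researchers used.

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

# Hypothetical chord assignment: each of the four recording areas gets its
# own chord (frequencies in Hz). The actual chords are not given in the text.
AREA_CHORDS = {
    "north": [261.63, 329.63, 392.00],  # C major triad
    "south": [293.66, 369.99, 440.00],  # D major triad
    "east":  [329.63, 415.30, 493.88],  # E major triad
    "west":  [349.23, 440.00, 523.25],  # F major triad
}

def sonify(volumes, duration=1.0):
    """Mix one chord per area, each weighted by that area's volume (0-1).

    Returns a mono waveform normalised to the range [-1, 1].
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    signal = np.zeros_like(t)
    for area, vol in volumes.items():
        for freq in AREA_CHORDS[area]:
            signal += vol * np.sin(2 * np.pi * freq * t)
    peak = np.max(np.abs(signal))
    return signal / peak if peak > 0 else signal

# Example frame: the eastern area singing loudest just after sunrise.
frame = sonify({"north": 0.1, "south": 0.0, "east": 0.9, "west": 0.2})
```

Stepping through successive volume readings and concatenating the resulting frames would reproduce the kind of evolving soundscape described above, with louder areas dominating the mix.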

Visualisation isn’t completely shunned, however. In the video below, which is sped up to 15 times its original rate, the coloured circles mark the different recording stations (the four sonic areas are indicated by different colours) and their size represents the volume of the sound recorded there. Watch the video and you’ll appreciate how your ability to detect audio patterns helps make sense of the interactions that occur between these sonic animals over large distances. The singing begins in the east, where sunrise arrives earliest, and oscillating patterns emerge before the cicadas break into a full, sustained chorus. While the singing of each cicada is thought to be largely independent of the other males – their aim is to attract a mate, not to synchronise – detectable patterns emerge, much as when watching a flock of birds.

Audiolisation is a greatly underused form of data presentation – as communicative social animals, our reliance on hearing is just as great as our reliance on sight. Our other senses are generally considered secondary to these two, so it may be a while before scientists turn to olfactiolisation (smell), gustatiolisation (taste) or somatosensiolisation (touch) to present their data.


James Cooke
James Cooke is a Neuroscience D.Phil. student, interested in cortical function and circuit organisation. He completed a B.A. in Experimental Psychology and an M.Sc. in Neuroscience at the University of Oxford where he is also based for his D.Phil. His research involves using electrophysiological and optogenetic techniques in order to investigate the biophysical basis of cortical computations. When he's not in the lab, he's usually either playing Jazz piano or running along a trail somewhere. You can find more of his work at @NeuralNoises
