Extract from ABC News
Astronomy is such a visual profession, made relevant and inspiring by the luminescent images captured by powerful telescopes like Webb and Hubble, our eyes on the universe.
But not everyone sees it that way.
For Nicolas Bonne, there's more to the heavens than giant flaming globes or the twinkling smear of a distant spiralling galaxy.
"So much of astronomy relies really heavily on the visual, and a lot of the time doing that doesn't necessarily make sense because so much of the universe isn't actually visible to us.
"It's outside the visible spectrum," he tells ABC RN's Future Tense.
Dr Bonne is an astronomer at the University of Portsmouth. He also happens to be blind. While he can't see the movement of a planetary transit or the surge of a solar flare, that's not to say he can't "hear" them, and a major part of his work involves getting others to listen in too.
These days, Dr Bonne focuses on outreach and public engagement activities for the visually impaired.
And the technique he uses, called sonification, is not only creating greater opportunities for scientific inclusion, but helping astronomers to fine-tune their celestial observations.
The truth about our visual universe
Technically speaking, there is no noise in deep space. A lack of molecules means there is no medium through which sound waves can travel. Essentially, most of the universe is a giant, near-perfect vacuum.
But the hot, turbulent gas in stars produces internal and surface waves that can be picked up by telescopes. Space telescopes also measure wavelengths of light and send that data back to Earth. Sonification turns the astronomical data transmitted by telescopes into sound.
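To make that translation concrete, here is a minimal sketch, in Python with invented numbers, of the kind of mapping sonification relies on: each measurement in a data series is assigned a pitch, and the resulting tones are written out as an audio file. The data values, frequency range and file name are placeholders, not anything used by the researchers quoted here.

```python
import wave
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second

def sonify(values, low_hz=220.0, high_hz=880.0, note_seconds=0.25):
    """Map each data value to a pitch between low_hz and high_hz and render it as a short tone."""
    values = np.asarray(values, dtype=float)
    span = values.max() - values.min()
    norm = (values - values.min()) / span if span else np.zeros_like(values)
    freqs = low_hz + norm * (high_hz - low_hz)      # bigger value -> higher pitch
    t = np.linspace(0.0, note_seconds, int(SAMPLE_RATE * note_seconds), endpoint=False)
    return np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])

def write_wav(path, audio):
    """Save a mono array of floats in [-1, 1] as a 16-bit WAV file."""
    pcm = (audio * 32767).astype(np.int16)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(pcm.tobytes())

# Invented "brightness" measurements: the pitch rises and falls with the numbers.
brightness = [1.0, 1.2, 1.8, 2.5, 1.9, 1.3, 1.1]
write_wav("lightcurve.wav", sonify(brightness))
```

Real sonifications map many more properties than pitch, but the principle is the same: numbers in, sound out.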
The sounds generated can be discordant, like the jarring techno-robotic soundtracks associated with sci-fi films, or they can be melodic. It all depends on the purpose for which the sonifications are designed.
Sonified data used for outreach, such as exhibitions or education, tends to be easy on the ear, sometimes reminiscent of bells or a glockenspiel.
Generating sound from data in this way may seem unorthodox, but according to Italian astrophysicist Anita Zanella, it's just as valid as the traditional technique of generating visual images of outer space.
"We are used to seeing these beautiful images of the sky, but in reality what we receive from telescopes and instrumentation are numbers that we usually translate into images.
"What we're trying to do is investigate whether we can use sound to make sense, let's say, of these numbers and explore these datasets," she says.
Dr Zanella will be hosting an international astronomy festival in Italy in June called The Universe in All Senses. And data sonification will play a starring role.
"I plan to test whether the public is more engaged when sonification is used," she says.
"We explore reality in a multi-sensory way, so why do we have to limit our job and our research by using only one sense, which is sight?"
It's a sentiment shared by Dr Bonne. In his outreach work, he uses multi-sensory representations that combine visual imagery, Braille-like tactile models and sound.
He says sonification is particularly effective for interpreting motion. He used it in a planetarium show for the blind and visually impaired in 2020 as part of British Science Week. It was called "A Dark Tour of the Universe".
"We created this soundscape where you could hear the planets in the solar system moving around you. That's a really difficult thing to describe," he says.
"But being able to listen to those planets moving around you in space was actually a really powerful thing.
"It was just a really nice example of how sonification can make that type of thing more accessible."
Superior sound, but a lack of uniformity
Bruce Walker, from the Georgia Institute of Technology, believes sonification is more than just a novel alternative to image-based data representation.
He argues there are instances where it's preferable to take a sonification approach.
"The auditory system is a fantastic pattern recognition device. We accomplish speech by listening to changes in a person's voice over time. We can use the same capabilities to listen for changes in a dataset," Professor Walker says.
"So we know what makes for a good [auditory] display — pitch, timing, tempo and so on — and we can leverage that and use that to make our auditory graphs as compelling and as understandable as possible."
But for all its potential, no one has yet come up with a universal set of standards for auditory display.
"It becomes an effort for someone to say, 'I want my data to represent something like the intensity of a solar flare'. And if that's never been done before, that person is basically inventing or designing that auditory display," Professor Walker says.
"That is a challenge because then they have to tell us and teach us how to listen to what they've created. And if it's the first time that we've listened to it, it can be a challenge."
Back in Italy, Dr Zanella has begun working with psychologists from the University of Padova to try to make the sounds that are chosen for sonification less arbitrary.
"What we are learning ... is that there are some translations that are more effective than others."
For example, to distinguish dead galaxies from star-forming ones, timbre is a "highly effective parameter for classification" because "our ear is very good at distinguishing [it]", she says.
But to compare stars' brightness, pitch is more effective, "because our ears are very good at distinguishing very subtle differences in pitch".
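As a rough illustration of that design choice, the sketch below (Python again, with a made-up galaxy catalogue) encodes the categorical property, galaxy type, as timbre, and the continuous one, brightness, as pitch. None of the values or mappings come from Dr Zanella's study; they simply show how the two parameters can carry different kinds of information.

```python
import numpy as np

SAMPLE_RATE = 44_100

def tone(freq_hz, timbre="pure", seconds=0.4):
    """'pure' is a plain sine wave; 'rich' adds harmonics for a buzzier timbre."""
    t = np.linspace(0.0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    audio = np.sin(2 * np.pi * freq_hz * t)
    if timbre == "rich":
        audio = audio + 0.5 * np.sin(2 * np.pi * 2 * freq_hz * t) \
                      + 0.25 * np.sin(2 * np.pi * 3 * freq_hz * t)
    return audio / np.abs(audio).max()

# Hypothetical catalogue entries: (brightness in arbitrary units, galaxy type).
galaxies = [(1.0, "quiescent"), (3.5, "star-forming"), (2.2, "quiescent")]

clips = []
for brightness, kind in galaxies:
    pitch = 200.0 + 150.0 * brightness                       # brightness -> pitch
    clips.append(tone(pitch, "rich" if kind == "star-forming" else "pure"))
audio = np.concatenate(clips)
# write_wav("galaxies.wav", audio)   # reusing the helper from the earlier sketch
```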
The potential for new discoveries
So data sonification is being used to better map the universe and all its components. But stars also create sounds of their own, which allow scientists to probe their inner workings.
At the University of Birmingham, astrophysicist Bill Chaplin employs sonification to help make sense of the data he gathers from monitoring the internal mechanics of stars. It's a field of science known as asteroseismology.
"Stars act as natural generators of sound," Professor Chaplin says.
"In the outermost layers of the Sun, because we've got parcels of gas that are moving around and carrying their energy with them, things get very turbulent.
"You've got these parcels of gas that are buffeting into one another. And that makes changes in pressure in the gas ... That's just a sound wave."
Professor Chaplin describes his approach as the astronomical equivalent of an ultrasound scan.
"By measuring the pitch of the overtones at which a star resonates, we get information about the structure of the star, and we get information about how rapidly the insides of stars are spinning."
That's been crucial, he says, in gaining a more precise understanding of what stars look like inside, how they have evolved over time and how the Sun's changing outputs and emissions might affect the Earth.
"So, this has really opened up possibilities for us to be able to really validate and test our theories."
Professor Chaplin says that by adopting a sonification approach, the results of his research are far more accessible to a broader audience.
Dr Bonne hopes greater accessibility won't just deepen our understanding of the universe, but also inspire young people who are blind or visually impaired.
"What we're really trying to do with young people is show them that if this is something they're passionate about, if astronomy [or physics] is something that they want to do ... there's always going to be a way that they can access it that will work for them.
"They just might need to think outside the box and do things a little bit differently."