In astronomy, we rely heavily on visual information: images from telescopes, graphs of light curves, plotted orbits, spectra, and diagrams. Most of what we understand about astronomy today comes through these two-dimensional formats; it is almost always something we look at. But vision is only one way of interpreting information. From quantum physics to the largest structures in the universe, the underlying reality is governed by motion, frequency, and structure.
The mathematics that describes orbits and light waves also describes oscillation and rhythm. This project investigates what develops when these quantitative sequences, foundational to astronomy, are translated into three-dimensional formats and into sound, creating an additional way to engage with astronomical data. This process is called sonification: the conversion of physical, numerical, or visual data into sound. It works because sound is interpreted through frequency, amplitude, and timing, properties that are already abundant throughout astronomy. Orbital periods can be represented as rhythm, wavelengths as pitch, velocities as modulation, and distances as spatial location. The spectral data used to color astronomical images represent physical properties and elements, and those same values can instead be assigned to sound, allowing an auditory interpretation of the same information.
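As a minimal sketch of the wavelength-to-pitch idea described above (the function name, the chosen audio band, and the linear mapping are illustrative assumptions, not the project's actual tool chain), one could rescale a visible-light wavelength into an audible frequency, reversed so that bluer light sounds higher:

```python
def wavelength_to_pitch(wavelength_nm,
                        band=(380.0, 750.0),     # visible range in nm (assumed bounds)
                        audio=(220.0, 880.0)):   # audible range in Hz, roughly A3..A5
    """Map a wavelength to an audible pitch, reversed so blue is higher."""
    lo, hi = band
    a_lo, a_hi = audio
    # clamp the input into the visible band
    w = min(max(wavelength_nm, lo), hi)
    # t = 0 at the red end, t = 1 at the blue end
    t = (hi - w) / (hi - lo)
    return a_lo + t * (a_hi - a_lo)

print(wavelength_to_pitch(750.0))  # red end  -> 220.0 Hz
print(wavelength_to_pitch(380.0))  # blue end -> 880.0 Hz
```

The same pattern extends to the other mappings the script lists: an orbital period could set a rhythmic interval, and a radial velocity could drive a modulation depth.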
To explore these ideas, I use mathematical modeling tools such as the Desmos 3D calculator, along with publicly available data from sources like NASA. The colors assigned to represent wavelengths are reassigned to sound parameters, allowing 3-D structures constructed from equations to be experienced audibly. By routing these equations into sound production software, changes in position, velocity, and scale are mapped directly onto sound, creating a multi-sensory representation of astronomical systems. In this way, the motion of astronomical objects becomes something we can perceive through sound.
Doppler shifts are used to determine whether an object is moving toward or away from Earth. When a star or galaxy moves toward us, its light shifts to a shorter wavelength, known as a blueshift. When it moves away, the wavelength shifts toward the red. This is normally observed as small shifts in spectral lines on a graph. The same effect is familiar in sound: a passing siren has a higher pitch as it approaches and a lower pitch as it moves away. That change in pitch is the Doppler effect. Applied to astronomy, this allows redshift and blueshift to be represented as pitch changes. Techniques like the Doppler wobble method, in which small, repeating shifts in a star's motion reveal an orbiting planet, can be translated into subtle oscillations in sound, with the rate reflecting the orbital period and the magnitude related to the planet's mass.
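A hedged sketch of that wobble-to-pitch translation (the function names, the sine-wave velocity model, and the exaggeration factor are assumptions for illustration; real stellar velocities of meters per second against the speed of light would produce inaudibly small shifts without scaling):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_pitch(f0_hz, radial_velocity_ms):
    """Shift a base pitch by the non-relativistic Doppler factor.
    Positive velocity means receding, so redshift lowers the pitch."""
    return f0_hz * (1.0 - radial_velocity_ms / C)

def wobble_pitch(f0_hz, t_s, period_s, k_ms, exaggerate=1.0e6):
    """Pitch of a star's 'wobble' tone at time t.
    period_s: orbital period, heard as the oscillation rate.
    k_ms: velocity semi-amplitude; a more massive planet gives a larger k,
    heard as a wider pitch swing.
    exaggerate: scale factor to make tiny real shifts audible."""
    v = k_ms * math.sin(2.0 * math.pi * t_s / period_s)
    return doppler_pitch(f0_hz, v * exaggerate)
```

Sampling `wobble_pitch` over time yields a tone that rises and falls once per orbital period, which is the audible analogue of the repeating spectral-line shift astronomers measure.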
Activating multiple senses in the interpretation of information can deepen our insight, and people often recall sounds more readily than graphs and numbers. Hearing astronomical structures can help us develop a clearer understanding of how these systems behave. Dr. Wanda Diaz-Merced, a blind astrophysicist who has worked with NASA, uses sound in her research. Her work is a powerful example of how sonification can serve both as an accessibility tool and as a practical method for observing patterns in complex data. It can reveal details that may be difficult to detect visually and expand access to astronomical information. On my website, www.peppersghost.io, I go further into this approach, explaining how spectral data can be used to interpret light from faraway stars and galaxies through sound. These equations describe the universe in terms of structured behavior, and that behavior can be interpreted audibly.
Music written and produced by Kenny Mihelich. Western Slope Skies is produced by the Colorado Mesa University Astronomy Club, the Western Slope Dark Sky Coalition, and KVNF Community Radio. This feature was written, voiced, and recorded by Matthew Harrington.