What is DRSRL? Knowledge through music

Getting Science and Art Through Music and Movement

Design Rhythmics Sonification Research Lab works with scientists and museums to turn information and data into music. Why music? Not only do we love music, but it also offers a very rich palette of qualities onto which data can be mapped and thereby perceived by the brain through the auditory channel. Music stimulates cognition and memory, and offers those who are blind or visually impaired the opportunity to understand information and gain knowledge in new ways. By working with the scientists who are shedding new light on our world, and the museums and centers that are helping to disseminate it, we seek to create innovative, pleasurable and accessible audio presentations that let the public "get it" by hearing. Whether online or live, our work makes science accessible to more people through listening to the music of the spheres.

Marty Quinn teaches the STEREO spacecraft to play music with its data and images.

Created by computer scientist and composer/percussionist Marty Quinn, these sonifications demonstrate how data of all kinds can be translated into music for the purpose of expanding and enhancing our perception of the Earth and our environment. Dr. Rita Colwell, former director of the National Science Foundation, cited this work in one of her speeches as an innovative merging of art and science.

In 2009, we toured "Walk on the Sun" (see pictures above) as part of a new NASA-sponsored program under the title "Light Runners". We showed the exhibit for a week at a time at science centers in 12 cities and ran special programs for those who are blind or visually impaired, either at the science center or at centers for the blind. We are pleased to announce that the exhibit may now be purchased from DRSRL. See the exhibits link or click here for details on Walk on the Sun.

The Light Runners program was based on a 2-year NASA IDEAS grant to create a museum exhibit built on sonified, musically encoded representations of data and imagery from the STEREO space mission. New image sonification and visualization techniques for solar images and data have also led directly to new techniques for presenting works of art, increasing the accessibility of art by letting the content of a painting generate music from the color data in the pixels of a photograph of the work. We showcased and discussed these approaches at the Astronomical Society of the Pacific conference poster session on Sept 6, 2007 in Chicago, and at the Soundscapes panel of the Art Education for the Blind 2007 Conference in NYC on Sept 29, 2007, sponsored by Art Beyond Sight and the Met. The grant included collaboration with the Christa McAuliffe Planetarium in Concord, NH and UC Berkeley's Space Sciences Laboratory. See ArtMusic to see and hear examples of art as music.

Water Ice on Mars

Listening for water across the surface of Mars: a collaboration with the University of Arizona's Lunar and Planetary Lab

Things to listen for

  • Where is the water more prevalent, the poles or the equator?
  • Which latitude exhibited the highest water content?
  • Why are the drums louder in the equatorial regions?
  • Why do the patterns of music change with the months at some latitudes?
  • What are the perceptual and cognitive effects of hearing all variables at once versus one at a time?

Exoplanets

This section explains our sonification design for the European Southern Observatory's exoplanet database, featuring exoplanet and corresponding star characteristics. It was created in 2008, after 170 exoplanets had been discovered around relatively nearby stars. The music can be accessed here (32 MB).

Exoplanet Sonification Audio Key

Exoplanet sonification graphic described in the text

The Mapping from Data to Music

This sonification uses what we term ‘measured’ sonification. Each row of data contains the following fields:

Planet_Name, Pl._Masse, Pl._Period, Pl._Semi-axis, Pl._Ecc., Pl._Incl., Ang._Dist., St._Dist, St._Spec._Type, St._Mass, St._[Fe__H], St._Right_Asc., St._Decli.

Each row is presented within the duration of one measure, with variables placed at strategic points within the rhythm of that measure. This gives the mind a chance to hear individual variables while still retaining the sense that they belong together. One row of data, a single system, is presented in one measure of music; the next measure presents the next row, a different system, and so on until the end of the file. (A minimal sketch of this scheme appears after the mapping list below.)

Each planet/star system's information, one row of data, is presented in 4400 ms, using a measure of 10/8 time broken up as 4 + 4 + 2. The data looks like this:

14_Her_b, 4.74, 1796.4, 2.8, 0.338, 0.0, 0.154696, 18.1, K0_V, 1, 0.35, 16_10_23, +43_49_18
16_Cyg_B_b, 1.69, 798.938, 1.67, 0.67, 0.0, 0.078037, 21.4, G2.5_V, 1.01, 0.09, 19_41_51, +50_31_03

2.1. The first four beats present the star data.

2.2. The second four beats present the exoplanet data.

2.3. The two beats at the end act as a pause or separator between planet/star (row) presentations.

  1. The higher pitched the strings, the closer the star.
  2. The lower the drum tones, the bigger the star and planet. (using a physical model metaphor)
  3. The louder the cymbal, the more iron in the star. (intuitive, cymbals are metal)
  4. The higher pitched the bell tones, the more iron in the star. (a redundant presentation of data, sometimes a very good thing, gives the brain another opportunity to hear the data in a different way)
  5. The faster the cymbal rhythm, the faster the orbital period of the planet. (intuitive, tempo = period)
  6. The louder the snare drum roll, the more eccentric the orbit.
  7. The chord of music on the first beat announces the name of the planet, created by mapping letters into notes of a scale and mapping each letter position to an instrument that plays its letter pitch (if it has one). This process produces a unique audio signature per word. (maybe the people on the exoplanet, or in the future, can read thru music!:>) )
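To make the 'measured' approach concrete, here is a minimal sketch in Python of how one row could be scheduled into the 4400 ms, 10/8 measure described above, with star data on the first four beats, planet data on the next four, and two beats of rest. The field names, beat placements, scaling ranges, and instrument labels are illustrative assumptions, not DRSRL's actual implementation.

```python
# Minimal sketch: schedule one exoplanet-table row into a 4400 ms measure of 10/8.
# Field names, beat placements, scaling ranges, and instrument labels are assumptions.

MEASURE_MS = 4400
BEATS = 10                      # 10/8, grouped 4 (star) + 4 (planet) + 2 (rest)
BEAT_MS = MEASURE_MS / BEATS    # 440 ms per beat

def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] into [out_lo, out_hi], clamped."""
    value = max(lo, min(hi, value))
    frac = (value - lo) / (hi - lo)
    return out_lo + frac * (out_hi - out_lo)

def sonify_row(row):
    """row: dict with keys like 'st_mass', 'st_fe_h', 'pl_masse', 'pl_period', 'pl_ecc'.
    Returns a list of (onset_ms, voice, pitch_or_level) events for one measure."""
    events = []
    # Beat 1: chord announcing the planet name (letters -> pitches, one voice per letter).
    for i, ch in enumerate(row["planet_name"].lower()):
        if ch.isalpha():
            events.append((0, f"name_voice_{i}", 48 + (ord(ch) - ord("a")) % 24))
    # Star beats: star mass on bass -- lower pitch = heavier star.
    for beat in (0, 2):
        events.append((beat * BEAT_MS, "bass", round(scale(row["st_mass"], 0.1, 3.0, 60, 36))))
    # Planet beats: planet mass on piano -- lower pitch = heavier planet.
    for beat in (4, 6):
        events.append((beat * BEAT_MS, "piano", round(scale(row["pl_masse"], 0.01, 20.0, 84, 48))))
    # Iron content [Fe/H]: bell pitch and cymbal loudness, a redundant mapping.
    events.append((1 * BEAT_MS, "bell", round(scale(row["st_fe_h"], -1.0, 0.6, 60, 96))))
    events.append((0, "cymbal_volume", round(scale(row["st_fe_h"], -1.0, 0.6, 40, 120))))
    # Orbital period: faster cymbal rhythm (more hits per measure) for faster orbits.
    events.append((0, "cymbal_hits_per_measure", round(scale(row["pl_period"], 1.0, 4000.0, 20, 2))))
    # Eccentricity: louder snare roll for more eccentric orbits.
    events.append((7 * BEAT_MS, "snare_roll_volume", round(scale(row["pl_ecc"], 0.0, 1.0, 30, 120))))
    return sorted(events)

# Example (values from the 14_Her_b row above):
# sonify_row({"planet_name": "14_Her_b", "st_mass": 1.0, "st_fe_h": 0.35,
#             "pl_masse": 4.74, "pl_period": 1796.4, "pl_ecc": 0.338})
```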

Exoplanet Sonification Design Layout. Beat meanings by instrument (in the original layout, each cell is a quarter note of the measure):

  • Chord (first beat): p.name
  • Timpani/bass (lower = heavier): s.mass on bass during the star beats; p.mass on piano during the planet beats
  • Bell (higher = more): s.fe during the star beats
  • Marimba/piano (higher = faster): p.period during the planet beats
  • Cymbal (louder = more): s.fe throughout the measure
  • Cymbal rhythm (faster = faster orbit): p.period throughout the measure
  • Snare roll (louder = more): p.eccentricity

Originally developed for the Worlds Beyond poster at the GSFC Visitor's Center Exoplanet Garden opening.

Things to listen for

  • TBD

Deep Space Image Sonification from the Bareket Observatory Webcast 2009

Each image was sonified as 9,000 notes in 10 seconds using a raster scan from top to bottom: 90 pixels across each of 100 lines down the image were turned into notes based on brightness. Color determined which of 10 instruments played each note. Because most of the images are black and white, you will hear mostly notes played on a piano.
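As one way to realize the raster-scan approach just described, the sketch below (Python, using the Pillow and mido libraries) resizes an image to 90 by 100 pixels and writes one MIDI note per pixel, with brightness choosing the pitch and a coarse color test choosing the instrument. The pitch range, General MIDI programs, and color thresholds are assumptions for illustration, not the exact DRSRL mapping.

```python
# Sketch of a raster-scan image sonification: 90 x 100 pixels -> 9000 notes.
# Pitch range, instrument programs, and the color test are illustrative assumptions.
from PIL import Image
import mido

def sonify_image(path, out_path="image_music.mid", total_seconds=10.0):
    img = Image.open(path).convert("RGB").resize((90, 100))
    mid = mido.MidiFile(ticks_per_beat=480)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    # With mido's default tempo of 500000 us per beat, pick ticks per note so
    # that 9000 notes fill roughly `total_seconds` seconds.
    seconds_per_tick = 500000 / 1e6 / 480
    ticks_per_note = max(1, round(total_seconds / (90 * 100) / seconds_per_tick))

    for y in range(100):                 # top-to-bottom raster scan
        for x in range(90):              # left-to-right within each line
            r, g, b = img.getpixel((x, y))
            brightness = (r + g + b) / 3
            pitch = 36 + round(brightness / 255 * 60)     # brighter -> higher pitch
            if max(r, g, b) - min(r, g, b) < 16:
                program = 0                               # piano for near-greyscale pixels
            else:
                # Pick a program by dominant color channel: violin, guitar, or flute.
                program = [40, 24, 73][(r, g, b).index(max(r, g, b))]
            track.append(mido.Message("program_change", program=program, time=0))
            track.append(mido.Message("note_on", note=pitch, velocity=80, time=0))
            track.append(mido.Message("note_off", note=pitch, velocity=0, time=ticks_per_note))
    mid.save(out_path)

# Example: sonify_image("m57.jpg") writes image_music.mid with about 9000 notes.
```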

 

The following images were sonified; on the original page, use the control below each image to play its ImageMusic.

  • M3
  • m17-RGB
  • m1-S001-R001-C003-Luminance
  • Blinking Planetary
  • jupiter-S001-R001-C001-Ha_6nm
  • m2-S001-R001-C001-Luminance
  • M15
  • m16_2min-001
  • m57
  • m57_5min-003
  • m71
  • m82
  • m 20-S001-R001-C002-Luminance
  • m 27-S001-R001-C001-Luminance
  • m 32-S001-R001-C001-Luminance
  • m 39-S001-R001-C001-Luminance
  • m 51-001
  • m 52-S001-R001-C001-Luminance
  • M 57-RGB
  • m 71-S001-R001-C004-Luminance
  • m 92-S001-R001-C001-Luminance
  • neptune-S001-R001-C001-Luminance
  • NGC 224-S00X-R001-C001-Luminance

IBEX

Interstellar Boundary Explorer

Science Overview

What is the "boundary of our Solar System"?



Some things, like a table or a soccer field, have clear edges and boundaries. Other things, like cities and towns, have boundaries that aren't as easy to see. It is hard to say where they end and something else begins if you are looking at them from a distance.

Our Solar System has a boundary, but where is it? You could say that the Solar System extends as far as the influence of the Sun. Could the reach of the Sun's light or the extent of the Sun's gravity help us decide how far the Solar System extends? The light from the Sun gets fainter as you move farther away, but there is no specific place where the light stops or where it suddenly weakens. Also, the influence of the Sun's gravity extends without limit, although it is weaker the farther from the Sun you travel. There is no boundary at which either light or gravity stops, so neither would seem to help us define our Solar System's "edge."

The heliosphere helps define one type of boundary of our Solar System. The solar wind from our Sun blows outward against the material between the stars, called the interstellar medium, and clears out a bubble-like region. This bubble that surrounds the Sun and the Solar System is called the heliosphere. It is a definable, measurable region in space.

IBEX is the first spacecraft designed to collect data across the entire sky about the heliosphere and its boundary. Scientists have used this data to make the first maps of our heliosphere's boundary. The boundary does not emit light that we can detect, which means it would be impossible to image using conventional telescopes. Instead of collecting light, as other telescopes do, IBEX collects particles coming from the boundary so that we can learn about the processes occurring there. The boundary of the Solar System protects us from harmful cosmic rays; without it, four times more cosmic rays would enter our Solar System and potentially damage our ozone layer and DNA. It is important to study this region to know how it works.

Artist's rendition of the heliosphere. It looks like an eye with the Sun in the middle and a curved boundary on the left pushing in on the iris.

[credit: http://ibex.swri.edu/mission/]

Sonification Overview

The following set of rectangular images depicts the sky-map energy at 0.87 keV collected by the IBEX-Lo instrument over a period of three years. Each image presents the sky as a series of half-circle strips, one longitude-by-latitude position per column of the image. Thirty latitudes for each of 60 longitude values are presented as pitches played on guitar, plucked strings, or piano, depending on whether the color is a tone of blue, purple, or black. The image is generated using IDL's color table 1, which is based on the color blue. The sonification algorithm, designed by Marty Quinn of DRSRL, translates color into one of nine instruments and brightness into one of 43 pitches, using a seven-note Spanish Gypsy scale with a flat 2nd, 6th, and 7th spanning six octaves.
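The mapping just described can be sketched in a few lines of Python: build the 43-pitch scale spanning six octaves of a Spanish Gypsy pattern (flat 2nd, 6th, and 7th), map brightness to a scale degree, and map the pixel's color class to an instrument. The base MIDI note, the color thresholds, and the reduced three-instrument choice are illustrative assumptions, not the nine-instrument DRSRL implementation.

```python
# Sketch: build a 43-pitch scale spanning 6 octaves of a Spanish Gypsy pattern
# (flat 2nd, 6th and 7th), map brightness to pitch and color class to instrument.
# The base note, instrument names, and color thresholds are illustrative assumptions.

SPANISH_GYPSY = [0, 1, 4, 5, 7, 8, 10]           # semitone offsets: 1, b2, 3, 4, 5, b6, b7
BASE_NOTE = 24                                   # an assumed low starting pitch

# 7 notes per octave over 6 octaves, plus the root at the top = 43 pitches.
SCALE = [BASE_NOTE + 12 * octave + step
         for octave in range(6) for step in SPANISH_GYPSY] + [BASE_NOTE + 12 * 6]
assert len(SCALE) == 43

def pitch_for_brightness(brightness):
    """Map brightness in [0, 1] to one of the 43 scale pitches (brighter = higher)."""
    index = min(42, int(brightness * 43))
    return SCALE[index]

def instrument_for_color(r, g, b):
    """Very coarse stand-in for the color-to-instrument mapping: near-black pixels
    get piano, purple-ish tones plucked strings, blue-ish tones guitar."""
    if r + g + b < 60:
        return "piano"
    if r > 0.6 * b:
        return "plucked strings"
    return "guitar"

# Example: a mid-brightness blue pixel.
print(pitch_for_brightness(0.5), instrument_for_color(40, 60, 200))
```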

Each ImageMusic example presents the data starting at the 'top' of each column, +90 degrees latitude, and moving slowly down the column to -90 degrees. The presentation moves from left to right across the sky, from 0 degrees longitude to 360 degrees, or, put another way, from -180 degrees to 0 at the midway point, then 0 to +180 at the right-hand edge.

The first example presents the data slowly as individual pixels; the second plays each column as a chord of music and moves quickly across all 60 longitudes in a matter of seconds. Each example provides a different way to perceive the data. Obviously, the slower the presentation, the easier it is to hear each individual value that makes up the whole sky, but the harder it is to perceive the whole image. The faster presentation, on the other hand, allows better perception of the whole. At the cost of not knowing exactly which latitude strip is brighter than another, one can compare longitudes more easily and hear the differences across the sky, to the point of being able to count the bright areas and determine whether the sky is uniformly energetic or not, all through the music.
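Continuing the sketch above (and reusing its assumed pitch_for_brightness helper), the two presentation styles differ only in how the 60-by-30 grid of sky values is traversed: one note per pixel for the slow version, one chord per longitude column for the fast version. The timings are placeholders.

```python
# Two playback styles over a 60-longitude x 30-latitude sky map `sky[lon][lat]`,
# reusing the (assumed) pitch_for_brightness helper from the previous sketch.

def slow_playback(sky, note_ms=300):
    """One note per 6 x 6 degree pixel: top (+90) to bottom (-90) within each
    longitude column, columns moving left to right across the sky."""
    events, t = [], 0
    for lon in range(60):
        for lat in range(30):
            events.append((t, pitch_for_brightness(sky[lon][lat])))
            t += note_ms
    return events

def fast_playback(sky, chord_ms=80):
    """One chord per longitude column: all 30 latitudes sound together, so the
    whole sky is heard in a few seconds at the cost of per-latitude detail."""
    events = []
    for lon in range(60):
        chord = [pitch_for_brightness(sky[lon][lat]) for lat in range(30)]
        events.append((lon * chord_ms, chord))
    return events
```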

NOTE: These are YouTube video links. Click on the image to play, and press the space bar to pause or replay.

The first video presents the audio legend. In it we hear the range of brightness and how it maps to pitch. A sonic needle moves from the top of the image, where it is bright, to the bottom, where it is dark, presenting 30 latitude positions down the image as a musical sequence, three times over. In the visual display, yellow dots appear as each pixel is played.

Example 1. Slow playback, one 6-degree-by-6-degree square of the sky at a time.

 

Example 2. Fast playback of longitude chords, each made up of 30 latitude divisions of 6-degree-by-6-degree squares of the sky, starting from 0 degrees longitude and moving through 360 degrees of longitude in 6-degree widths.

Click here to go to the video to hear and view the sonification of the image.

Can you hear the data?

    Each sonification has a short quiz at the end. Please try to answer the few questions as best you can so that we can continue to improve our image, data and map sonifications.

Ice Core Data: The Climate Symphony

Click on the image to play the Climate Symphony. To play the music alone, click here.

Galaxy Spectra

Five galaxy spectra derived from data from the Arecibo Radio Telescope.

Musically Encoded Galaxy Spectra

In this sonification of galaxy spectra derived from data from the Arecibo Radio Telescope, the guitar expresses the values from each dataset considered in isolation from the other datasets. In the compare examples, the piano expresses how the dataset sounds when considered in relation to the high and low values of all five files. The data is expressed by mapping low values to low notes and high values to higher notes within an expressive range of a 97-note chromatic scale. The fractional part left over is then used to control a range of pan locations for the resulting sound: the piano pans within a range of 0 to 29, on the left, and the guitar within a range of 98 to 127, on the right. The fractional part of the pan mapping is further expressed as a range of volumes for each sound, from 89 to 110. So the number of theoretical perceptual resolution bins is 97 * 30 * 30, or 87,300. ©2006 Design Rhythmics Sonification Research Lab
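A rough Python sketch of this nested mapping is below: each value is scaled onto a 97-note range, the leftover fraction picks a pan position within the instrument's band (piano 0-29 on the left, guitar 98-127 on the right), and the remaining fraction sets a volume between 89 and 110. The input normalisation and base pitch are illustrative assumptions, not the lab's code. For the compare examples, one would pass lo and hi computed across all five files and instrument="piano".

```python
# Sketch of the nested value -> (pitch, pan, volume) mapping described above.
# Input normalisation and the base pitch are illustrative assumptions.

def map_value(value, lo, hi, instrument="guitar"):
    """Map a spectrum value in [lo, hi] to a MIDI-style (pitch, pan, volume) triple."""
    frac = (value - lo) / (hi - lo)              # 0..1 across the data range

    # Pitch: a 97-note chromatic range, low values -> low notes.
    scaled = frac * 96
    pitch = 16 + int(scaled)                     # an assumed 97-note window, 16..112
    remainder = scaled - int(scaled)             # fractional part left over

    # Pan: piano sits on the left (0..29), guitar on the right (98..127).
    pan_lo = 98 if instrument == "guitar" else 0
    pan_scaled = remainder * 30
    pan = pan_lo + min(29, int(pan_scaled))
    pan_remainder = pan_scaled - int(pan_scaled)

    # Volume: the remaining fraction spans 89..110.
    volume = 89 + round(pan_remainder * 21)
    return pitch, pan, volume

# Example: a value three quarters of the way up the dataset's range, heard on guitar.
print(map_value(0.75, 0.0, 1.0, "guitar"))
```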

Each spectrum image has two tracks: the spectrum heard on its own, and a "Compare" version heard in relation to the high and low values of all five files. Track durations are listed in parentheses.

  • 5957 (:27); 5957-Compare (:22)
  • 4673 (:33); 4673-Compare (:33)
  • 5253 (:17); 5253-Compare (:17)
  • 5291 (:26); 5291-Compare (:26). Note: image should be reversed to reflect data order.
  • m83 (:15); m83-Compare (:15). Note: image should be reversed to reflect data order.

STEREO and SDO

Image Sonification of NASA's STEREO and SDO Mission Data

Please click here. You will be directed to www.drsrl.com. Thank you.

Rock Around the Bow Shock

Data from the four Cluster spacecraft as they cross the Earth's magnetic bow shock.

Hear the changes amongst the four spacecraft during a bow shock crossing period.