What does sound look like? California Institute of the Arts (CalArts) is partnering with New York University (NYU) on a large-scale project that allows us to see the acoustic energy of cities on digital maps. The project’s first iteration, Citygram One: Visualizing Urban Acoustic Ecology, has received a $59,000 Google Research Award.
Ajay Kapur, Associate Dean for Research and Development in Digital Arts at CalArts, is working with Tae Hong Park, Associate Professor of Music Technology and Composition at the NYU Steinhardt School of Culture, Education, and Human Development. Along with a team of graduate students from both schools, Kapur and Park are measuring the sounds of cities and urban environments and charting the sonic data on maps. The project will enable a richer understanding and representation of cities, which are defined by human activities, visible and invisible energies, buildings, machines, and other biotic and abiotic entities.
Citygram aims to provide publicly accessible mapping interfaces for acoustic data gathered in the NYU and CalArts areas via non-intrusive, privacy-protecting technologies.
Invisible energies, such as sound, are typically in a state of flux and could yield insights into the living, breathing dimensions of our environment that remain underexplored in contemporary mapping research. Subsequent iterations of Citygram will focus on additional non-aural energy formats, including humidity, light, wind speed, and color. The implications of Citygram are exciting, as the technology “could contribute by providing quantitative data for urban policy-making, monitoring noise pollution, light pollution, and also help asthma patients via real-time measurement and mapping of humidity,” says Park.
“Citygram adds new layers to maps of physical spaces by automatically collecting, processing, visualizing, mapping, and analyzing acoustic energies,” noted CalArts’ Kapur. “The project advances contemporary interactive mapping and offers new tools for evaluating how we use our environments.”
“The collaboration is part of a research effort that will create the next generation of interactive mapping,” said Park of NYU. “Our aim is to render scale-accurate, non-intrusive, and acoustically driven map layers that reveal information encoded in spatio-acoustic environments in real time.”
Using remote sensing devices (RSDs), the researchers will measure acoustic energy to decode, map, and visualize information such as noise pollution, traffic patterns, and spatial mood. The RSDs run on small, poly-sensory, Android-based hardware that can be affixed in designated measurement areas. Data streamed and visualized by Citygram maps can add valuable layers of information to emergency response efforts, municipal records, crime statistics, and census data, and can also provide artists and musicians with raw data for creative works.
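To make the idea concrete, a minimal sketch of what one such sensor reading might look like follows. The article does not describe Citygram’s actual software, so the function name, reference amplitude, and coordinates below are all hypothetical; the sketch simply converts raw audio samples into an approximate decibel level and pairs it with a location, the kind of data point an acoustic map layer could be built from.

```python
import math

def sound_level_db(samples, ref=1.0):
    """Approximate sound level in dB from raw audio samples.

    `ref` is a hypothetical reference amplitude; a real sensor would be
    calibrated against a known sound-pressure source.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Guard against log of zero for silent frames.
    return 20 * math.log10(max(rms, 1e-12) / ref)

# A hypothetical reading pairing the level with location and time:
reading = {
    "lat": 40.7295,    # illustrative NYU-area coordinates
    "lon": -73.9965,
    "db": sound_level_db([0.01, -0.02, 0.015, -0.01]),
}
```

Streaming many such readings from fixed devices and interpolating between them is one plausible way a map layer like Citygram’s could be rendered in real time.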
The applications of Citygram are wide in scope and could be “as simple as finding and listening to the closest quiet park for our children at this very moment; locating restaurants on the ‘moodiest’ streets; finding a bar currently playing your favorite music style; and creating real-time ‘sound maps’ for the visually impaired,” notes Park.