Above, watch a sample of Julie Freeman’s new data-driven artwork, We Need Us.
Artist Julie Freeman creates kinetic sculptures, objects, images, compositions and animations from nature-generated data – such as the motion of fish swimming. Today, Freeman announced a new piece of work from the TED Fellows stage at TEDGlobal 2014. We Need Us — an online, data-driven artwork that explores the nature of metadata — has just gone live on The Space, a new website for digital art funded by the BBC and Arts Council England. Here, she tells us about what we can learn from experiencing data, rather than simply drawing information from it.
You are known for making art using data from natural sources. Where is the data for We Need Us drawn from, and how is it different?
This metadata comes from a citizen science website called the Zooniverse, which allows people to classify large data sets from all over the world. Volunteers from all walks of life come together to do this in a very altruistic manner, helping scientists complete extremely labor-intensive tasks and freeing them up for other research and analysis.
Essentially, I use data as an art material. I take the metadata describing Zooniverse user activity and how users are interacting with the site. I manipulate and process that data, and it then controls the animations and sound compositions, which are made of field recordings.
What did you record?
All sorts of stuff — underwater sounds, recordings of the environment, of birds, insects, buildings, machines. Anything.
How is this different from straight-ahead data visualization?
Traditional data visualization is about how we understand data and the information it contains. What I’m doing is a lateral way of looking at data. How can we experience it? How can we feel it, and what does it mean to think about the life of data — how it lives, and what the dynamics within it are?
What is the structure of this piece?
The work is made up of 10 different scenes, if you like, and each scene relates to a project on the Zooniverse website. There’s one called Snapshot Serengeti, for example, where volunteers look at photographs taken by motion-triggered cameras in the Serengeti to help classify the animals appearing in each photograph — say a bison or antelope. But I’m not so much interested in the animals as in the data from the people doing the classifying. What do they click on? When do they click on it? Where are they from? Using that data, I animate an abstract illustration drawn from references to the Serengeti. The sounds are things like flies buzzing, grasses in the wind, bison making weird noises.
What was the impetus for collaborating with Robert Simpson and Zooniverse?
Robert and I met at TED2014 in Vancouver, and when he told me about Zooniverse, I thought, “I’ve got a great idea!” At the time, The Space — where We Need Us is hosted and which is a new online platform for data-based artwork — had approached me as a curator. I said, “Actually, I’m an artist that works with digital technologies and would like to make a work with Zooniverse data.” They loved it, so they, along with the Open Data Institute, commissioned the piece.
And as a scientist, what does Robert think about what you’re doing?
He thinks it’s brilliant. Interestingly, a group of scientists is working with exactly the same data that powers my artwork, but they are looking at how communities come together to collaborate and solve problems. I’m using the data for art; they’re using it for proper social science research. It’s nice to know that this pot of data is being used by different people for different outcomes. Basically, both projects are about the humanity in technology, exposing the altruism in how people use the web, and what we can learn from that.
To view We Need Us, which goes live on Monday at 2pm UK time, visit www.thespace.org/weneedus. And to learn more about Robert Simpson and the Zooniverse, read “You found a planet!: Robert Simpson crowdsources scientific research and accelerates discovery at Zooniverse.”