In Culture

Tech for a better world - The sonic turn

Episode Summary

Meet artists who are transforming the way audiences experience and interact with music. Hosts Becca DeGregorio and Todd Whitney speak to Julianna Barwick, Peter Chilvers, and Matthew Dear.

Episode Notes

Musicians are tapping into technology like never before, transforming the way music is made, experienced, and consumed—and along the way, they’re changing the relationship between artist and listener. From Brian Eno’s technologist Peter Chilvers to composer Julianna Barwick to electronic musician Matthew Dear, we’ll hear from the forward-thinking artists who are pushing the limits of their craft.

For more information about Microsoft's In Culture podcast, please visit: microsoft.com/inculture/podcast

Episode Transcription

Julianna Barwick: Do we want to listen to this last piece? Cool.

Becca DeGregorio: Sure.

Julianna Barwick: This one is “Night.”

Becca DeGregorio: From Microsoft, this is In Culture. I'm Becca DeGregorio.

Todd Whitney: And I'm Todd Whitney.

Becca DeGregorio: This already sounds more nighttime-y to me than the other ones.

Julianna Barwick: Yeah. It's kind of like floating.

Becca DeGregorio: Yeah. It's starry, kind of. Space-like.

Becca DeGregorio: Todd, what if I told you the piece we just heard sounds different every day? Like, it's always changing. The sounds rearrange.

Todd Whitney: Well, first I'd say that that's mighty tight. And second, I'd ask how.

Becca DeGregorio: Fair question, and I kind of want to hold off answering it until later in the episode. It has to do with technology.

Todd Whitney: Classic.

Becca DeGregorio: And the artist, Julianna Barwick, is just one of three people we spoke with for this episode about music and how artists are using technology to push the boundaries of their sound. And I mean that literally. This is about blurring the line between music maker and the music listener.

Todd Whitney: I love that. Our favorite artists are more accessible nowadays than ever before. They're constantly posting on social media, dropping new music through streaming services, and touring incessantly. But on this episode, what we want to do is show you how artists and listeners can interact on a more fundamental level, within the music itself.

Becca DeGregorio: I'll start with someone who truly straddles music and technology.

Peter Chilvers: My name is Peter Chilvers. I'm a musician and a software designer, and most of my work revolves around various projects with Brian Eno. Almost all of which involve finding some unusual way to use technology to make art better.

Becca DeGregorio: For those of you who don't know, Brian Eno is a bit of a legend in music. As a producer he's helped shape the sound of artists like the Talking Heads, David Bowie, and U2, but he's also got an experimental streak. Solo albums like Music for Airports and Discreet Music laid the foundation for ambient music in the '70s and pushed the boundaries of composition in popular music. We might not even be talking about the kind of projects we're digging into on this episode were it not for Brian's innovations. And in recent years, Peter has played a big role in what Brian does. He's Brian's music technologist. That's his actual title.

Peter Chilvers: Brian isn't himself a technologist; he doesn't know how to program. He understands the ideas of what can be done with computers and can, I think, see further into that. And because I'm a musician, I can see what he's trying to do.

Becca DeGregorio: Peter was a computer programmer through the '90s working on video games.

Peter Chilvers: At some point by very lucky chance a mutual friend introduced me to Brian Eno and I started working on a computer game called Spore, which was a very fascinating, huge open-ended computer game. We needed to tackle all sorts of ways of creating music that would just float endlessly in space really. So that got me working with Brian and we became friends while doing that, and I basically didn't leave the studio after that. Pretty much anything that involved a computer we did together.

Becca DeGregorio: Brian had long experimented with technology in his music, but in Peter he found someone who could really push him further. And in Brian, Peter found someone who could push him, too.

Todd Whitney: So it's kind of like some new-age producer and musician partnership?

Becca DeGregorio: Yeah, sort of. But he says it's a real back-and-forth when generating ideas. Brian comes up with something, Peter says it's not possible, Brian insists on trying, Peter discovers it is possible, and back again. And they have mutual creative interests, too.

Peter Chilvers: On my first trip to Brian's studio, he knew I was interested in generative music. Generative music is a system that creates music.

Becca DeGregorio: Music that evolves itself over time and changes. Brian's been at this for years, experimenting with different methods. Peter described one example where Brian would cue up a bunch of CD players, each playing a different piece of music of a different length.

Todd Whitney: Very retro.

Becca DeGregorio: Yeah.

Peter Chilvers: And they'd all have just different little elements of a composition on. So one might just have some bells on that came in occasionally, one might have mostly silence and just the occasional sort of washes of string noise. But by this very simple technique, you'd get quite complex music.

Becca DeGregorio: Now, this was obviously very clunky. Brian eventually moved this endeavor to computers, where he could create a more sophisticated set of rules governing his generative compositions. And this work led Brian to ideas that weren't even on his creative radar, like the idea that your audience wouldn't just listen to a generative composition, but actively participate in it. And that's where Bloom comes in. If you haven't heard of Bloom, well, put as simply as possible, it's an application that allows you to trigger musical sounds through touch and visual cues. It's a smartphone app these days, but it wasn't always. The project took seed almost right when Peter and Brian began working together.

Peter Chilvers: We'd had a meeting in the studio, he'd showed me a few of his techniques, and I kind of came home and played around with some software in the studio. This was on a PC using a mouse. So whenever you clicked on the screen, you'd see just a shape expand and sort of mutate, and that immediately felt like an interesting way to make music. It was quite a natural experience. It's just simply click a mouse on the screen, have a sound occur. By having these quite random elements, a person clicking on the screen triggering a sound, it echoing, and it all being in the right sort of key and using nice sounds, which Brian had designed, the user feels like they're creating the piece. They feel at the heart of it. That's something very, very unusual I think in music.
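To make what Peter is describing a bit more concrete, here is a minimal sketch of that kind of interaction under some illustrative assumptions: a tap becomes a note snapped to a pre-chosen scale, and every note keeps echoing and fading. The scale, decay rate, and function names are invented for illustration; this is not Bloom's actual code.

    import random
    import time

    # A toy version of the interaction described above: a tap (here a random
    # screen position standing in for a mouse click) becomes a note snapped to
    # a pleasant scale, and every note keeps echoing at a fixed interval while
    # its volume fades. The scale, numbers, and names are illustrative only.

    SCALE = [60, 62, 64, 67, 69, 72, 74, 76]  # MIDI notes, roughly C major pentatonic

    def tap_to_note(x, y):
        """Map a tap position (both 0.0-1.0) to a pitch and a starting volume."""
        index = int(y * (len(SCALE) - 1))
        return {"pitch": SCALE[index], "volume": 0.5 + 0.5 * x}

    def play(note):
        # Stand-in for a real synth or MIDI call.
        print(f"note {note['pitch']} at volume {note['volume']:.2f}")

    def run(taps=3, echoes=4, echo_gap=0.5):
        notes = [tap_to_note(random.random(), random.random()) for _ in range(taps)]
        for _ in range(echoes):        # every note repeats and fades
            for note in notes:
                play(note)
                note["volume"] *= 0.7  # decay on each echo
            time.sleep(echo_gap)

    run()

Even with random taps, the fixed scale and the steady echoes keep the output sounding coherent, which is the effect Peter describes: the person tapping feels like they are composing the piece.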

Becca DeGregorio: And all of this development predated smartphones and tablets. When the mid-2000s arrived, putting touchscreens into wide use, the technology seemed to align perfectly with the aims of the project. A mouse click on a PC would become a screen tap on something that lived in your pocket. Bloom was ready for release, so it launched on smartphones. This kind of experience, kind of game…

Todd Whitney: Kind of an album?

Becca DeGregorio: Definitely an album of some sort. Now, Bloom could have stopped there as a hard-to-categorize app, but –

Peter Chilvers: Whenever a new technology comes along, we'll think, "Well, what's possible with this now? What out of the things we've done in the past, could we do with this new technology?" So when virtual reality, augmented reality, when these came along, then immediately we started getting very interested in Bloom again and the HoloLens became one of the very interesting first launch points for this.

Becca DeGregorio: So what happens next?

Peter Chilvers: We're in Brian's studio, I think it was a December evening, so it was quite dark outside, and that meant anything we did on the HoloLens just lit up beautifully in the room. So we had all these little orbs floating around. I remember wandering around the room tapping the air and being quite enchanted by the experience. It's hard to explain to someone who hasn't actually put on a HoloLens just how compelling it is to see a shape hovering in the air in front of you, one you technically know is not real but still want to go and touch. I never, in the whole time I was developing on the HoloLens, stopped trying to touch these things I was creating in space.

Becca DeGregorio: Enter Bloom: Open Space, a kind of Bloom live performance. That thing happening on your phone screen suddenly happening all around you. At a warehouse in Amsterdam, Peter and Brian created a space enclosed by giant projection screens. About a dozen guests at a time donned HoloLenses, walked inside, and started building a Bloom composition together.

Peter Chilvers: They'd start tapping and you'd see a little cloud of Blooms around them. You'd be tapping, you see clouds of Blooms. The sound you're making would be appearing on the screens, and you suddenly realize you're all part of a big collaborative piece of music. No one person is a soloist, no one person is more important than the other, but it was something very nice. You see people laughing together, actually deliberately trying to produce Blooms in the same space or creating little towers and little clouds around them.

Todd Whitney: I'd imagine they got a lot of different reactions and compositions?

Becca DeGregorio: Yeah, and apparently that was the beauty in it. Peter says some people ran around the room tapping everywhere and making as many Blooms as they could. Others hung back a little. The software kept everything in check, making sure the music never got completely out of hand, but the audience itself was in the driver's seat. Without them, the room would have been silent. But Peter says a big motivation behind Bloom: Open Space is to put this very specific experience in front of people and have them enjoy it for its own sake. Slow down a little, unplug...

Todd Whitney: But not really, because you're wearing a HoloLens.

Becca DeGregorio: Good point. But this concept of shifting focus, it's powerful.

Peter Chilvers: I feel we're in the middle of an attention war really at the moment. There's just so many things trying to grab your attention and so to go into a space and just do one thing, that's quite refreshing. To just calm down, to not be checking your phone, hopefully not taking selfies. That was quite nice.

Todd Whitney: Peter Chilvers and Brian Eno used emerging tech to build on the kind of music they've been making for a long time, but what happens when an artist gets the opportunity to crack open their music for the first time?

Matthew Dear: Hi, my name is Matthew Dear and I'm a myriad of things.

Todd Whitney: Matthew, who everyone calls Matt, is a producer who makes everything from synth-pop to big room techno tracks. Like anyone making electronic music, he uses technology as part of his art, but he'd never call himself a technologist first and foremost.

Matthew Dear: I don't open the hood of my instruments very often. I'm not that kind of a tinkerer.

Todd Whitney: But in 2015, Matt got a pitch to present music in a way he'd never thought of before. As a songwriter he'd stood in front of a crowd and fed off of its energy while he played his songs, and as a DJ he'd created a type of immersive experience every weekend in nightclubs. But for this commission he got, he really wouldn't need to be present at all for the performance.

Matthew Dear: I think I hung around for the first like 15 minutes or so until one of the ushers said, "Sir, you have to leave. Other people want to check it out."

Becca DeGregorio: Okay, now what are you talking about?

Todd Whitney: So picture this: You walk through a nondescript door right off of a busy downtown street in New York City. Everything's darkly colored, nothing but blacks and grays. And as your eyes adjust –

Matthew Dear: And you start to notice that there's a whole bunch of stuff there. There's nets on the wall, there's scrim, which is like a really thin fabric that you can kind of see through, kind of not. There's lights, and you start to hear something. You start to hear some sound. Very ambient, very amorphous.

Todd Whitney: You're inside DELQA, an immersive sound experience, which had a limited run at the New Museum a few years ago. Visitors stepped inside of Matt’s music and could control the sound landscape from within. Touch the fabric walls, pull on some of the rope, and the music would just shift.

Matthew Dear: You start to realize that other people are touching some things and then starting to interact with the actual space. And then you realize, "Oh, okay. Well, maybe what I'm doing here is controlling some of the sound." And that's what it was doing. You start to play with things and you start to touch things and you realize that you can make sounds louder and quieter, faster, slower by touching certain parts of the experience.

Todd Whitney: Matt worked with a team of creative technologists to connect his sound to his audience's actions. They used Kinect, a Microsoft device that reads your body and assigns certain functions to certain movements. So basically, it turns your whole body into a controller.

Becca DeGregorio: This was the thing they developed for Xbox, right?

Todd Whitney: Yeah. Same technology, but a couple of years ago folks started to realize they could use the Kinect for all kinds of projects. Scanning for 3D printing, interactive dance installations. And that's part of the reason why Matt was asked to do this in the first place, to put this technology to work in a musical capacity.

Matthew Dear: So if you raise your hand in the air, you can make that do something. If you push on something, you can tell the cameras to look for that and to translate that into data that then could be sent to something that I did sound-wise.

Todd Whitney: So as guests moved through the space, pressing on the stretchy fabric walls and playing with the rope structures, an army of Kinects translated those manipulations into sounds and musical motifs from Matt's composition. And the piece would take shape in a way that was totally unique to those in the room.
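A rough sketch may help picture the plumbing. Assume, for illustration, that the depth camera's readings arrive as simple numbers between zero and one (in reality they come through the Kinect SDK as richer body and depth data); the core idea is just mapping those readings onto sound parameters. The function names and values below are hypothetical stand-ins, not DELQA's actual code.

    # A rough sketch of the mapping just described: readings from a Kinect-style
    # depth camera become sound parameters. The real installation used the Kinect
    # SDK and Matt's own sounds; read_hand_height and read_wall_stretch below are
    # hypothetical stand-ins for those sensor values.

    def read_hand_height():
        """Pretend sensor reading: 0.0 = hand at the floor, 1.0 = hand overhead."""
        return 0.8

    def read_wall_stretch():
        """Pretend sensor reading: how far a fabric panel is pushed in, 0.0-1.0."""
        return 0.3

    def scale(value, low, high):
        """Map a 0.0-1.0 sensor reading onto a sound parameter's range."""
        return low + value * (high - low)

    def update_sound():
        # Raising a hand brings a layer up; leaning into the wall slows the tempo.
        return {
            "pad_volume": scale(read_hand_height(), 0.0, 1.0),
            "tempo_bpm": scale(1.0 - read_wall_stretch(), 60, 120),
        }

    print(update_sound())  # {'pad_volume': 0.8, 'tempo_bpm': 102.0}

However the real mappings were wired, the effect Matt describes is the same: a gesture becomes data, and the data nudges his composition.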

Matthew Dear: You kind of knew a lot of the magic tricks, but you had to let people figure those out for themselves. And I'm guessing now that you think about that, I mean probably half of the things weren't really even used the way that we thought they would be used, which is that's completely fine. That's how it goes. That's art.

Todd Whitney: DELQA was just a moment in Matt's career. The scale and quirks of the project meant that he couldn't take it on tour, and he's gone back to the usual schedule of writing albums, playing shows, and DJing. But Matt's brush with interactive technologies and making his audience an integral part of his music? He says DELQA was a high point for him artistically and a project that forced him to consider his listeners in ways he never had before.

Matthew Dear: DELQA is like the extreme version of taking everything apart, and opening it up, and letting people inside of it. It was definitely an evolution in my career that I don't think I will top in that section of my career in terms of experimenting with light, sound, and the human body. There's no greater achievement than, I think, that one.

Jordan Rothlein: Yeah.

Becca DeGregorio: I guess it's got to be this one.

Jordan Rothlein: Yeah, I think this is what it looks like.

Becca DeGregorio: Which brought me to this musician, Julianna Barwick, and her home studio in a quiet corner of Los Angeles.

Todd Whitney: Who we heard at the beginning, right?

Becca DeGregorio: Yes. I'm finally getting around to explaining what we heard at the top. Now, you probably heard some vocal lines in that track. Most of her music has this ambient, soundscape-y vibe where she uses her vocals as an instrument. Full disclosure: I've been listening to Julianna ever since I was a choir nerd in high school. But for the project I came to speak with her about, she used more than that: synths, pads, and a tool she'd never played with before to trigger her sounds.

Julianna Barwick: I was just interested right away because I have never used AI. That was totally new territory.

Todd Whitney: Ah, another AI story.

Becca DeGregorio: Yeah, but for a really creative purpose.

Todd Whitney: So what was this for, an album?

Becca DeGregorio: No. Actually, when she got this pitch, she was doing her best to avoid the album cycle. It tired her out.

Todd Whitney: So what was the pitch?

Becca DeGregorio: Get this – to compose a lobby score for a new hotel on the Lower East Side of Manhattan, which I should say is personal to Julianna. She lived in New York for 16 years before moving to LA, and the Lower East Side played a big role in her development as an artist while she was there.

Todd Whitney: Okay, that's cool. When I think of a hotel lobby score though, I'm really thinking, like, standard piano.

Becca DeGregorio: Yeah, like light awkward jazz. Well, the twist with Julianna's score is she's got to use AI.

Todd Whitney: To do what?

Becca DeGregorio: To arrange it: to take her sounds and trigger them based on what's going on in the sky overhead.

Julianna Barwick: Well, the idea was to have a camera perched on the roof of the hotel so that the camera would be always on, and the AI would be reading the information. The events would be an airplane going by in the sky, a bird flying by, clouds, bright sunshine, nighttime. Like, all of those things. The AI was formulated to read and understand what those particular events were.

Becca DeGregorio: And these events corresponded with specific sounds she'd composed. She made five pieces in total corresponding to different times of day. And together they play throughout the hotel, which is called Sister City, for a full 24 hours.

Julianna Barwick: Every day it's never going to be the same thing because one event is read by the AI and that changes every day. And then that goes into the score and it generates and evolves. So it's ever changing.

Becca DeGregorio: She had help on the technology front: a computer vision service on Microsoft's Azure Cloud sat behind the camera. But the compositional challenge was all on Julianna.
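To picture how a setup like this could work, here is a minimal sketch, assuming the vision service returns simple text tags for what it sees in each frame: each tag is looked up against a clip composed for that event. The function name and filenames are invented for illustration; this is not the actual Sister City code or the real Azure API.

    # A simplified sketch of the setup as described: a vision service tags what
    # the rooftop camera sees, and each tag triggers an audio clip composed for
    # that event. get_sky_tags and the clip filenames are hypothetical
    # placeholders, not the actual Sister City system.

    EVENT_CLIPS = {
        "airplane": "airplanes.wav",
        "bird": "birds.wav",
        "cloud": "clouds.wav",
        "sun": "sunshine.wav",
        "night": "night.wav",
    }

    def get_sky_tags():
        """Stand-in for a call to the computer vision service analyzing the camera feed."""
        return ["cloud", "bird"]

    def clips_to_trigger(tags):
        """Pick the clips whose events the camera just saw."""
        return [EVENT_CLIPS[tag] for tag in tags if tag in EVENT_CLIPS]

    print(clips_to_trigger(get_sky_tags()))  # ['clouds.wav', 'birds.wav']

Because the sky never looks the same twice, the same lookup keeps producing a different mix of Julianna's sounds, which is why the score changes every day.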

Todd Whitney: So how did she compose something like this?

Julianna Barwick: Well, I just started from the beginning. I was kind of noodling around with synths and things like that and came up with chord progressions that I liked and thought would fit. And then I built on top of that.

Becca DeGregorio: So Julianna showed me her session for the morning piece, and when she did, I saw this matrix of audio clips, each one labeled something like airplanes, sunshine, birds: things that can be seen by the camera pointed at the New York sky. And she played me a little.

Julianna Barwick: All Right. This is “Morning.”

Julianna Barwick: So what we just listened to was everything all at once. So it's all of the vocal tracks, all of the basslines all at once. All of the sample sounds for the events. So the sound of what's happening in the lobby is completely different from this.

Julianna Barwick: Okay, here we go. Sun and afternoon.

Becca DeGregorio: That sounds sunny to me.

Julianna Barwick: So that was probably the thought process behind that. You know?

Becca DeGregorio: Sounds like a very light xylophone, kind of?

Julianna Barwick: Yeah.

Becca DeGregorio: I feel like I can really hear too that that's a New York sun as opposed to somewhere like here –

Julianna Barwick: Like LA.

Becca DeGregorio: Yeah, something kind of faint and varying.

Becca DeGregorio: I could hear that she'd injected emotion and personality into these sounds, and that kind of humanizes all the technology sitting between her music and what visitors to the Sister City hotel would eventually hear. Sister City opened in May 2019, which is when Julianna actually heard the piece triggered by AI for the first time.

Julianna Barwick: When I was listening to it for the first time, live in the space, live in the lobby, and finally being able to hear what we'd all been working towards for so long, it was really magical and wonderful. Immediately, when I would hear a little ping or something, I was like, "Oh, I wonder if that's a bird?" And I wanted people in this space to be doing exactly what I immediately found myself doing, which was understanding the project and what the AI is doing, to then be like, "What was that? I wonder what that was."

Becca DeGregorio: I think that makes the Sister City piece fundamentally different from the kind of music you hear in hotel lobbies, and not just on a technical level. It's ambient, but it's not passive. It's meant to bring you into a dialogue with the city to draw your attention to the little random moments happening all around us.

Julianna Barwick: AI feels kind of like maybe my band mate that never ever goes to sleep and can trigger the sounds that I've made and use whatever inputs it's gathering to kind of make things happen fresh that I can't do because I'm not there.

Becca DeGregorio: How much of this score feels yours?

Julianna Barwick: I would say it's nearly 50/50 for me. I know that I provided all of the sounds, but the score would not be what it is without all of the other components. I really feel proud of this project. I feel like I did my best to fulfill all the requirements that were set, but I think that it has a soul at the same time.

Becca DeGregorio: Probably gratifying too that it continues writing itself.

Julianna Barwick: I mean, that's just the coolest thing about this, is that it's morphing as we speak in some way that I have no idea what it could be doing, depending on what's happening in New York City skies right now. So it is cool. It lives on.

Julianna Barwick: It's just amazing to think how quickly this last year has gone by, and it's kind of a stand-in for me since I moved away from New York. I can kind of be sitting there in my favorite neighborhood of New York City. It's pretty cool. Bordering on emo.

Todd Whitney: Definitely bordering on emo.

Becca DeGregorio: Yeah. But isn't that wild that AI can get you there, to an emo state of being? I mean, I think what we learned from talking to these folks is that technology doesn't necessarily sanitize the creative process.

Todd Whitney: Right? I mean, all these projects kind of seem to extend how we define music and how we get to experience it.

Becca DeGregorio: Now, I'm sure that more traditional songwriting and concerts aren't going anywhere. I still love those things and all the artists we spoke to, they do too.

Todd Whitney: I'm with that, but the cool thing about these projects is that they really point toward a future where artists can have a different type of relationship with their fans, and fans can really have a different kind of relationship with the music.

Becca DeGregorio: Totally. And with the wider world. Projects like these, they get us out of our headphones a little and really connect us in a new way.

Becca DeGregorio: To learn more about all the people and stories featured in this episode, visit microsoft.com/inculture. If you're dying to see what Bloom: Open Space and DELQA looked like in action, you can watch videos and view photos from the events. You'll also find a deeper dive into Julianna Barwick's Sister City score and the technology behind it. And for more sights from the series, find us on Instagram @MicrosoftInCulture. In Culture is hosted by me, Becca DeGregorio, and Todd Whitney. Produced by Jordan Rothlein and edited and mixed by Nat Weiner. Original music by Angular Wave Research. This episode also featured some beautiful music from Julianna Barwick's Sister City score. Special thanks to the artist and Secretly Group for letting us use it. In Culture is a production of Microsoft in collaboration with Listen, a sensory experience company in New York City.