Lauren Lee McCarthy
James Parker (00:00:00) - All right. Thanks, Lauren, for taking the time to speak with us. Would you mind just introducing yourself and saying a little bit about who you are and what you do?

Lauren Lee McCarthy (00:00:08) - Sure. My name is Lauren Lee McCarthy, and I'm an artist. Um, I'm based in Los Angeles. And I guess in my work, I'm really interested in the way that our relationships and our understanding of identity are shifting, you know, a lot to do with the technologies that are increasingly occupying every aspect of our lives. And so I make a lot of projects that are kind of mixing performance, software, film, and other artifacts.


James Parker (00:00:46) - You know, Lauren is the one that we sort of first came across and that really jumped out to us as being relevant to Machine Listening specifically, or kind of the way in which it isn't, but maybe is in some ways, Machine Listening. But I also know that that's very closely related to Someone. So, I don't know if you think of them as a pair, or if it makes sense to speak about those works together, or...

Lauren Lee McCarthy (00:01:12) - Yeah, I really feel like, uh, I think of Someone as another... like, Lauren is a kind of starting point for an exploration, so I'll talk about that one first. I was noticing a few years ago the proliferation of smart devices and AI into the home, and just thinking about what that meant in terms of privacy and the kind of boundaries around a very intimate, personal space like the home. Like, it almost feels even more intimate than your pocket in some ways, because you have these different ideas of what happens there. And so I just started thinking a lot about that and interacting with devices like Alexa and Google Home and things like that. And I guess one thing that I sort of realized in that process was that, you know, because whenever I'm working with a technology, I'm thinking about, like, how do I personally relate to this?

Lauren Lee McCarthy (00:02:09) - And, uh, I think a lot of the time we are given new technologies and it's hard to find the metaphor to understand our relationship to them, and so we kind of just accept it. So I was looking for that relationship, and I realized what I felt most was that I was kind of jealous of Alexa, that I'm this very shy person, and especially when I'm first meeting someone, um, it's very hard for me to figure out how to get through, or get to a place of, of more intimacy. And then I was seeing this device where people just, like, take it in and pop it down in the house and start talking to it and sharing everything. And so I was kind of fantasizing about this idea of becoming Alexa and getting to do that in people's homes.

Lauren Lee McCarthy (00:02:50) - Um, and so then I thought, okay, well, maybe I can. The piece is called Lauren because I designed this service where, you know, instead of Alexa, it's me that you can get in your home. People sign up on this website, and then I install a series of devices: cameras, microphones, door locks, lights, and other appliances. And then I leave and I remotely watch over that person, 24 hours a day, sleeping when they sleep and interacting with them. And they basically interact with me as if I'm Alexa, but they know there's a human there. So I have a digital voice and I'm using text-to-speech. And, um, not only can I talk to them, but I can also control every aspect of their home. And so I guess I set it up as my goal to try and become, or to try and be better than, Alexa by drawing on my humanity, or my ability to observe.

Lauren Lee McCarthy (00:03:46) - And so I would also go a step further than Alexa in terms of just taking action without them even asking, sometimes just deciding that that's what they might need or want at that moment. And so typically those performances would last anywhere from a few days to a week, and I did them in different people's homes. And yeah, so that's the Lauren project, and that kind of opened up a larger investigation, where then I started thinking about different ways to think about this kind of relationship. Um, and so Someone was a piece where, I guess, one of the things I was feeling was, you know, whenever I'm dealing with performance, there's the participant and the performer, and those are different experiences. And so I was interested in sharing the experience I had as the performer, because it was just so affecting.

Lauren Lee McCarthy (00:04:36) - Um, and seeing if I could bring other people into that. And Someone was kind of a larger scale, so there were four different homes at the same time that all have this system. And then you go into a gallery, and it looks kind of like a command center. It's some kind of cross between something like a WeWork and a call center. And you're sitting there with headphones and you've got a laptop; there are four different laptops, and each one points to a different home. And so the visitors could come in and actually just...

Lauren Lee McCarthy (00:05:07) - Peek into people's homes and then control them and fulfill that role. So it's called someone because, you know, instead of calling out for Lauren, the people in their homes would call out someone turn on the lights or someone can, you, you know, lock the door. And even if you weren't sitting at a table, if you're in the gallery, you might hear the sound from it, you know, calling out for someone and you know, some little bit go to the table, hopefully and help them out.

James Parker (00:05:30) - What was it like fulfilling that role, for you personally? And, you know, were you able to effectively share that experience with the gallery goers? Because I imagine that there's something going on in terms of the temporality, plus, you know, the proximity and intimacy, that's hard to reproduce.

Lauren Lee McCarthy (00:05:54) - Yeah, totally. I mean, I think when I did it, um, I was thinking a lot about control, and the way that I had control over their home, but also, in this commitment to be present with them and watching every moment that they were awake or around, they had a certain control over me. You know, like, I would need to use the bathroom and I would have to take my laptop with me in case they called out in the meantime. Or they might be doing something very monotonous, you know, just watching TV or reading a book. For them that's relaxing, but for me, I can't do anything but watch them do that, just waiting for the moment, you know, an hour or two in, when they're like, 'Oh, Lauren, could you change the song?' Or, you know, 'Look up this word for me.'

Lauren Lee McCarthy (00:06:40) - And I was really interested in the way that people felt this comfort in doing that, whereas if a person was in your house, maybe you would feel more of a need to accommodate them. So I was interested in that idea of making myself into a system, and trying to have that carry through to the extent that that person felt that. So, yeah, and there was definitely a relationship that developed really clearly between the two of us that felt very intimate and personal by the end. Then with Someone, I mean, you've named exactly the challenge, that it's not one person committed to watching this, like spending a week of their life watching the person. It's a bunch of people coming through and maybe spending one minute, or maybe, you know, some people would sit there for longer, like half an hour, but totally different.

Lauren Lee McCarthy (00:07:28) - Right. And yeah, they didn't have the same experience that I had, necessarily, because that durational factor was not there. But I think that they got that feeling of... you know, I would notice people kind of sit down and play around with it, and then at some moment see a person enter the frame, and there'd be this click of, like, oh, this is real, this is real life that's happening, you know? And the thing that I was just kind of banging on, the interface, was an interface to someone's life, or someone's home. And the other thing that I noticed was that, even though there wasn't this continuity of one human behind it, the people in their homes really started to develop relationships with the system. And so that's where it got really interesting and murky for me, because, you know, I know people develop relationships with devices all the time, but it wasn't quite that, because they knew there were humans there too. So they kind of had a relationship with the idea of a human behind this system, without it being a specific individual. And so Someone in their home became this character for them, but that character was enacted or played by many different people over time.

Sean Dockray (00:08:44) - I was thinking, in the way that you were just describing it, also maybe there's something in the name, Someone, which is an acknowledgement that there might be no one, you know? Like, when you're calling out for someone, but deep down, you sort of admit that there might not be anyone on the other end. Um, which also seems to be a bit of a difference, in that the responsibility that you felt personally to being there throughout the entire duration is not necessarily felt, yeah, in Someone, you know, where it's a rotating shift of sort of more or less delegated or outsourced agents, let's say. And so I guess what I'm wondering about is the role of, like, the failures, you know, like when someone's call would go unresponded to. Like, I guess in those two different systems, there are the possibilities of their request being sort of misacknowledged, or acknowledged wrongly, maybe, in one case, or, like, you know, interpreted by a human, but in the other case just maybe missed entirely. And yeah, how the participants sort of, you know, calibrated their expectations, thinking about them as performers as well. And, um, yeah, the differences between those two systems.

Lauren Lee McCarthy (00:10:10) - There's a way in which we kind of accommodate technologies, you know, we understand the ways to make them work, even when they're not behaving in the way they are necessarily designed. Um, I think we do that already with apps and devices, and some of it is actually techniques for getting things to work again, and some of it is more like folklore, you know, like, oh, if you do this thing with your phone here, then that will work, or whatever. Um, people kind of come up with their own theories, and I felt that kind of dynamic really strongly in both cases. So, you know, there were moments where I realized, when I was playing Alexa or when I was Lauren, that I was like, wow, I'm just so much less efficient than Alexa. You know, they would say, 'Can you turn on the hairdryer?'

Lauren Lee McCarthy (00:10:58) - And then I'm like, 'Oh yeah,' and I hit the button, and I'm like, 'Oh wait, no, that's the faucet, wait, hold on.' You know, I'm kind of scrambling in a way that a machine wouldn't. And to the extent that there were moments... I remember one moment really clearly where someone asked, like, 'Hey, did I take my medication earlier? I can't remember.' And, uh, all I could really do is, you know, I can't analyze all the footage on the spot, but I could jump around to different points and try and find it. And I was really aware of this feeling of, like, oh, if I was an algorithm, I would just feel more confident in my answer right now, but all I can give you is just kind of a human guess or response.

Lauren Lee McCarthy (00:11:38) - And, but anyway, where I was going, I guess, was that the people on the other end were very... you know, sometimes you see people get frustrated at their technology or device because it's not doing the thing they want. In this case, I think knowing there was a human there gave them this, like, patience. So they were happy to have that familiarity, like that humanness of not being able to get everything right, or do the thing quickly, was present in the system. And then with Someone, I think it was similar, but there was definitely more of an, at times, adversarial relationship, you know, like points where they would be asking someone to do something. Like, there's one moment where a woman was cooking, and she's cooking in the dark, and she was like, 'Someone, could you turn on the lights?'

Lauren Lee McCarthy (00:12:30) - And there's obviously someone there, like, someone's manipulating the interface, but they're not turning on the lights for her. And, you know, it kind of added to the... it felt like an exploration of, um, what these personalities could be like. It's funny, because there's a part of this work where, afterwards, or in this process, I felt like, oh, I have so many ideas for what Alexa's personality could be like now, or where this could go. Um, but I'm very critical. Like, you know, I don't want to go get a job working at Amazon, but it just opened my eyes to a lot of really interesting possibilities I didn't really know what to do with after that. And so I think something that emerged through all these different performances and experiments was just seeing the range of ways people could relate to these things, which felt much more open and creative and interesting than the way I think we often interact with these systems, which is very, um, structured. There's a certain amount of distance, even with this kind of personal space that you're interacting with the device in.

James Parker (00:13:41) - Do you have any reflections on the gender dynamics of your experience? I mean, I'm just thinking back to a conversation with Yolande Strengers and Jenny Kennedy about The Smart Wife, and they talk about, you know, the model of femininity that some of these devices embody, which is all about compliance and, of course, domestic labor, and they also invite these companies to think about, you know, different versions, different kinds of personalities. And I just wondered if you felt strongly that there was a gender dynamic in your relationship. Um, yeah, or if there are any other reflections along those lines.

Lauren Lee McCarthy (00:14:30) - Yeah, definitely. I think I'm exploring that in these works, but it's not so pointed, you know, I'm not trying to put a point on it, I'm more kind of asking the question. Um, you know, because it is a feminine voice in Lauren, which made more sense, I guess, because it's me, but I was also interested in that way that we attribute a personality like Alexa or Siri to a woman. You know, a lot of people have written about this, but I guess I was thinking about, like, what is it about my femininity, or the fact that I'm a woman, that makes me able to embody this role, in terms of our conception of what these systems should be? So it was kind of playing with that. And it's similar in Someone too, again, it's a female voice, and I think it maybe pushes it a little further because, uh,

Lauren Lee McCarthy (00:15:22) - You know, it could be anyone of any gender sitting down at that table, but they are given this voice that sounds feminine. And so there's some question there about, like, how do you fit yourself into that audio space, and what does that feel like? And in a lot of my work, there are a lot of pieces that I've done where, if I were a man doing them, they would have a very different reading, um, or I might not be able to do them at all. Right? Because, um, on one hand, you know, women are so much more often the target of, you know, the gaze, or just, uh, things like, you know, stalking, or being tracked in different ways. Or, not just thinking about gender, but thinking about race and religion and identity in different ways, right?

Lauren Lee McCarthy (00:16:12) - There are some people in the population where, you know, for them, surveillance feels almost novel to be experiencing, um, so explicitly, and for others it's much more of a daily reality, right? And so maybe they're not the ones that would opt to have this in their home or, you know, sign up for that. So, I guess, I'm interested in... there's definitely the dynamic of who is privileged and who is not privileged within the system, who is tracked and not, but I'm also interested in, you know, by occupying this role, what does that allow me to see? What vantage point does that allow me to take? So it's not just about who's being seen or watched, or who's more familiar with that tracking, but, like, because you're not seen as a threat, um, you're seen as the target, what does that offer you in terms of looking back?

Lauren Lee McCarthy (00:17:08) - But yeah, I have done other pieces. So another part of the series, which is called Waking Agents, was, um, a series of smart pillows. And in that one, there were dedicated performers for each pillow, and people would lie down with this thing, and it would talk to them, and it could play music. And with that piece, um, well, just to finish the description, the visitors were not told there's a human on the other end. They were just told, this is embedded with intelligence, and it was up to them to interpret that. And so most of them interpreted it as machine intelligence, and then there'd usually be some point in the conversation where they would have a moment of switching up their understanding, where they would go from, 'Oh, I thought I was talking to a machine,' to...

Lauren Lee McCarthy (00:17:51) - 'I realize I'm talking to a human.' And so I was kind of excited by that moment of switching contexts, because it also means you go from feeling like you're alone to realizing that you were with another person the whole time. Um, even if you were never really alone, because, you know, who knows what's on the other end of these technical systems. Um, but anyway, in that piece, the performers could actually choose, you could decide, do you want a male voice or a female voice? I would love to... I think we're getting to the point now where you could also have, like, a gender-neutral voice as an option in there too. So that was interesting too, because, you know, people wouldn't necessarily pick the one that matched their gender, but I think for them there was some questioning about, like, why choose one or the other, and they would get to see in real time, like, test it out: how does the interaction go differently depending on which of these voices they choose?

Joel Stern (00:18:43) - Um, the pillows are super interesting also because, you know, we'd been thinking a lot about, um, wake words, and the way that sort of being awake or being asleep is, I mean, it's sort of an anthropomorphizing of the Machine in a really literal sense, in that machines don't, you know, sleep or wake up. Um, but then, you know, obviously calling the work Lauren draws attention to the wake word and the sort of call to the Machine, calling the Machine into sort of action. Um, but in effect you're always Listening, and the wake word is not so much the call to wake up, but the call to sort of act on that Listening in a more transparent way. So I was just sort of thinking about whether there were instances where the call to act sort of precedes the wake word, you know, where you felt, I should act, I should do something, but I sort of haven't been called, you know... and then how that kind of dynamic plays out in terms of the ethics, because I think, again, this difference between human and machine, in relation to, um, when it becomes...

Joel Stern (00:20:05) - Sort of legitimate to act, when you're kind of authorized to act. That's an interesting question for us.

Lauren Lee McCarthy (00:20:11) - Yeah. That's so funny. I feel like I had this moment... I mean, I think I started by saying, like, a lot of these projects are just kind of these attempts to try to hack my own, you know, social shortcomings. And so, you know, it's like, I've got it, this is my way in. Um, yet I get in and I'm still me. And, um, I think I had one moment in one of these performances where I was like, oh my God, I just took all my anxieties and just embedded them in the, you know, system infrastructure of someone's house. Now it gets out of control, um, all my, like, anxieties about, should I act or should I not, or what should I say now? It's like, now it's distributed over your entire house. But I think that was always a question for me. Like, yes,

Lauren Lee McCarthy (00:21:01) - Obviously, if they kind of use the wake word or they command me, then it's clear to respond. But it's, uh, it's a relationship that's unfolding as this performance happens. And it's funny, because a lot of people go into it kind of thinking, oh, it'll be like a show, I think, and then there's not much show, it's just a situation that we're both in together and have to find our way through. And so how much I act beyond what is commanded was really different with each person. It was like trying to read them, having complete access to, you know, the camera feeds and information, but also being really aware of the times when that wasn't enough, and yet you're still trying to piece together a picture of them, because that was the role that you chose to occupy.

Lauren Lee McCarthy (00:21:54) - And so for moments like that, it was really ringing through my head, like, what are we doing when we're saying, oh, well, we'll apply this algorithm that's going to just figure out when things should happen? Right? It's going to be an incomplete picture that we're then making these certain judgements about, or the algorithm is making judgements about, in terms of that kind of acting. Um, I think the other side of it was, there were times where I was asked to act, and, well, I guess one example was, um, someone had a date over, and, you know, she kind of said, okay, this person's coming over, and, like, can you kind of set the mood and lighting? And I did that, and they came, and normally when someone else enters, I try to do something

Lauren Lee McCarthy (00:22:41) - So they notice my presence, you know, like, 'Hello, what is your name?' or something. But they just kind of started getting right down to business very quickly, and so there wasn't, like, a moment. Um, and so I'm just watching this thing, and then I'm kind of wondering, like, you know, what does this person know? Or, how am I complicit in this thing? Or am I just fulfilling my role here, and it's kind of her job to figure out what the boundaries are? So there were moments like that too, that were interesting and awkward. It was funny how that resolved, because at some point I was like, okay, well, I think I would like to leave them to this, and it was pretty late, so I was just like, you know, 'Good night, everyone, I'll be here tomorrow.' And at that moment he's kind of like, 'What's happening?' And, um, she was like, 'It's Lauren, remember I told you about her.' And I think at that moment he kind of noticed all these cameras everywhere, and then it was just kind of a shrug, 'Cool. Okay, great. Cool.'

Joel Stern (00:23:44) - Most people probably don't care. I mean, that's the sort of, um, punchline sometimes. Yeah. So were there moments where, uh... I mean, that's a good example of a moment where you suddenly, let's say, couldn't play the Machine, I mean, you had to confront a human problem of making a decision about how complicit you are in potentially intruding on this person who's come over without enough knowledge of what's going on. But, I mean, were there ever moments where the user or the participant sort of switched you off, like, said, 'I don't want this, I don't want you Listening'? You know, because a lot of the people we've been speaking to in this project have been sort of thinking about these questions of how not to be heard, and sort of coming up with strategies for, um, for doing that. So that kind of part of the relationship as well.

Lauren Lee McCarthy (00:24:44) - Yeah. And it's really important to me when I do these. I'm always thinking about consent so much in this work, and it's interesting, because it's like, people can consent to things, but if you haven't experienced it before, you don't necessarily know how you might feel, right? So how do you deal with consent in a situation like that? But one of the ways is just, like, when I'm installing it,

Lauren Lee McCarthy (00:25:09) - I work with them to say, where do we put the cameras? Are there areas where you don't want them? And then also letting them know, like, you can unplug the camera, you can cover it, you can turn it to the wall, you can tell me to stop watching. You can say, 'Lauren, shut down,' and I'll turn off everything immediately. So they feel like... I mean, obviously there's a trust there, but I try to make it clear that's an option. And so then I noticed people interact with it differently. Like, some people, it's cameras on, don't care, you know? Um, other people, when they're going to sleep for the night, or when they're changing, they'll just turn the camera around or cover it or something like that, which I just thought was a really sweet gesture in some way. I think each person kind of finds their own relationship with it, and I'm trying to be there for that.

James Parker (00:26:02) - I noticed that you said camera a lot of times in that answer. And I'm just... because the project is about Machine Listening, and one of the questions that we're interested in is, to what extent are any of the questions that we're asking specific to Listening or audition? And the answer is often, well, you know, not so much. Um, but sometimes, yeah. I guess I'd just be interested to know if you have any reflections on the different modalities of being with the person. Like, it seems that people were concerned about being seen, possibly in their kind of nakedness or what have you. Um, but I always think about smart assistants, that there's something about the vocality, so not so much the audibility, but the vocality, that's producing something, because of the longstanding connection between ideas of voice and intimacy and so on. So I just wondered if you could tease out any reflections you had on vision versus this, I mean, you know, even, I don't know, memory, or, you know, different ways in which we think about the kind of sensory or affective dimensions of being with someone in these ways.

Lauren Lee McCarthy (00:27:17) - Totally. Um, it's interesting. I mean, I was saying camera kind of as a stand-in for, like, an all-around recording device, you know, because the cameras have microphones built in. But I think you're right, some of those people that were turning them away, it was more about the image that they wanted to withhold versus the audio. But I've definitely gained a new appreciation for sound and Listening through these works. And I think the Waking Agents piece with the pillows was one example that kind of came out of that, because it's all about audio. Because, I mean, first of all, bandwidth-wise, it's just easier to deal with. But I found myself, in these performances, because I'm building the software and the technical infrastructure as well as performing them, uh, there's always some learning and some development that's happening along the way.

Lauren Lee McCarthy (00:28:12) - And I found myself picking up these different modalities to try to get clues about what was happening. So sometimes the camera does not give me any information, and so I'm relying completely on the audio to try and understand what's happening in the home. Either the image is not useful, or it's distracting because of the quality, or whatever's happening in it, or the lag. But also, that's the incoming, right, if I'm the one observing. But then, as you mentioned, there's the outgoing, and I think about these systems so much more in terms of that kind of audio experience of just having it in your house, and the fact that there isn't any screen, normally. I mean, I know there are some, you can get, like, the Alexa touch or whatever, but the idea that it's just audio and it's disembodied makes you feel like it's everywhere, right,

Lauren Lee McCarthy (00:29:02) - Instead of limited to just one device. I definitely played with that in the performances a lot. And I would have a few different speakers that I could connect to over Bluetooth and switch to different spaces of the house too, so I was often following them with my voice. Um, and yeah, as I mentioned, the smart pillow piece was all about audio, and I've really found that, I dunno, I feel like there's a way you can understand the space you're inhabiting so much better just by listening to the audio than by looking at the image, sometimes. Um, and that was something unexpected for me, because I think it hadn't been such a big part of my practice previously. And I think for them too, on the other end, the thing that really brought that sense of intimacy was the voice. It wasn't that I could control all the things and the lights and the appliances; it was the conversation, or the voice. And I use conversation really broadly, you know, with some people we would have long discussions, and with other people it would be very limited, but I think it was that continuity of audio that built the character for them.

Joel Stern (00:30:10) - I mean, there's also such a rich, um, sort of cultural history of the disembodied voice, whether it's in radio, or in cinema, or you think of AI voice characters like HAL from 2001, and, um, the kind of omnipresence of the disembodied voice. Like, specifically by not having a body, it can be everywhere at once and have a certain, you know, power that comes from being located sort of nowhere and everywhere. And also, I guess, it's the way in which that voice produces a sort of subjectivity, even though the voice might be Machine ... sort of, it feels like it's listening to you and it feels like it's speaking to you, um, even though listening and speaking might be human rather than machinic sort of qualities in a literal sense. You know, the Listening is audio processing and the speaking is this sort of form of synthesis, but we experience it as if we're interacting with a human, and there's a certain power.

Joel Stern (00:31:25) - I mean, one of the things that's maybe interesting to sort of think about in this cultural moment would be, you know, the way in which the voice of, say, Siri or Alexa, or in this case Lauren, is sort of, you know, friendly and helpful and understanding, whereas the voice of, say, you know, HAL, or previous disembodied voices, have a certain malevolence as they become conscious of their own sort of power and agency. So I'm just wondering about that, you know, and I guess that's one of the things that's in this project too, um, sort of thinking about the spectrum from the utopian to the dystopian horizons of Machine Listening. So, I mean, I feel like your project is sort of broadly optimistic, in that it's sort of strongly humanistic, but I just wonder, um, whether there have been moments where you have really had a sense of, you know, not so much the dangers of these forms of mediation, but the more dystopian sort of horizons.

Lauren Lee McCarthy (00:32:35) - Yeah. I think, I mean, these pieces are meant to be spaces for people to try to sort out some of their feelings or responses to the technology. Um, I have my own perspective, which is normally quite critical, um, but I'm trying not to impose that, and to let people find their own relationship. And also, I guess for me, there's always some part that's a little bit about hope, or at least some pleasure, um, because I think if there's not that, people just shut down, and I don't know how actionable that is if we want to actually imagine a different world. But that being said, in every performance there are moments, I think, well, I'm aiming for, where you feel some real sense of connection, and maybe in a way that you hadn't anticipated, you know, and that's kind of the hope, of like, oh, there's something different here. But then there are these moments that feel, I think, incredibly dystopic. Or, um, I've had some people interact with different pieces and just be like, wow, that was, like, a depth of darkness

Lauren Lee McCarthy (00:33:50) - I didn't know I could go to in an art piece. Um, and it's interesting, because the moments are not necessarily, you know... like, there's the time where I, uh, messed up the kind of power rating as I was translating this piece to Europe, and then the whole house went dark, and I had this moment where I thought, like, maybe I just burnt down their house, and, you know, who let me do this? And who's insuring this? Um, right. But those aren't necessarily the darker, the more dystopic moments; they're often in the kind of quieter times, where you're just like, oh, I feel close to you, and I feel this huge distance right now because of this remoteness. Or like, how did we end up in this? I know it's an art piece, but how did we end up here? Yeah. And I think those are the same things we feel when we use technology too, um, but maybe you don't have... I think the thing about feeling those while you're scrolling through your phone or whatever is that the whole system is set up to keep you just moving past it, whereas I guess what I'm trying to do here is just sit with that for a little, for a minute. I mean, you're also not a profit extraction Machine, unfortunately.

Lauren Lee McCarthy (00:35:06) - Well, artists are, you know... we just extract a little more profit.

Lauren Lee McCarthy (00:35:15) - Yeah, definitely different goals, right. And I think that's a big part of why the relationship feels different. I mean, yes, there's a human on the other side, but there's a certain trust that, I think, is just impossible to have with some of these other technologies.

Sean Dockray (00:35:28) - A quick thing that I was thinking about is just, um, on the one hand, there's, whether it's you or gallery goers, kind of sitting in for the role of, uh, you know, a corporate personality, substituting for the algorithm. But then at the same time, there's the, like, you know, massive kind of actual human labor underlying a lot of the algorithms... not just the algorithms, but sometimes humans kind of actually are performing that role. And so obviously the pieces seem to relate to that in many ways. I was just wondering if you'd say a few words on that, because I'm sure it's something you've thought about in the course of making the works.

Lauren Lee McCarthy (00:36:13) - Yeah, definitely. Um, and I'm glad you brought that up again, because I was actually starting to think about that when you were talking about the image versus the sound, or just what's seen and what's not seen. Yeah. I mean, everything from... you just dropped the link to, um, you know, what's called artificial artificial intelligence, right? Like Mechanical Turk, that's kind of their catchphrase. But, you know, I think after I started doing this project, there was a news headline that said, like, there actually are humans listening to Amazon Alexa, and it went further to talk about what that experience was like, which is, they're mostly doing quality control, but they're, you know, often overhearing things that could be extremely disturbing, and there's not a lot of guidance in terms of, like, what to do when you hear those things.

Lauren Lee McCarthy (00:36:58) - In this article they mentioned, like, what should you do when you hear that? Well, go to this, like, chat room where you can kind of commiserate with other people that are also doing this work and feeling traumatized. That seemed to be the response. So yeah, it's horrifying, you know, and it's on every system that we're interacting with. You know, there's a great documentary about, um, the workers behind Facebook, you know, that are filtering the content. And I don't think there are a lot of good answers in terms of, like, what is the way to address that? It's such a big problem and it's so invisible. And so, I dunno, hopefully there's something in this piece where people think about that aspect of it a little bit too. And also, like, what does it say?

Lauren Lee McCarthy (00:37:49) - You know, there's a specific audience that is experiencing these artworks. Um, as much as I'm interested in reaching a lot of different people, they are the people that would elect to sit behind this desk and do this as, like, a novel experience, right? Because it's far from the work that they do on a daily basis. You know, care work is one of the fastest growing job sectors, at least in this country, and how do we even think about that, or address what a big need that is, as we're also on this fast track to automating as much as possible? I think we'll start to realize there's something you can't automate. And are those jobs seen as, you know, things that we value, or is that human labor something that they try to make invisible completely?

James Parker (00:38:41) - I wonder if that's a good note to end on. Um, it could open up a whole new thing, uh, and I'm tempted, but, um, it was so interesting. Um, thanks so much, Lauren. Thank you.
