

André Dao
Auto-transcribed by reduct.video

André Dao (00:00:00) - So my name is André Dao. I guess I primarily see myself as a writer, and as a writer I’ve been working with a mix of different forms for a while now, chiefly oral history and fiction, combined with journalism somewhere in that mix. When I was thinking about how that work relates to your project, I suppose there’s always been this element of Listening involved in my work as a writer; listening and recording. That begins with my work in fiction, a family history that is really built around a series of conversations I had with my grandparents over many years, using various kinds of recording devices in their apartment in Paris. And coming out from that, I’ve done earlier work focused around immigration detention, and migration more generally, in Australia.

André Dao (00:01:23) - So I guess there’s this element of Listening in that work, but there are also these themes of migration and incarceration. That’s both in the immigration context and in my family history; there are these threads of being detained in places, waiting in places, but also moving across borders. So those are some of the things that I think float through my work as a writer. As a scholar, I guess I’d consider myself an extremely junior scholar; I’m about halfway through a PhD project at the moment, which is ostensibly on the question of big data and human rights. Fairly early on in that project, I felt that a lot of work around big data and human rights looks at what human rights law can do to tame big data, or else what big data can do to make human rights work more efficient.

André Dao (00:02:42) - In my PhD project, I’ve been more interested in what happens to our conception of human rights, as well as to the work of human rights, as actors within the human rights sphere take up the tools and rhetoric of big data. How that all comes together is something that I’m still thinking about. But I think if there’s a thread that ties together my work as a writer and artist and that scholarly work, it has something to do with the difference between being a human listener, in an oral history context or in a writerly context, and a machine listener, and what difference that makes to these various kinds of projects: human rights work, or fiction writing, or oral history.

Sean Dockray (00:03:45) - I know that we’re going to spend more time on the academic side of things through the interview, but I’m interested just to ask a quick question about form and method in your writing, and especially where and how you choose to use fiction, and when fiction seems to make sense, particularly when you’re balancing or moving between forms of oral history and fiction. Is it to fill gaps? What is it that fiction offers you in the writing process?

André Dao (00:04:25) - There’s definitely partly a slightly practical drive to fiction, at least in the context of my fiction writing, which is a family history project. There’s definitely a filling-gaps part to the turn to fiction, and also some sense of wanting...

André Dao (00:04:58) - ...to have a kind of almost plausible deniability when it comes to my relationship with my family. When this book gets published, publishing it as fiction, I think, makes a difference. But I think of it more as a matter of artistic form. For my writing, working in fiction allows me to focus more squarely on the work of memory as itself a kind of fictional process. And when I say that I’m working in fiction, I think there are lots of non-fiction writers I admire greatly who use very similar techniques; they happen to badge it, or their publishers badge it, as non-fiction, experimental non-fiction and so on, but it’s in that kind of realm. I’m thinking of, for example, Maria Tumarkin, who publishes as a non-fiction writer but uses a great many techniques not drawn strictly from non-fiction.

James Parker (00:06:12) - Should we maybe move on to Global Pulse and the work on big data? We could spend a long time talking about writing more generally, but you’ve been doing some work on this thing called the UN Global Pulse. I was utterly amazed to find out that this existed, and I’m sure that many other people would be too. Can you explain a little bit about what it is, where it comes from, and what kinds of projects are undertaken in its name?

André Dao (00:06:43) - Sure. So the Global Pulse is badged as the UN Secretary-General’s innovation initiative. Essentially, what that means is that it’s the UN’s attempt to get to grips with what it understands as big data technologies, largely based on machine learning, and to try and harness that technology to help fulfil its human rights and development mandates. Its headquarters is in New York, just a few blocks down from the UN headquarters, but it has set up two satellite labs, one in Uganda and one in Indonesia, and they basically produce prototypes of big data tools for development, data-gathering tools. So they might partner with other UN agencies like the UNHCR to, say, monitor Twitter feeds to pick up data about migration routes through the Mediterranean.

André Dao (00:08:01) - Then they might also work with, say, the Ugandan government to monitor Ugandan talkback radio to try and pick up early warnings of hate speech and the potential for violence against refugees in Uganda. Or they might use big data techniques to monitor and predict smoke and air pollution in Jakarta. I guess the impetus behind the Global Pulse seems to be the UN’s anxiety about its own competency to complete its work. I think one of the hidden stories of the Global Pulse is the UN’s ongoing financial crisis; it’s just run out of money. And it looks across at big tech and sees an opportunity to continue to do its work, in a way that they see both as making their work more efficient and, by taking up the rhetoric around big data, as attracting a certain kind of investment, whether from the technology companies themselves or from governments, particularly in, say, Scandinavian countries that are very excited about supporting data projects in places like Uganda and Indonesia.

André Dao (00:09:29) - And I think that side of the story becomes pretty clear when you look at the launch of the Global Pulse, back in, I think, 2012, where the then Secretary-General Ban Ki-moon launched the Pulse alongside former executives from Amazon and Apple. And the Global Pulse is headed up by a former head of Microsoft’s humanitarian systems.

André Dao (00:09:58) - It seems pretty clear, I think, even just from the personnel involved, that this is the UN’s attempt to ride that big tech wave.

James Parker (00:10:11) - Could you maybe give us an example of how one of these experiments or tools works? I was particularly interested in talking through what I think they call the radio content analysis tool, in terms of thinking about Machine Listening, or how Machine Listening might be deployed or produced by something like the UN. What is this radio content analysis tool?

André Dao (00:10:41) - So that’s one of the prototypes. Before I explain the radio content analysis tool, I did want to mention one odd thing about the Global Pulse in my research so far: whether it’s the radio content analysis tool or any of the other initial programs they set up, it’s never clear to what extent any of it gets taken up by other parts of the UN. They’ve been operating for eight years now, and it seems that they continually produce prototypes and push for more use of machine learning techniques throughout the UN, but the actual tools themselves don’t necessarily seem to have a lot of take-up. So the importance or the significance of the Global Pulse seems to lie elsewhere than in the actual use of this technology, at least at the moment, which is maybe something we can come back to. The radio content analysis tool is one of the prototypes developed by the Kampala lab in Uganda.

André Dao (00:11:46) - Its function is essentially to analyze Ugandan talkback radio. In the documentation about the tool, the Pulse really emphasizes this kind of overwhelming amount of audio that’s produced by Ugandan radio every day. And the point there is to really emphasize the incapacity of the human listener in that situation: there’s no possibility of having enough people listening to enough radio stations to capture everything that’s going on. And so that’s where machine learning comes in. The tool works by first filtering out music and ads from all of the radio played in Uganda, then using speech recognition software to produce transcripts of the talkback radio portions, and then applying keyword filtering, which has been trained by human analysts and human taggers, to pick out supposedly relevant snippets of conversation.

André Dao (00:13:02) - The diagram that the Pulse uses to illustrate this process starts with this really large box of unstructured data, and then that box gets smaller and smaller as it’s worked over by more machine learning processes, culminating in this kind of useful, tagged and categorized information at the end. And that useful information at the end varies. The examples they give include picking up instances of hate speech on Ugandan radio, aimed at vulnerable groups like refugees, but it could also be more in the vein of user feedback on services, so refugees calling in to complain about, say, the health and sanitation services in a refugee camp. So that’s the basic idea of it.
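To make that funnel concrete, here is a minimal sketch of the pipeline as André describes it: filter out music and ads, transcribe what remains, then reduce the transcripts to tagged snippets using human-curated keyword lists. The Pulse has not published its implementation, so every function name, data shape, and keyword below is an illustrative assumption, not its actual system.

```python
# Hedged sketch of the described funnel: unstructured radio audio is reduced,
# stage by stage, to a small set of tagged snippets for human review.
# All names and data shapes are hypothetical; Global Pulse's code is unpublished.

# Stage 3's keyword lists are compiled by human analysts, one list per topic.
KEYWORD_TAGS = {
    "hate_speech": ["<slur>", "<coded euphemism>"],   # placeholder terms only
    "service_feedback": ["sanitation", "clinic", "water"],
}

def filter_music_and_ads(segments):
    """Stage 1: keep only talkback speech (stand-in for an audio classifier)."""
    return [s for s in segments if s["kind"] == "speech"]

def transcribe(segment):
    """Stage 2: speech-to-text (stand-in for an ASR model for local languages)."""
    return segment["audio"]  # in this sketch the 'audio' field already holds text

def tag(text):
    """Stage 3: human-curated keyword filter over a transcript."""
    return [topic for topic, words in KEYWORD_TAGS.items()
            if any(w in text.lower() for w in words)]

def analyze(segments):
    """Run the funnel; only tagged snippets survive to the small box at the end."""
    tagged = []
    for seg in filter_music_and_ads(segments):
        text = transcribe(seg)
        topics = tag(text)
        if topics:
            tagged.append({"station": seg["station"], "text": text, "tags": topics})
    return tagged

# Example: two 30-second segments in, one tagged snippet out.
stream = [
    {"kind": "music", "station": "Radio X", "audio": "..."},
    {"kind": "speech", "station": "Radio X",
     "audio": "Caller says the camp's sanitation block has been broken for weeks"},
]
print(analyze(stream))  # -> one snippet tagged ['service_feedback']
```

As the conversation below makes clear, the weight of the system sits in that keyword table: whoever compiles the lists decides what counts as hate speech or feedback, and anything phrased in euphemism passes straight through.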

James Parker (00:14:00) - I should show my cards here, in a way, as somebody who works in a law faculty. What you were describing sounds intuitively plausible on some level, but as somebody who’s read hundreds and hundreds of pages of judgments, at the end of extremely long trials, debating whether or not something constitutes hate speech, I’m immediately thinking: how on earth? What kinds of things are being picked up on? Is it just references to ethnicities? What is constituting hate speech for the purposes of that analytic? Whether or not it works, I’d just be fascinated to know what they even think it would mean for a content analysis tool to be able to detect something like hate speech.

André Dao (00:14:58) - Yeah. From what I can see in their documentation, the process is necessarily reductive, and the ability to tag thirty-second snippets as potentially hate speech is pretty simplistic. With the sort of keyword filtering that they use, it does seem that it’s as simple as, in the end, a team of human designers coming up with lists of words that seem to be relevant to, say, hate speech. Yes.

James Parker (00:15:48) - Sorry. And those teams of designers are in New York, right? So that’s not a team of designers that, for example, knows the euphemisms in the local language. I’m just thinking of the Rwandan genocide and the specific ways in which euphemisms were deployed in the promotion of hate speech there. I don’t know how an algorithm would be able to detect that, if it’s New York designers who are inputting those analytics.

André Dao (00:16:17) - At least in the first instance. I guess the promise of the tool is that, as it gets rolled out, those things get refined. But in some ways your question, and particularly the reference to the legal procedures following the Rwandan genocide: it’s precisely that sort of lengthy, convoluted legal proceeding that the Pulse’s whole raison d’être is to short-circuit, right? This is in all the ways that they describe the tool: you’re dealing with situations of crisis; there’s a vast amount of data out there, described as ‘data exhaust’, so you get this picture of it being produced for free, produced as detritus following from people’s everyday lives. And particularly with that image of exhaust, it’s escaping; it needs to be captured quickly. Combined with the constant emphasis on crisis, and on real-time data, the point is that you don’t have the time for those sorts of legal niceties, or even for social and cultural understandings of euphemisms and so on. We just don’t have the time for it, and the UN needs to act almost instantly. So that’s where this technology comes in.

James Parker (00:18:00) - So it’s trading out judgment and justice mechanisms for a logic of preemption and prevention.

André Dao (00:18:10) - Absolutely. Prevention, yeah. It doesn’t really operate under a rubric of justice in that sense. It’s much more something like executive rule, or management.

Sean Dockray (00:18:27) - I can’t help but also think of the financial problems that you referenced at the beginning of all of this, and imagine that tools like this would be imagined as maybe revenue-generating on the one hand, but also cost-saving on the other, because then in theory you need fewer people on the ground who have this kind of deep knowledge of the meanings of language and so on. Instead you have a system that can just alert the right people to potential problems. So I think it’s interesting tying the development of this tool to the kind of financial problems the UN more generally is experiencing.

André Dao (00:19:20) - Yeah, one of the things I’m interested in there is that the rule of experts at an international organization like the UN has been criticized, I think, for bringing in, to both the human rights and development

André Dao (00:19:43) - spheres, right, these experts who produce this kind of technocratic rule. It seems like what the Pulse is doing goes beyond, or even sits outside of, that logic, because, as you say, it’s even bypassing that expert with the local knowledge or history, and moving towards a model where someone sitting in Manhattan is not even necessarily the one doing the monitoring; they’re simply the person that designed the algorithm, and the algorithm then does the monitoring.

Sean Dockray (00:20:28) - James, a little earlier, sort of slipped and, on the way to saying something else, said, ‘who’s writing this code?’ And in a few of our conversations there’s been talk about who is at the table at the conception and design of these systems. I’m just interested: do you have a sense of, on the one hand, who’s conceiving of these systems, but also who’s actually implementing them? What types of people are being employed to actually develop these systems?

André Dao (00:21:02) - Yeah. So it’s probably important to acknowledge that, while the Pulse is headquartered in New York, from what I can tell the Kampala lab and the Jakarta lab do employ a lot of local staff; they are employing local data scientists. I guess the question there for me, and what I haven’t yet followed up on, is where these data scientists were trained. Because, on a very surface level, you look at the staff photos for those two labs, and it seems like it is, say, a lot of Ugandans working on the radio content analysis tool, on some level. But there’s still a question in my mind of what kind of training, where were they trained?

André Dao (00:22:08) - What kind of assumptions about data do they bring to the project? On the question of who, another way that I’ve been thinking about that is not so much at the design level. One of the things that the Pulse says about the radio content analysis tool and about their other machine learning tools is that it fits within this rubric of participation. The UN has, for a long time now, been talking about people-centred development; there’s participation and informed consent; that sort of language is incorporated into a lot of their human rights work and their development work. And what that means seems to be shifting, at least when it comes to the involvement of the Global Pulse. Participation in a kind of human rights process, it seems, could just mean that your data is being scraped from a Twitter feed, or that some people’s calls to talkback radio in Uganda have been tagged and categorized and turned into a chart and read at UN headquarters in New York; that that’s a form of participation, or that that’s what makes development people-centred.

André Dao (00:23:51) - And so, yeah, I find that a very interesting potential shift.

James Parker (00:23:58) - I mean, it’s also a form of doublespeak, isn’t it? You wouldn’t say that counter-terrorism is people-centred, for example. I’m really struck by how similar what you’re describing is to many of the NSA’s programs that were revealed by Snowden, and in fact revealed precisely around the time when Global Pulse was being developed. There’s a quotation in the essay that you shared with us that blew my mind. This is the Independent Expert Advisory Group on the Data Revolution for Sustainable Development, which I kind of want to know more about, but you quote them as saying:

James Parker (00:24:43) - ‘Never again should it be possible to say “we didn’t know”. No one should be invisible. This is the world we want, a world that counts.’ So: a world in which we count everyone, all of the time. And that is exactly the same rhetoric as the NSA’s program of total capture in the name of counter-terrorism. One of the things that interests me is how it’s possible for an organization like the UN to speak that way without recognizing the incongruity. It seems like there must be a lot of faith in the idea of data for good, or just a fundamental commitment to their not being the NSA, so that it simply doesn’t present a problem. I was just amazed by the hubris of that as a goal for the UN. And I guess that slides into a question about what the UN itself understands to be the risks of the programs that this big data turn, or what have you, is developing, and how you think, alternatively, we might think about those risks.

André Dao (00:26:06) - Yeah, it’s quite odd, isn’t it, that surveillance as a word and as a concept just doesn’t seem to get picked up in this conversation. As soon as you’re in that kind of data-for-good or AI-for-good world, these quite similar-seeming practices just don’t get the label ‘surveillance’. It seems to be that surveillance has some tie to security, that we have this kind of conceptual link between the two, and that somehow in the AI-for-good realm, because it’s not explicitly based around security or law enforcement, it’s no longer surveillance. On that quote you read out, about the world that counts: what I found interesting in my research has been the move that the UN, or that Independent Expert Advisory Group on the data revolution, makes to authorize this vision of a world in which everyone is, essentially, surveilled. But the way they cast it is to talk about the right to be counted.

André Dao (00:27:29) - And so this is where, I think, there’s a good example of the use of human rights, at least conceptually, to authorize a move that otherwise we’d think of in terms of surveillance, and that certain kinds of people would then have a repertoire of resistance to. By using this move of saying that there is actually a right to be counted, that is to say that it’s already a human rights violation not to be counted by these processes, not to be seen or heard by Machine Listening or machine vision. I think that’s such an interesting move because, in a lot of the languages available to us for political activism, it’s quite hard to even see that that move has happened, first of all, and then to intelligibly articulate a resistance to this idea. They place the right to be counted as prior to every other human right: you can’t have your right to housing fulfilled unless you’ve first been counted as a person. And there’s this kind of conflation, a deliberate conflation really, between this idea of being counted, as in being measured and quantified through algorithmic processes, and counting as in mattering. And it’s in that slippage, I think, that it then authorizes a whole lot of stuff that we might otherwise have concerns about.

Joel Stern (00:29:30) - I would say it’s almost analogous to the right to vote in a kind of democracy: to be counted is to participate in the functioning of a political system, in the same way that voting might be; it’s almost a more passive way of having your say, let’s say. And I want to ask a couple of bigger, dumber questions, to provoke from you a provisional position on this program. I’m tempted to ask: do you think that this program, Pulse, or certain of its tools, simply should not exist, should be put to a stop, are likely to do more harm than good, et cetera? Are you fundamentally skeptical about the claims they make? And if so, what are the negative horizons: what are the most serious problems and concerns that we should be mindful of?

André Dao (00:31:08) - Hmm. So, am I fundamentally opposed to the use of these technologies in a human rights context, or by the UN?

Joel Stern (00:31:20) - Sure, however you want to interpret that. But perhaps narrowing the question to the UN and this specific program: as you’re researching it and looking at it and going into it more deeply, are you thinking this simply shouldn’t exist as a program, or that it should be fundamentally different to what it is? I guess I’m trying to get at whether what we’re saying is that these programs should be better, safer, more regulated, more stringently applied; or are we saying, no, this is a fundamental wrong turn? From the point of view of an activist or a political intervention into this, what’s the position that you think is going to be the productive one?

André Dao (00:32:19) - So maybe my way of grappling with that question is through something in the prep doc that you guys sent around: the question of why I use jurisdictional thinking in my PhD project. A very quick answer to that relates to what I find useful in jurisdictional thinking, which is its emphasis on a ‘who’ question, the question of who decides. In relation to your question, Joel, around these technologies, I’ve been wondering whether that who question really matters. And the reason I’m thinking of that is that I sometimes see artists and activists taking up some of these technologies to do quite interesting work, quite useful work, it seems. And so I’m wary of fully accepting the import of your question, because while I am quite skeptical of the use of these technologies by the UN, I’m not sure yet whether it’s a question of the technologies themselves. Does that make sense? I’ve been wondering what difference it makes who’s wielding the technologies, and for what purpose.

James Parker (00:34:06) - Could I follow up there? Because it seems to me that, in your writing, the question is not about the technology. You talk about big data as a concept that has been taken up by the UN, as a program that’s being taken up by the UN, not a technology that’s being taken up by the UN. And when you start to think that way about big data, as an orientation to the world, one which has a certain kind of visual rhetoric, and buys into a language of innovation, and can start to imagine something like the right to be counted, then

James Parker (00:34:48) - you’ve got a much more complex socio-technical, ideological beast than simply a technology which can be deployed or not deployed by this or that organization or person. So what if we were to reframe the question and ask: are you opposed to the way in which big data has been taken up by the UN, as opposed to asking, are you opposed to data analytics or machine learning per se?

André Dao (00:35:26) - Yeah, you’re right. What happens when the UN takes up big data as a program is this vision, the one that the Global Pulse’s chief data scientist talks about: that by 2030 we’ll know everything from everyone, so no one will be left behind. Insofar as that logic of knowing everything from everyone is carried along with this program and these technologies, I’m certainly opposed to the UN moving in that direction. In terms of the negative horizons, the things we should be worried about, and this ties back to something you asked a little earlier, James, around the risks: the risk the UN sees is very squarely around the privacy question. And that’s probably unsurprising to you guys; they say the only potential problems around, say, the radio content analysis tool, or any of the Pulse’s other projects, are around individual privacy.

André Dao (00:36:58) - Whereas I think there are much bigger, and very different, types of risks involved here. One of them we’ve alluded to before: the Global Pulse, and the taking up of big data by the UN, is proving to be a way into international institutions for big tech. And that seems to have an interesting effect on the kind of authority of the big tech companies themselves, as they work with the UN and do human rights work. I’ve seen the CEO of Microsoft quite explicitly, I think expressly, make this connection, where he talks up Microsoft’s human rights work and its partnerships with the UN, and then immediately segues into their facial recognition software and how they’re doing all the right things around privacy with facial recognition. What the audience is left with is this initial impression that Microsoft is working with the UN, that it’s doing this good human rights work. I think that already does some authorizing work for the facial recognition software, for example.

James Parker (00:38:35) - Particularly if the programs aren’t even being enacted, so that the doing of good human rights work remains entirely at the level of speculation. It’s enough for it never to be enacted, so long as the collaboration is there. I mean, it seems to me that to invent such a thing as the right to be counted, even if it’s very early days, is doing a lot of work for big tech, potentially down the road as well: a complete reconfiguration of what it means to be in the world, to be a subject of development. That’s a little bit far away, because it’s not exactly a massive rights document that’s been fully elaborated, but even to seed that idea is... hmm.

André Dao (00:39:24) - Well, one of the other interesting images from the same report that introduces this right to be counted, and that also talks about the world that we want being a world where everyone counts, is this image of two worlds. They talk about a world where we have great access to the internet and

André Dao (00:39:52) - great access to these new technologies, and another world where people don’t have that. And I think that framing immediately puts us into that familiar developmental timeline, from the primitive to the advanced. What it does is create a single image of the future that we’re all striving towards, which is the one where these technologies are fully rolled out and this program of knowing everything from everyone is fully realized. It presents the question only as: how do we help Africa and Asia and the developing world catch up to that world where we know everything about everyone? And it obviously obscures the question of whether that world is one we want to get to in the first place. It presents the gap as a question of justice and equity. And so, if you’re concerned about questions of social justice, you focus on the gap, and you miss the fact that what we’re purportedly aiming towards has potentially huge negative ramifications.

Joel Stern (00:41:11) - I mean, it begs the question... I’m just thinking about a statement like ‘everybody counts’ from the UN, in relation to a statement like ‘Black Lives Matter’, for instance; they’re somehow notionally doing similar kinds of work, in trying to create a politics of who is counted. And I’m wondering about the UN trying to roll out a program like the radio analysis tool in the United States, or in Australia, or in the UK, and how different it would be, programmatically and politically, as it intersected with all of the infrastructures that exist in those countries. How much does the relative political power of, for instance, the United States and the UN, in relation to Uganda, structure a program like this? If we imagine this tool being rolled out in Australia across hundreds of radio stations... it’s very speculative, but I’m imagining there would be a number of...

Joel Stern (00:42:30) - Well, that also goes back to the point you were making about the NSA, and the fact that these programs essentially already exist in other departmental and political spaces. And so they are then taken up by the UN,

Joel Stern (00:42:51) - and perhaps in a slightly more transparent way; more transparent than the NSA, imaginably.

André Dao (00:42:58) - There is this kind of service-delivery aspect to it, I think, which is interesting: this positioning of the UN as delivering services to customers. There’s actually a slip of the tongue at the launch of the Global Pulse; it’s not the Secretary-General but, I think, an Under-Secretary, who refers to, essentially, the global poor as the UN’s customers, saying that the Pulse is going to help us know more about our customers. Perhaps that’s because he’s following up from, or introducing, these execs from Amazon and Apple, but that’s already a shift in how the UN presents its work. And it fits within a much more benign understanding of what surveillance is, if it’s badged as consumer feedback. Even though, ultimately, what some of this data could and probably will authorize is international intervention, military intervention. That’s certainly where this idea of early warning systems around hate speech points: what can that be pointing towards, except a situation in which Machine Listening picks up enough data on, say, future hate crimes that it authorizes military intervention?

James Parker (00:44:44) - I mean, it can’t be a coincidence that they’ve gone for radio first. After the Rwandan genocide, the story that gets told is that radio was one of the prime drivers, and so much of the discourse afterwards was either ‘we should have intervened early’, we being the US and Europe primarily, or ‘we should have jammed the radio stations at the very least’. It just must be that it was conceived with that in mind. Are they explicit about that?

André Dao (00:45:13) - They don’t make that connection explicitly, but yeah, it’s definitely floating around in there, I think.

Joel Stern (00:45:22) - Mm. They give lots of examples to do with environmental disasters, like floods, and the ways in which responses to those might be accelerated. But I suppose there, again, we’re in territory where what happens in the wake of an environmental disaster is not always positive development, but also exploitation of that disaster, especially in relation to multinational companies. I just wanted to return to this concept of a world that counts, and everyone counting, and to make a really dumb connection to the census, a kind of census within a country, which is often rolled out with exactly the same justification, and sometimes with the same branding as well.

Sean Dockray (00:46:30) - You know, ‘we want everyone to count’. And certainly in the US there’s a lot of political struggle around the census, and we do want to count everyone, particularly undocumented workers, because that’s how political power is apportioned; that’s how tax dollars and funding flow. So I can see the UN deploying that kind of rhetoric and justification. But I guess the big difference, and what I’m asking you to do here is a little unfair, because I’m naive, so explain the UN to me: what gets apportioned in the UN’s vision of a world in which everyone counts through big data? In a way, it seems like part of the problem is that all of this is collected, but what mechanisms are there to take it up and use it in a positive way? And maybe that’s related to the observation that a lot of these pilot programs aren’t actually taken up. Maybe the question is: who, or what areas within the UN, have the capacity to make use of these technologies in a useful way?

André Dao (00:47:53) - Hmm. I mean, I think that’s why the slip of the tongue, where the UN official refers to the global poor as the UN’s customers, is a revealing one. Not just because it leads to that question around corporate power and so on, but also, I think, simply because it’s not correct: the UN is made up of its constituent member states. And this is one of the really fascinating aspects of everything that the Pulse does. It tries, I think, to reframe, or it’s part of a larger, ongoing process within the UN of reframing, its constituency as the peoples of the world, such that it derives some kind of direct authority to make better the lives of all the peoples of the world.

André Dao (00:48:57) - And I think that’s a really important part of the function that these technologies, or this program, play for the UN: to be able to position itself as being both capable of, and authorized to, intervene positively for people, as opposed to having to work through nation states. So there’s a really fascinating thing going on there around competency and the perception of competency. And that’s why the lack of uptake of these technologies and projects is so interesting, because it seems that presenting data is as good as

André Dao (00:49:54) - having the actual capacity to change things. And maybe an interesting image that captures some of this comes from the documentation around the radio content analysis tool. It’s illustrated by these really strange images of Ugandan women standing in empty fields, holding radios up against their faces.

James Parker (00:50:25) - I’m on the UN Global Pulse website right now, and it’s the landing image, right? You land there, it says ‘big data and artificial intelligence, sustainable... humanitarian...’ yada yada. And then on the left-hand side there’s a woman with a portable radio held up to her face, in a field.

André Dao (00:50:45) - And it’s a strange image, right? Because I think your first impression is that it fits perfectly well with this idea of participation, of the UN creating some kind of platform for the peoples of the world to speak directly to the UN, which is what makes the UN competent to intervene on their behalf. But of course she’s holding a radio, not a phone. So already in the image there’s something odd going on in their own marketing materials about supposedly showing this participation. That’s the image I’m thinking of there; it’s an image that I keep returning to in my writing on the Pulse.

Joel Stern (00:51:48) - It sort of speaks to Machine Listening in such a weird way, because it’s so far from a kind of computational, algorithmic, data-gathering image; it’s so much about community, and a very old-fashioned media device. And there are also some really weird displacements going on, right? Who’s the person doing the listening in the image? A Ugandan woman. Who are the people doing the listening in the UN Global Pulse? Well, no one; or, if anyone, a machine that’s been programmed by, you know, ultimately all roads lead back to New York. So there’s a lot of political work in that displacement, the kind of attempt to convince people that the listening is being done. Presumably the idea is that it’s being done on behalf of the Ugandan woman: ‘we might as well be listening for you’, a kind of benevolent surveillance. That’s right.

Joel Stern (00:52:53) - Because she’s listening to talkback radio, and therefore so might we, in order to understand on a macro level what she is understanding with the one radio. I don’t know. But she can’t understand it from the one radio, because she’s not listening to all of it all the time. So it’s enhanced, it’s better; the UN can do much better like that than she ever could. It’s so strange. It seems patronizing to me as well, quite frankly.

André Dao (00:53:29) - There’s, I think, a blog post in which the idea of the right to be counted gets introduced by two of the experts on that advisory group I mentioned before. In it they ask the question ‘can data give voice?’, which I think neatly intersects with the image, but also with the entire program, and which they, I think clearly, answer in the affirmative: here we have data giving voice, even where voice is impossible, because she’s holding the wrong device. And recently I was rereading Gayatri Spivak’s essay ‘Can the Subaltern Speak?’, and the image, and that question about data giving voice, remind me of the conflation between two types of representation that Spivak talks about in that essay: between representation as something being a symbol of something, being depicted, re-presented in that sense; and representation as the idea of speaking for, in the way that we’re represented by parliamentarians.

André Dao (00:54:58) - And those two things get run together, particularly in the English language; I think she says that in German there are actually two different terms for those two meanings of representation. But there is something happening here, I think, with the Pulse, where the ability to represent someone through data, or the ability to represent many someones through data, seems to then also immediately carry with it the authorization to represent someone as in speak for them, act on their behalf.

Joel Stern (00:55:39) - I’m wondering, because I think we’re going to need to finish up in a few minutes, but where we’ve arrived at is really amazing, and it could be interesting to have you depart from this image and present an analysis of the work that this image is doing, in its representation of listening, against the form of listening that is actually happening with the radio analysis tool; and maybe those two questions, ‘can data give voice?’ against Spivak’s question, ‘can the subaltern speak?’. It could be a really nice way of framing this image. I mean, André has given many presentations in which that’s more or less exactly what happens.

James Parker (00:56:39) - I’m sorry. Yeah. Okay. But I think it’s a great idea. I mean, I don’t know if we’re rounding out the conversation, but there remains the question of the live context at Unsound, and I think with the Unsound context the connection has to be strongly to forms of listening, rather than to, let’s say, the politics of big data more generally. So in that context it would sort of be to depart from an image of listening. Anyway, should we end the formal bit of the interview? The only other things I was going to ask about were COVID, and the interception of distress signals from boats carrying migrants and refugees. It seems like those are part of this expansion: everybody’s responding to COVID, so the UN got its big data COVID thing, techno-solutionism as a way of responding to COVID, and the mounting refugee crisis is just another one. So I just wondered if you had any particular thoughts on either of those projects?

André Dao (00:58:01) - Hmm. Yes. I haven’t been following the COVID part of the Pulse particularly closely, except, I guess, that it shows the logic of the Pulse as this kind of innovation lab. There’s this language of matching up innovators with ‘problem owners’: they’re the innovators, and they’re looking for the UN’s problem owners. So when there’s a refugee crisis, it’s the UNHCR who are the problem owners; they own the problem of the refugee crisis. Which suggests that, by their very constitution, they have to move from crisis to crisis. And so when COVID comes, it’s inevitable that this innovation initiative has to show the worth of its work and its program in relation to whatever the latest crisis is.

Joel Stern (00:59:21) - But it’s so striking the way that they take up that language, and that that move exactly mirrors everything that every tech startup and major player, you know, Google, Amazon, whatever, did. Suddenly tech presents itself as the solution to the problem, and it’s a further opportunity for expansion.

André Dao (00:59:44) - Actually, the one thing that I didn’t mention, which is probably, if there’s sort of an end point,

André Dao (00:59:52) - or a hunch that I’m going on with in my PhD project: it’s in relation to the one Pulse project that seems to underpin all of these other, more prototype-style projects, which is their push for a kind of global data commons, and their push for what they call ‘data philanthropy’. Essentially, it seems that they present these tools, like the radio content analysis tool, particularly to large corporations that hold a lot of data on people, and say: this is the work that the UN can do with data; if you see the value and the worthwhileness in that work, you should donate the data that you hold on your customers, or whatever data you hold, to the UN, so that we can start to turn these tools, whatever tools we have for analyzing data, onto your data sets. And the vision there is of no longer having data in any silos, any corporate or national silos, but in a big global data commons. So that seems to be one of the primary purposes of these individual projects: to convince the holders of data to share it in this big happy kind of world.

James Parker (01:01:32) - A true data commons, or a data commons that is possessed by the UN? Right.

André Dao (01:01:36) - My understanding is that it would be open to all that wished to do good with AI.

James Parker (01:01:43) - Well, that’s not going to happen, is it? I mean, why would Google give it up? Yeah.

Joel Stern (01:01:49) - I mean, that’s it, because if Google and Facebook handed over their data to an open commons, then they would all just grab each other’s data from one another. But it would be good. And then we could just rebuild a not-for-profit Facebook out of... I mean, ‘good’ might be a strong word to use; it’d be better than what we currently have, possibly.

André Dao (01:02:17) - I hadn’t thought about it from that perspective, from the perspective of your Googles and why they exist. I’d mostly been thinking about it from the Pulse’s perspective, but that’s an interesting point.