0 00:00:00,000 --> 00:00:30,000 Dear viewer, these subtitles were generated by a machine via the service Trint and therefore are (very) buggy. If you are capable, please help us to create good quality subtitles: https://c3subtitles.de/talk/585 Thanks! 1 00:00:10,100 --> 00:00:12,199 He hails from Canada, from the 2 00:00:12,200 --> 00:00:14,509 Montreal Institute for Genocide and Human 3 00:00:14,510 --> 00:00:17,359 Rights Studies at Concordia University, 4 00:00:17,360 --> 00:00:19,519 and he helped build up the Digital 5 00:00:19,520 --> 00:00:22,069 Mass Atrocity Prevention Lab. 6 00:00:22,070 --> 00:00:23,450 Please welcome Nikolai. 7 00:00:29,900 --> 00:00:31,519 Hello, everyone. Thank you very much for 8 00:00:31,520 --> 00:00:33,229 that kind introduction. 9 00:00:33,230 --> 00:00:35,579 Thank you all for coming to the 10 00:00:35,580 --> 00:00:37,549 last talk tonight. 11 00:00:37,550 --> 00:00:39,289 And thank you to the CCC 12 00:00:39,290 --> 00:00:41,419 for inviting us. It's 13 00:00:41,420 --> 00:00:42,949 a great honor for us to be here. 14 00:00:42,950 --> 00:00:45,289 Just a little bit about 15 00:00:45,290 --> 00:00:46,309 myself, a little bit of background: 16 00:00:46,310 --> 00:00:48,379 I'm a German-Swiss citizen, but 17 00:00:48,380 --> 00:00:50,359 I now work in Montreal for the Montreal 18 00:00:50,360 --> 00:00:52,489 Institute for Genocide and Human Rights Studies. 19 00:00:52,490 --> 00:00:54,559 And since the beginning of 20 00:00:54,560 --> 00:00:56,959 the year, or for me since 21 00:00:56,960 --> 00:00:59,079 April, I helped build up 22 00:00:59,080 --> 00:01:00,229 what we call the Digital Mass 23 00:01:00,230 --> 00:01:02,209 Atrocity Prevention Lab. 24 00:01:02,210 --> 00:01:04,098 And I mean, everyone has a lab these 25 00:01:04,099 --> 00:01:06,199 days.
And what we want to 26 00:01:06,200 --> 00:01:08,269 do with it is try to 27 00:01:08,270 --> 00:01:10,339 reach out into different communities. 28 00:01:10,340 --> 00:01:11,989 We do a lot of policy work, 29 00:01:11,990 --> 00:01:14,239 but obviously, since 30 00:01:14,240 --> 00:01:16,519 we have a hacking 31 00:01:16,520 --> 00:01:18,709 audience tonight, we also want to 32 00:01:18,710 --> 00:01:20,299 be involved in tech 33 00:01:20,300 --> 00:01:22,609 communities and hacking communities. 34 00:01:22,610 --> 00:01:24,799 So when we saw 35 00:01:24,800 --> 00:01:26,899 the keynote and 36 00:01:26,900 --> 00:01:29,119 the introduction 37 00:01:29,120 --> 00:01:31,189 to this Congress, 38 00:01:31,190 --> 00:01:33,319 my colleagues and I 39 00:01:33,320 --> 00:01:36,109 were really pleased to see that there is 40 00:01:36,110 --> 00:01:38,179 a wish from the 41 00:01:38,180 --> 00:01:40,159 community to open up and 42 00:01:40,160 --> 00:01:41,419 let in people who are not necessarily 43 00:01:41,420 --> 00:01:42,469 hackers. My background is 44 00:01:42,470 --> 00:01:44,389 international relations, not 45 00:01:44,390 --> 00:01:45,859 necessarily deep tech. 46 00:01:45,860 --> 00:01:47,149 But I'm interested in it, 47 00:01:47,150 --> 00:01:48,859 and I have followed what the CCC has been doing 48 00:01:48,860 --> 00:01:50,329 for the last couple of years. 49 00:01:50,330 --> 00:01:52,549 Especially one talk, 50 00:01:52,550 --> 00:01:54,709 at the camp this year by 51 00:01:54,710 --> 00:01:57,049 Claudio, about helping the helpless, 52 00:01:57,050 --> 00:01:58,050 where he basically
53 00:01:59,320 --> 00:02:01,089 told the infosec community to reach out 54 00:02:01,090 --> 00:02:02,499 to human rights groups and help them 55 00:02:02,500 --> 00:02:04,089 where they can actually be 56 00:02:04,090 --> 00:02:06,249 helpful. That 57 00:02:06,250 --> 00:02:08,019 resonated with us, and we come from the 58 00:02:08,020 --> 00:02:10,149 other side: in Montreal 59 00:02:10,150 --> 00:02:12,519 we are a fairly small team. 60 00:02:12,520 --> 00:02:14,589 So what 61 00:02:14,590 --> 00:02:15,729 I'm going to talk about in the next 62 00:02:15,730 --> 00:02:17,829 20 minutes or so 63 00:02:17,830 --> 00:02:20,529 is a little bit of what we did 64 00:02:20,530 --> 00:02:22,629 over the last seven, eight, 65 00:02:22,630 --> 00:02:23,630 nine months. 66 00:02:24,610 --> 00:02:26,169 It's going to be pretty much in 67 00:02:26,170 --> 00:02:27,279 chronological order. 68 00:02:27,280 --> 00:02:28,419 We're going to talk a little bit about 69 00:02:28,420 --> 00:02:29,829 what the UN is doing in peacekeeping 70 00:02:29,830 --> 00:02:30,830 and peace tech. 71 00:02:31,610 --> 00:02:33,879 We did a big research project 72 00:02:33,880 --> 00:02:34,869 using Media Cloud. 73 00:02:34,870 --> 00:02:36,909 That's a tool by the Berkman Center and 74 00:02:36,910 --> 00:02:39,189 the Center for Civic Media. 75 00:02:39,190 --> 00:02:41,469 We talked 76 00:02:41,470 --> 00:02:43,449 to a lot of 77 00:02:43,450 --> 00:02:45,429 technologists and hackers at various 78 00:02:45,430 --> 00:02:46,929 hackathons, and we want to share 79 00:02:46,930 --> 00:02:47,869 some lessons learned.
80 00:02:47,870 --> 00:02:50,209 And 81 00:02:50,210 --> 00:02:51,669 after this I just want to go through 82 00:02:51,670 --> 00:02:53,319 a couple of projects which 83 00:02:53,320 --> 00:02:55,689 we found very 84 00:02:55,690 --> 00:02:57,069 interesting, just to see what 85 00:02:57,070 --> 00:02:58,929 you could learn from them, and 86 00:02:58,930 --> 00:03:00,999 then open the dialog between 87 00:03:01,000 --> 00:03:02,379 different fields, because obviously I'm 88 00:03:02,380 --> 00:03:04,449 coming from a fairly different field 89 00:03:04,450 --> 00:03:05,949 than many of you. 90 00:03:05,950 --> 00:03:08,169 So, 91 00:03:08,170 --> 00:03:09,999 one thing we do at the Montreal Institute 92 00:03:10,000 --> 00:03:11,469 for Genocide and Human Rights Studies: we have a 93 00:03:11,470 --> 00:03:13,629 workshop on mass 94 00:03:13,630 --> 00:03:15,729 atrocity prevention. It's mainly for 95 00:03:15,730 --> 00:03:18,069 policymakers and people in the peacekeeping 96 00:03:18,070 --> 00:03:20,719 and peacebuilding community, 97 00:03:20,720 --> 00:03:22,419 a lot of people with a non-technical 98 00:03:22,420 --> 00:03:24,609 background. And what we have tried 99 00:03:24,610 --> 00:03:26,289 to do over the last couple of years is 100 00:03:26,290 --> 00:03:28,449 to get more tech content into the 101 00:03:28,450 --> 00:03:30,509 workshops. You see here 102 00:03:30,510 --> 00:03:32,109 in this picture the workshop 103 00:03:32,110 --> 00:03:33,069 in the middle.
104 00:03:33,070 --> 00:03:34,659 We have Walter Dorn, who is a 105 00:03:34,660 --> 00:03:36,969 professor in Kingston, Ontario, 106 00:03:36,970 --> 00:03:39,549 in Canada, and he looks 107 00:03:39,550 --> 00:03:41,799 at how the UN is using 108 00:03:41,800 --> 00:03:43,569 technology, or could potentially be using 109 00:03:43,570 --> 00:03:45,849 technology, to 110 00:03:45,850 --> 00:03:47,979 monitor peace treaties, or 111 00:03:47,980 --> 00:03:50,109 just looks at new 112 00:03:50,110 --> 00:03:51,549 ways, because, I mean, obviously, it's 113 00:03:51,550 --> 00:03:53,799 2015 and the UN is finally 114 00:03:53,800 --> 00:03:56,109 catching up on that. 115 00:03:56,110 --> 00:03:58,269 For those 116 00:03:58,270 --> 00:04:00,159 who are interested: earlier 117 00:04:00,160 --> 00:04:02,129 this year the UN actually 118 00:04:02,130 --> 00:04:03,969 came out with a final report on how they 119 00:04:03,970 --> 00:04:06,309 can use new technologies 120 00:04:06,310 --> 00:04:07,749 in peacekeeping, 121 00:04:07,750 --> 00:04:09,899 and Walter was part 122 00:04:09,900 --> 00:04:11,649 of that expert panel. 123 00:04:11,650 --> 00:04:13,779 So there 124 00:04:13,780 --> 00:04:14,799 are things happening. 125 00:04:14,800 --> 00:04:17,229 I mean, the UN is a slow ship and 126 00:04:17,230 --> 00:04:19,509 sometimes very hard to 127 00:04:19,510 --> 00:04:21,069 maneuver. 128 00:04:21,070 --> 00:04:23,289 So it's interesting to see that even 129 00:04:23,290 --> 00:04:25,239 there, people are thinking about 130 00:04:25,240 --> 00:04:27,939 technology, although, 131 00:04:27,940 --> 00:04:30,009 admittedly, it is more 132 00:04:30,010 --> 00:04:31,419 on the military side of things.
133 00:04:31,420 --> 00:04:33,709 They did think about the new 134 00:04:33,710 --> 00:04:36,699 peacekeeping 135 00:04:36,700 --> 00:04:38,799 force and how 136 00:04:38,800 --> 00:04:40,419 to get it equipped. 137 00:04:40,420 --> 00:04:42,519 But what 138 00:04:42,520 --> 00:04:45,009 we have also been involved with is 139 00:04:45,010 --> 00:04:46,479 a growing field of what they call 140 00:04:46,480 --> 00:04:48,639 peace tech, 141 00:04:48,640 --> 00:04:49,779 which is more on the civil 142 00:04:49,780 --> 00:04:51,789 side. And I believe a lot 143 00:04:51,790 --> 00:04:54,279 of people with good intentions are 144 00:04:54,280 --> 00:04:56,379 part of that. It seems like 145 00:04:56,380 --> 00:04:57,489 there are a couple of conferences out 146 00:04:57,490 --> 00:04:59,679 there; I guess 147 00:04:59,680 --> 00:05:01,779 they started in 2010, 2011. 148 00:05:01,780 --> 00:05:03,369 And what they want to do is 149 00:05:03,370 --> 00:05:05,769 try to get people from 150 00:05:05,770 --> 00:05:07,269 the software development 151 00:05:07,270 --> 00:05:08,979 community together with the peacebuilding 152 00:05:08,980 --> 00:05:11,169 community. So it's about getting 153 00:05:11,170 --> 00:05:13,269 the barriers down between 154 00:05:13,270 --> 00:05:14,270 different fields. 155 00:05:15,200 --> 00:05:17,299 Because that's also what we saw over 156 00:05:17,300 --> 00:05:19,609 pretty much the last year in 157 00:05:19,610 --> 00:05:21,619 the different events we participated in: 158 00:05:21,620 --> 00:05:23,719 it is difficult for those fields to talk 159 00:05:23,720 --> 00:05:25,369 to each other, with different languages, 160 00:05:25,370 --> 00:05:27,079 different backgrounds and different 161 00:05:27,080 --> 00:05:28,129 cultures.
162 00:05:28,130 --> 00:05:30,349 And we 163 00:05:30,350 --> 00:05:32,659 see a benefit in 164 00:05:32,660 --> 00:05:34,829 getting those barriers down, 165 00:05:34,830 --> 00:05:37,399 bit by bit. 166 00:05:37,400 --> 00:05:39,469 So that would be one part: just 167 00:05:39,470 --> 00:05:40,619 informing you about what's out there. 168 00:05:40,620 --> 00:05:41,899 There's way more stuff out there; 169 00:05:42,980 --> 00:05:45,139 this is 170 00:05:45,140 --> 00:05:47,209 just a brief overview. 171 00:05:47,210 --> 00:05:49,369 The other thing we did: 172 00:05:49,370 --> 00:05:51,559 at the Montreal Institute, we 173 00:05:51,560 --> 00:05:53,059 have a media monitoring project where 174 00:05:53,060 --> 00:05:55,729 basically we do qualitative research on 175 00:05:55,730 --> 00:05:57,889 countries which might be 176 00:05:57,890 --> 00:05:59,959 at risk of mass atrocity 177 00:05:59,960 --> 00:06:02,089 crimes. There are other NGOs 178 00:06:02,090 --> 00:06:03,659 out there; the Crisis Group, for example, 179 00:06:03,660 --> 00:06:05,809 has a very prominent project as 180 00:06:05,810 --> 00:06:06,739 well. 181 00:06:06,740 --> 00:06:08,659 But what we find is that it's very 182 00:06:08,660 --> 00:06:11,119 difficult: we basically 183 00:06:11,120 --> 00:06:13,309 rely on interns who look 184 00:06:13,310 --> 00:06:14,809 at the media landscapes in those 185 00:06:14,810 --> 00:06:17,059 countries, look at international 186 00:06:17,060 --> 00:06:18,379 media and then see what's actually 187 00:06:18,380 --> 00:06:20,209 happening. But it's very qualitative 188 00:06:20,210 --> 00:06:21,859 work. So we were thrilled when we saw 189 00:06:21,860 --> 00:06:24,199 that there is a way to actually 190 00:06:24,200 --> 00:06:26,779 use more quantitative methods.
191 00:06:26,780 --> 00:06:29,449 There was a 192 00:06:29,450 --> 00:06:31,219 call for proposals, which we 193 00:06:31,220 --> 00:06:32,569 followed: we wrote a proposal 194 00:06:32,570 --> 00:06:34,639 and got invited 195 00:06:34,640 --> 00:06:37,459 to do research with the Media Lab. 196 00:06:37,460 --> 00:06:39,949 What we did there was use 197 00:06:39,950 --> 00:06:42,319 the Media Cloud 198 00:06:42,320 --> 00:06:43,879 framework, which has a couple of 199 00:06:43,880 --> 00:06:45,889 tools. We mainly used the Media Cloud 200 00:06:45,890 --> 00:06:48,319 dashboard, which pretty much is 201 00:06:48,320 --> 00:06:50,419 a dashboard with which you can 202 00:06:50,420 --> 00:06:52,519 query a database, in 203 00:06:52,520 --> 00:06:54,649 this case in the 204 00:06:54,650 --> 00:06:56,809 sense that we created a database about 205 00:06:56,810 --> 00:06:58,369 US mainstream media, 206 00:06:58,370 --> 00:07:00,829 about the twenty-five biggest 207 00:07:00,830 --> 00:07:02,719 media sources in the US. 208 00:07:02,720 --> 00:07:04,859 Basically, you can 209 00:07:04,860 --> 00:07:06,769 query for 210 00:07:06,770 --> 00:07:08,869 different subjects, 211 00:07:08,870 --> 00:07:10,759 and then it gives you a timeline 212 00:07:10,760 --> 00:07:12,829 where you see, basically, 213 00:07:12,830 --> 00:07:14,869 sentences per day over the 214 00:07:14,870 --> 00:07:16,969 time span you have 215 00:07:16,970 --> 00:07:18,469 looked at, and you can get a word 216 00:07:18,470 --> 00:07:18,839 cloud. 217 00:07:18,840 --> 00:07:20,899 It's very good 218 00:07:20,900 --> 00:07:23,079 for getting a very brief overview 219 00:07:23,080 --> 00:07:24,289 of a topic. 220 00:07:24,290 --> 00:07:25,369 And what did we do?
221 00:07:25,370 --> 00:07:27,379 We were really interested in a lot of 222 00:07:27,380 --> 00:07:29,509 countries we did research on, 223 00:07:29,510 --> 00:07:31,129 mainly in sub-Saharan Africa. 224 00:07:31,130 --> 00:07:33,709 So we wanted to know how 225 00:07:33,710 --> 00:07:35,509 sub-Saharan Africa is presented in the US 226 00:07:35,510 --> 00:07:37,369 mainstream media, because 227 00:07:37,370 --> 00:07:38,749 there are 228 00:07:38,750 --> 00:07:40,249 a couple of topics which 229 00:07:40,250 --> 00:07:42,499 always come up, but many 230 00:07:42,500 --> 00:07:44,119 other good things which happen in 231 00:07:44,120 --> 00:07:45,889 sub-Saharan African countries never 232 00:07:45,890 --> 00:07:46,879 show up. 233 00:07:46,880 --> 00:07:48,979 So 234 00:07:48,980 --> 00:07:50,509 here we have a couple of 235 00:07:50,510 --> 00:07:53,119 countries, like Cameroon, Niger, 236 00:07:53,120 --> 00:07:55,489 South Sudan, Uganda, 237 00:07:55,490 --> 00:07:57,829 all plotted next to each other. 238 00:07:57,830 --> 00:07:59,929 And what we found is that, compared to 239 00:07:59,930 --> 00:08:02,389 other topics, there's not much coverage 240 00:08:02,390 --> 00:08:04,819 at all. And 241 00:08:04,820 --> 00:08:06,859 if there is, if there's actually 242 00:08:06,860 --> 00:08:08,719 interest, as you see here, it's when 243 00:08:08,720 --> 00:08:10,189 there's a World Cup. So it takes an 244 00:08:10,190 --> 00:08:12,619 event in the West 245 00:08:12,620 --> 00:08:14,779 to actually spark interest 246 00:08:14,780 --> 00:08:16,189 and media attention. 247 00:08:16,190 --> 00:08:17,190 Now, 248 00:08:18,500 --> 00:08:19,789 obviously, we don't really have too 249 00:08:19,790 --> 00:08:20,989 much to compare it to.
250 00:08:20,990 --> 00:08:22,909 So we figured that we needed 251 00:08:22,910 --> 00:08:24,769 something which is very prominent in US 252 00:08:24,770 --> 00:08:26,929 mainstream media, and that is Kim 253 00:08:26,930 --> 00:08:29,119 Kardashian. I'm not sure if you know who 254 00:08:29,120 --> 00:08:31,759 she is; she's a media celebrity. 255 00:08:31,760 --> 00:08:34,038 But when you plot her against them, 256 00:08:34,039 --> 00:08:36,139 the media 257 00:08:36,140 --> 00:08:37,939 attention she gets is way more than 258 00:08:37,940 --> 00:08:40,129 what all those nine 259 00:08:40,130 --> 00:08:42,558 sub-Saharan African countries get 260 00:08:42,559 --> 00:08:43,548 combined. 261 00:08:43,549 --> 00:08:45,739 That's actually nothing 262 00:08:45,740 --> 00:08:46,099 new; 263 00:08:46,100 --> 00:08:48,169 that kind of research has 264 00:08:48,170 --> 00:08:49,759 existed since the sixties, but now we 265 00:08:49,760 --> 00:08:52,099 can actually show it 266 00:08:52,100 --> 00:08:53,390 with a lot of data behind it. 267 00:08:54,890 --> 00:08:57,049 Then we also went 268 00:08:57,050 --> 00:08:59,389 into a more qualitative 269 00:08:59,390 --> 00:09:01,309 sort of research, where we looked at 50 270 00:09:01,310 --> 00:09:03,379 articles each, and what we 271 00:09:03,380 --> 00:09:05,809 saw was: basically we have 272 00:09:05,810 --> 00:09:08,449 about 340 stories 273 00:09:08,450 --> 00:09:09,649 which are valid. 274 00:09:09,650 --> 00:09:12,169 And when you look at what the frame 275 00:09:12,170 --> 00:09:14,299 of the stories is, it 276 00:09:14,300 --> 00:09:16,669 was terror-related, Ebola-related 277 00:09:16,670 --> 00:09:18,379 or soccer-related.
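[Editor's note: the comparison described above, total sentence counts for one query versus several queries combined, can be sketched as a few lines of code. The per-day numbers below are toy values, not real Media Cloud results, and combine_timelines and total_sentences are hypothetical helper names rather than Media Cloud API calls.]

```python
from collections import defaultdict

def combine_timelines(timelines):
    """Sum several per-day sentence-count timelines into one combined timeline."""
    combined = defaultdict(int)
    for timeline in timelines:
        for day, count in timeline.items():
            combined[day] += count
    return dict(combined)

def total_sentences(timeline):
    """Total sentences over the whole period queried."""
    return sum(timeline.values())

# Toy per-day counts, standing in for dashboard query results:
kardashian = {"2014-06-12": 40, "2014-06-13": 55}
cameroon   = {"2014-06-12": 2,  "2014-06-13": 3}
niger      = {"2014-06-12": 1,  "2014-06-13": 0}

countries = combine_timelines([cameroon, niger])
print(total_sentences(kardashian))  # 95
print(total_sentences(countries))   # 6
```

With real data, each dictionary would come from one dashboard query, and the combined total for the nine countries would be plotted against the single-celebrity query.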
Those are pretty 278 00:09:18,380 --> 00:09:20,899 much the main 279 00:09:20,900 --> 00:09:23,299 topics that mainstream 280 00:09:23,300 --> 00:09:25,369 media reporting on Africa covers. 281 00:09:25,370 --> 00:09:27,499 So a lot of 282 00:09:27,500 --> 00:09:29,719 the good things which are happening 283 00:09:29,720 --> 00:09:30,979 in those countries are not getting 284 00:09:30,980 --> 00:09:32,089 reported; 285 00:09:32,090 --> 00:09:33,500 they're just not there. 286 00:09:35,390 --> 00:09:36,390 So, 287 00:09:36,940 --> 00:09:39,549 at that stage 288 00:09:39,550 --> 00:09:40,929 we were looking at the supply side of 289 00:09:40,930 --> 00:09:42,369 things: what the media supplies 290 00:09:42,370 --> 00:09:44,529 and what people can read. 291 00:09:44,530 --> 00:09:47,189 At the same time, 292 00:09:47,190 --> 00:09:48,729 when we did the 293 00:09:48,730 --> 00:09:50,349 research, there was an article 294 00:09:50,350 --> 00:09:52,659 by Chris Norton on Medium 295 00:09:52,660 --> 00:09:55,119 where she basically says 296 00:09:55,120 --> 00:09:57,279 that it's also 297 00:09:57,280 --> 00:09:58,209 on the demand side: 298 00:09:58,210 --> 00:09:59,709 a lot of people actually don't really 299 00:09:59,710 --> 00:10:02,319 want to read anything else. 300 00:10:02,320 --> 00:10:04,569 There's a very strong 301 00:10:04,570 --> 00:10:06,489 quote there where she says: you 302 00:10:06,490 --> 00:10:08,099 people never click the fucking link. 303 00:10:08,100 --> 00:10:11,139 So 304 00:10:11,140 --> 00:10:13,449 when we think about mass 305 00:10:13,450 --> 00:10:15,309 atrocity prevention, or prevention of 306 00:10:15,310 --> 00:10:16,970 atrocities, 307 00:10:18,150 --> 00:10:20,829 we believe it's important to keep that 308 00:10:20,830 --> 00:10:21,830 in mind.
309 00:10:22,950 --> 00:10:25,359 The way we see a lot of countries, 310 00:10:25,360 --> 00:10:26,649 African countries especially: 311 00:10:28,440 --> 00:10:30,299 we need another 312 00:10:30,300 --> 00:10:32,519 frame; we don't need only the frame 313 00:10:32,520 --> 00:10:34,919 we have, in which they 314 00:10:34,920 --> 00:10:37,259 normally get portrayed, the Ebola 315 00:10:37,260 --> 00:10:39,029 and terror 316 00:10:39,030 --> 00:10:40,030 frame. 317 00:10:41,920 --> 00:10:44,049 Then, one thing 318 00:10:44,050 --> 00:10:45,639 that happened when we 319 00:10:45,640 --> 00:10:47,829 talked about the research, 320 00:10:47,830 --> 00:10:50,289 when we 321 00:10:50,290 --> 00:10:52,239 wrote the proposal right at 322 00:10:52,240 --> 00:10:54,369 the beginning: we had 323 00:10:54,370 --> 00:10:56,319 a little mishap. I'm really 324 00:10:56,320 --> 00:10:58,209 interested in the philosophy track 325 00:10:58,210 --> 00:11:00,639 we have here at the 326 00:11:00,640 --> 00:11:03,219 Congress, and so I want to share 327 00:11:03,220 --> 00:11:04,719 a little anecdote. 328 00:11:04,720 --> 00:11:06,939 So basically what happened: 329 00:11:06,940 --> 00:11:09,309 somebody sent around the proposal 330 00:11:09,310 --> 00:11:11,379 and I 331 00:11:11,380 --> 00:11:12,999 just saw it on my computer. 332 00:11:13,000 --> 00:11:15,069 And then I went home, to my 333 00:11:15,070 --> 00:11:16,389 laptop. 334 00:11:17,440 --> 00:11:18,440 So. 335 00:11:21,800 --> 00:11:23,929 I went home and wrote an email, 336 00:11:23,930 --> 00:11:25,549 trying to figure out: hey, 337 00:11:25,550 --> 00:11:28,189 guys, what is that research about, et 338 00:11:28,190 --> 00:11:30,319 cetera; I just had a couple of 339 00:11:30,320 --> 00:11:32,599 questions.
What I didn't know was that 340 00:11:32,600 --> 00:11:35,509 my wife had actually installed a 341 00:11:35,510 --> 00:11:38,389 Chromium plugin which 342 00:11:38,390 --> 00:11:40,489 mocks the overuse of the 343 00:11:40,490 --> 00:11:42,659 word "cloud" and turns 344 00:11:42,660 --> 00:11:45,919 it into "butt". So 345 00:11:45,920 --> 00:11:47,569 what happened? I wrote this really long, 346 00:11:47,570 --> 00:11:49,279 elaborate email, because I thought it 347 00:11:49,280 --> 00:11:51,439 was a great project, and then this 348 00:11:51,440 --> 00:11:53,569 is what I got back: it's 349 00:11:53,570 --> 00:11:54,889 Media Cloud, not Media Butt. 350 00:11:54,890 --> 00:11:57,319 But I'm 351 00:11:57,320 --> 00:11:58,879 curious, et cetera. 352 00:11:58,880 --> 00:12:00,979 Obviously 353 00:12:00,980 --> 00:12:02,419 it was a failure and we felt 354 00:12:02,420 --> 00:12:04,759 embarrassed, but we were lucky and 355 00:12:04,760 --> 00:12:06,619 we got it. But yeah, I think it's 356 00:12:06,620 --> 00:12:08,239 important to share whatever 357 00:12:08,240 --> 00:12:10,009 mishaps you have, be it in software 358 00:12:10,010 --> 00:12:12,649 development or especially in academia. 359 00:12:12,650 --> 00:12:14,809 So, just putting that out there 360 00:12:14,810 --> 00:12:17,329 very briefly. 361 00:12:17,330 --> 00:12:18,819 Then I guess the last thing I want 362 00:12:18,820 --> 00:12:20,419 to talk about: we participated 363 00:12:20,420 --> 00:12:22,159 in various hackathons, and some 364 00:12:22,160 --> 00:12:24,049 of them were about free 365 00:12:24,050 --> 00:12:25,339 and open source software, because 366 00:12:25,340 --> 00:12:27,409 there are a lot of people 367 00:12:27,410 --> 00:12:29,029 from the traditional tech community 368 00:12:29,030 --> 00:12:30,109 in Montreal.
We have a lot of software 369 00:12:30,110 --> 00:12:32,479 developers, 370 00:12:32,480 --> 00:12:34,369 and obviously out there there's 371 00:12:34,370 --> 00:12:36,349 something like a hackathon fatigue. 372 00:12:37,550 --> 00:12:39,079 I mean, you see there 373 00:12:39,080 --> 00:12:40,669 companies like T-Mobile: 374 00:12:40,670 --> 00:12:42,889 big corporations using 375 00:12:42,890 --> 00:12:44,959 hackathons and hacking culture as a way 376 00:12:44,960 --> 00:12:47,149 to basically outsource their 377 00:12:47,150 --> 00:12:48,079 R&D. 378 00:12:48,080 --> 00:12:50,299 So 379 00:12:50,300 --> 00:12:52,939 we also saw a kind of hackathon 380 00:12:52,940 --> 00:12:55,009 fatigue in what 381 00:12:55,010 --> 00:12:56,599 we did, and we have a couple of 382 00:12:56,600 --> 00:12:58,879 criticisms. Pretty 383 00:12:58,880 --> 00:13:00,979 much the biggest thing 384 00:13:00,980 --> 00:13:02,959 we noticed was that 385 00:13:02,960 --> 00:13:04,999 a lot of the people who organized the hackathons 386 00:13:05,000 --> 00:13:07,189 we participated in tried to bring 387 00:13:07,190 --> 00:13:09,679 both sides together, 388 00:13:09,680 --> 00:13:11,959 the tech community and the peacebuilding 389 00:13:11,960 --> 00:13:13,669 community, for example. 390 00:13:13,670 --> 00:13:15,799 But oftentimes it was 391 00:13:15,800 --> 00:13:17,869 really difficult to actually get 392 00:13:17,870 --> 00:13:20,119 enough people with a technical background. 393 00:13:20,120 --> 00:13:22,309 So you had 40, 50 people, 394 00:13:22,310 --> 00:13:23,689 but only two or three 395 00:13:23,690 --> 00:13:25,669 were able to 396 00:13:25,670 --> 00:13:26,670 code.
397 00:13:27,550 --> 00:13:29,079 Which means in the end, if you 398 00:13:29,080 --> 00:13:31,299 actually try to get some sort 399 00:13:31,300 --> 00:13:33,789 of ideas going into a project, 400 00:13:33,790 --> 00:13:35,919 you end up 401 00:13:35,920 --> 00:13:38,289 with a lot of policy talk; 402 00:13:38,290 --> 00:13:39,789 there wasn't as much 403 00:13:39,790 --> 00:13:41,789 cross-pollination 404 00:13:41,790 --> 00:13:43,989 and talking between the fields as I 405 00:13:43,990 --> 00:13:45,999 would have hoped for. 406 00:13:46,000 --> 00:13:48,189 But, 407 00:13:48,190 --> 00:13:50,469 I would say 408 00:13:50,470 --> 00:13:52,239 more on the positive side of things, 409 00:13:52,240 --> 00:13:54,579 an example was where we participated 410 00:13:54,580 --> 00:13:56,859 in the Talking Peace Festival: there 411 00:13:56,860 --> 00:13:59,259 was a hackathon 412 00:13:59,260 --> 00:14:01,719 in Washington, D.C., 413 00:14:01,720 --> 00:14:02,979 and we participated in that. 414 00:14:02,980 --> 00:14:05,649 And there we found a good mix of 415 00:14:05,650 --> 00:14:07,089 people who actually knew how to code and 416 00:14:07,090 --> 00:14:08,689 knew 417 00:14:08,690 --> 00:14:10,399 how to think about surveillance and 418 00:14:10,400 --> 00:14:12,439 about the important topics I 419 00:14:12,440 --> 00:14:13,999 guess the audience here is really 420 00:14:14,000 --> 00:14:15,000 concerned about.
421 00:14:16,860 --> 00:14:19,019 And what we also 422 00:14:19,020 --> 00:14:21,239 found is that when you think about 423 00:14:21,240 --> 00:14:23,909 building whatever kind of technology, 424 00:14:23,910 --> 00:14:25,679 it's really important to be engaged with 425 00:14:25,680 --> 00:14:27,779 the community, with 426 00:14:27,780 --> 00:14:29,879 the people the 427 00:14:29,880 --> 00:14:32,129 technology should 428 00:14:32,130 --> 00:14:33,130 serve. 429 00:14:33,660 --> 00:14:35,759 We felt this was something 430 00:14:35,760 --> 00:14:38,069 where there was a lot of 431 00:14:38,070 --> 00:14:40,079 movement, which I think was interesting, on 432 00:14:40,080 --> 00:14:41,759 both sides: not only from 433 00:14:41,760 --> 00:14:44,489 the political, policy side, but also from 434 00:14:44,490 --> 00:14:46,949 the side of the tech 435 00:14:46,950 --> 00:14:48,129 community. 436 00:14:48,130 --> 00:14:50,189 And we think that, 437 00:14:50,190 --> 00:14:52,349 in a way, what 438 00:14:52,350 --> 00:14:54,569 a lot of NGOs 439 00:14:54,570 --> 00:14:56,819 and human rights groups could 440 00:14:56,820 --> 00:14:58,979 use hackathons or 441 00:14:58,980 --> 00:15:01,079 similar formats for is 442 00:15:01,080 --> 00:15:03,369 actually to 443 00:15:03,370 --> 00:15:05,129 build a bridge between those two 444 00:15:05,130 --> 00:15:07,259 communities. I know, I mean, 445 00:15:07,260 --> 00:15:09,449 today we had a lot of 446 00:15:09,450 --> 00:15:10,709 fairly depressing stories.
447 00:15:10,710 --> 00:15:12,899 And I don't 448 00:15:12,900 --> 00:15:15,239 want to force a happy ending, 449 00:15:15,240 --> 00:15:16,889 but what I want 450 00:15:16,890 --> 00:15:19,109 to say is that we 451 00:15:19,110 --> 00:15:21,239 feel there 452 00:15:21,240 --> 00:15:23,579 are a lot of people who could 453 00:15:23,580 --> 00:15:25,949 be working together 454 00:15:25,950 --> 00:15:27,899 to actually build technologies which 455 00:15:27,900 --> 00:15:28,889 could help people in human rights 456 00:15:28,890 --> 00:15:30,539 situations. 457 00:15:31,810 --> 00:15:34,119 What we think matters is 458 00:15:34,120 --> 00:15:36,009 just trying to engage, get together 459 00:15:36,010 --> 00:15:37,899 and see where that leads; that's part of 460 00:15:37,900 --> 00:15:39,519 why I'm here. I'd be 461 00:15:39,520 --> 00:15:42,309 happy to talk to people who 462 00:15:42,310 --> 00:15:44,259 have different thoughts and can actually give 463 00:15:44,260 --> 00:15:45,819 us feedback on what we can improve, 464 00:15:45,820 --> 00:15:48,009 because 465 00:15:48,010 --> 00:15:49,929 if it is about talking to each other, 466 00:15:49,930 --> 00:15:51,579 that's something we would really 467 00:15:51,580 --> 00:15:53,320 like to engage in. 468 00:15:54,680 --> 00:15:57,469 OK, so 469 00:15:57,470 --> 00:15:58,759 just to give you an example: a 470 00:15:58,760 --> 00:16:00,739 couple of projects we found fairly 471 00:16:00,740 --> 00:16:02,869 interesting, which we 472 00:16:02,870 --> 00:16:05,209 came across during 473 00:16:05,210 --> 00:16:07,279 the time we did the research 474 00:16:07,280 --> 00:16:08,809 and in the last half year. 475 00:16:10,880 --> 00:16:13,189 One is a project called the 476 00:16:13,190 --> 00:16:14,489 Early Warning 477 00:16:14,490 --> 00:16:15,490 Project.
478 00:16:16,760 --> 00:16:19,219 The early warning system, which basically 479 00:16:19,220 --> 00:16:21,019 has a couple of components, is like a 480 00:16:21,020 --> 00:16:23,509 statistical analysis method and 481 00:16:23,510 --> 00:16:25,789 some sort of like an expert or so, 482 00:16:25,790 --> 00:16:27,979 uh, they ask a lot 483 00:16:27,980 --> 00:16:30,049 of people from, uh, 484 00:16:30,050 --> 00:16:31,729 the peacebuilding community and then 485 00:16:31,730 --> 00:16:33,679 different other communities like what 486 00:16:33,680 --> 00:16:34,939 they what they're going to think is going 487 00:16:34,940 --> 00:16:36,889 to happen in the next 12 months in 488 00:16:36,890 --> 00:16:39,079 country X, Y, Z, and then 489 00:16:39,080 --> 00:16:40,639 they give opinions. I guess it's fairly 490 00:16:40,640 --> 00:16:42,199 it's a fairly big opinion poll, too. 491 00:16:42,200 --> 00:16:44,689 It's about like, I think, to 200 people, 492 00:16:44,690 --> 00:16:46,789 uh, participating from different 493 00:16:46,790 --> 00:16:47,689 fields. 494 00:16:47,690 --> 00:16:49,819 So it kind of tries to to get a couple of 495 00:16:49,820 --> 00:16:50,839 methods together. 496 00:16:50,840 --> 00:16:52,969 And, um, I'm I'm very sure that, 497 00:16:52,970 --> 00:16:54,439 like a lot of people who are engaged with 498 00:16:54,440 --> 00:16:56,599 the this project would be interested 499 00:16:56,600 --> 00:16:58,699 in how to make those tools better. 500 00:16:58,700 --> 00:17:01,069 Um, because what 501 00:17:01,070 --> 00:17:03,229 we kind of think is that, uh, a 502 00:17:03,230 --> 00:17:05,299 lot of NGOs and human rights groups, what 503 00:17:05,300 --> 00:17:08,389 they can offer is kind of like a network 504 00:17:08,390 --> 00:17:10,578 to connect people with each other. 
They also often have better knowledge of how to actually get the message out than someone who only has the technology and the tactics, who thinks about the deep tech and nothing else. So it's about getting their heads away from the keyboard and toward the wider world and how the technology is actually deployed. We would be grateful for feedback on that as well.

Then there is another project. Actually, I'm quite hesitant to promote apps, because I'm not really sure how an app by itself can promote peacekeeping or mass atrocity prevention; I don't see that as necessarily a good way of promoting it. But there is one app we found pretty interesting. It's called eyeWitness, and what it does is document atrocity crimes.
You can just download it, and it takes the security of the people who use it very seriously. For example, it's not just an icon sitting openly on the phone; it's hidden within the device. Getting to that point took a lot of work from security researchers: making it possible to take photos of an atrocity and then securely transmit them to, I think, the International Bar Association, which can then use those pictures, with timestamps and so on, as actual legal evidence in, for example, ICC (International Criminal Court) cases.

The last thing I would mention is that there are three Canadian companies, and I think a couple of them might even be at the Congress, that we try to engage with: for example, eQualit.ie.
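The core idea behind such evidence apps is to bind a photo to its capture time with a cryptographic hash, so that later tampering is detectable. A minimal sketch of that general technique follows; this is not eyeWitness's actual implementation, just an illustration of the principle.

```python
# Sketch of a chain-of-custody record: hash the image bytes and pair
# the hash with a capture timestamp. Any later change to the image
# breaks the match. Field names here are assumptions for illustration.
import hashlib
from datetime import datetime, timezone

def make_evidence_record(image_bytes: bytes) -> dict:
    """Return a record binding the image's hash to its capture time."""
    return {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
    }

def verify(image_bytes: bytes, record: dict) -> bool:
    """Check that the image still matches the recorded hash."""
    return hashlib.sha256(image_bytes).hexdigest() == record["sha256"]

record = make_evidence_record(b"...jpeg bytes...")
assert verify(b"...jpeg bytes...", record)
assert not verify(b"tampered bytes", record)
```

A real system would additionally sign the record and transmit it over an authenticated channel, so that the receiving organization can attest to when it arrived.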
They work very much on free and open source software in the field of human rights. And since we are very new with what we're doing at the Digital Mass Atrocity Prevention Lab, it would be great if we could get a conversation going and see what we are doing badly from a security perspective.

And yeah, I think that is most of what I wanted to share. I will put up a couple of links to what we did on a pad. Feel free to reach out and write your comments in there; you can even troll, I will just delete it. That would be the main point of my talk: give us feedback, let us know what you think, and be critical.

Thank you very much. We actually have plenty of time for questions, so if you have any questions, please line up at the microphones.
Are there questions from the IRC? No? Any questions from the audience? Please go to the microphone so our listeners at home can hear you and we can also record it. Microphone front left, please.

OK. Yeah, I guess some reflections, because you were also asking for input. The first one: I'd be interested whether anyone here is developing USSD or SMS application toolkits, or any type of technology for what I guess is called technological blending, or whatever you call it, to bridge the digital divide between people with dumb phones and feature phones, which are pretty much the new dumb phone. I guess the entry level for an Android phone that can run such an app keeps going down. But successful projects like Ushahidi thought about integration with dumb phones from the start.
So I would be interested whether we have those skills here, or some application toolkit, for example, or whether that's a completely neglected technology. That's more of a brain fart, more of an idea.

And then the second question: how do you deal with the fact that, in situations like this, you usually have tech experts, usually from the global north, narrating the stories of the global south? That's a big generalization, but it's a tendency. How does that get resolved?

Well, on the second question, I could answer that a lot of what we're doing is part of a learning experience, especially for me personally. My colleagues have done this work for longer; they've been involved in peacebuilding projects for a long time, so they have thought more about things like not pushing technologies into areas where they shouldn't be.
So in a way they have done more on that; for me personally it's a learning experience, and that's why we try to reach out. I'll be able to talk more afterwards, if that's possible.

Great. Sorry, just to add a selfish point: I'm writing a thesis on the topic, so if anyone wants to continue the conversation later — Sure, that's the point of this whole session anyway, to continue the conversation. Thank you very much. Thank you so much.

Microphone at the front right, please. Yeah, thematically bridging the Early Warning Project and the Media Cloud project: it seems plausible to me that mass atrocities are introduced by some media activity, locally or globally. Are you aware of any research, or have you done any research, in that direction, analyzing media as an early warning for mass atrocities?
Yeah. One piece of what Media Cloud provides that we were interested in is that it actually matters what's happening in the media environment in certain countries, for example Burundi. And one thing we were thinking about, and this is just an idea it would be interesting to get feedback on: there is something called the GDELT Project, the Global Database of Events, Language, and Tone. Google is actually behind some of that. They basically look at media, and it runs in almost real time. So when you have an environment where you think atrocities might happen in the near future, that could be something to look at, because you can actually monitor what's happening.
But then again, monitoring is close to surveillance: you surveil the whole country and watch what's happening in their environment. That's something we talked about. As for research on the role different technologies or media play in mass atrocities, there's a whole range of literature out there. Look at the Rwandan genocide, for example: there's a whole body of literature on radio as the enabling technology, because lies and hate speech were transmitted over radio. So there has been a lot of research on that topic.

Microphone front left, please. Hi, thanks for the talk. — Thank you. — Could you expand a little bit: in an ideal world, what kind of software would be missing from a peacebuilding or UN perspective?
In an ideal world — if there is such a thing — I think what we're actually interested in is how to enable collaboration. Collaboration tools might be something really interesting to look at, because in some environments you have many different NGOs with different channels and different ways of working with each other. So if you make it a community effort, it might be interesting to look at basic collaboration tools and how to implement them. From the little insight I got over the work I did in the last year or so: oftentimes the tools are there, ready to be used, but they're not embraced by a lot of NGOs. When we look at a lot of surveillance technologies and counter-technologies, people know about them but don't know how to use them. It would be great to have better resources.
So far there are wikis, and that's one way to implement it, but I think it also needs the human factor to actually push people to use the technologies we could be providing them.

Microphone on the right, please. So, I understood that you are covering official media reports with Media Cloud. Are you also researching social media, the publicly available Twitter streams and things like that?

Us personally, we don't. I've been involved in a couple of projects, and I know about others who do that. But that's the thing: we are only a handful of people at my institute, and at this point we don't have the technological skills. Some of it is actually fairly easy, I suppose.
I mean, there are tools out there with which you can look at what's happening on Twitter and so on. And actually, at this hackathon in Washington, D.C., we had something that looked at the GDELT project, Media Cloud, and Twitter at the same time, as a sort of big monitoring or risk assessment system. The interesting thing about it is that it also evaluates tone. So if you have a sentence that is negative towards a particular group, it could theoretically trigger something. We had a couple of guys who coded a prototype in Python, and it would then automatically send a message to the peacebuilding people on the ground. So people are thinking about it, though I'm not really sure who else is thinking about that kind of stuff.
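A minimal sketch of a tone-triggered alert of the kind that prototype implemented might look like this. The record fields, the threshold, and the filtering logic are assumptions for illustration; they are neither the hackathon code nor GDELT's real schema.

```python
# Sketch of a tone-based alert filter: flag media records that mention
# a watched group with strongly negative tone. Field names ("mentions",
# "tone") and the cutoff value are illustrative assumptions.

ALERT_THRESHOLD = -5.0  # assumed cutoff on a GDELT-like tone scale

def find_alerts(articles, watched_group, threshold=ALERT_THRESHOLD):
    """Return articles mentioning the group with tone below threshold."""
    return [
        a for a in articles
        if watched_group in a["mentions"] and a["tone"] < threshold
    ]

articles = [
    {"url": "a", "mentions": ["group_x"], "tone": -7.2},
    {"url": "b", "mentions": ["group_x"], "tone": 1.3},
    {"url": "c", "mentions": ["group_y"], "tone": -8.0},
]
alerts = find_alerts(articles, "group_x")  # matches only the first article
```

In a deployed version, each alert would then be forwarded (for example by SMS or email) to contacts on the ground, which is the step the hackathon prototype automated.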
But the fact is, this stuff is going to become more digital in the years to come, so I think the peacebuilding community will have to catch up with that.

I think Twitter sentiment, and also image recognition, can help there. — Thank you.

OK, the microphone on the left, please. Yeah, thank you for your talk. One thing that was missing for me was actions for peace enforcement, which I think you did not cover in the talk. What can you do to promote peace enforcement? It is the third escalation step of atrocity prevention, meaning deploying forces and so on to prevent things. What I saw here was mostly reactive; I mean, international law is reactive, it comes in after the atrocities. But I think technology could also be used as an offensive tool. The eyeWitness idea, I think, was a good start. If you combine it, for example, with image recognition on the ground, or with small mini-drones that fly over the affected area so that they are also visible, then the people on the ground who might commit atrocities can see that there are witnesses. That way you build up offensive capabilities with technology. I think that would also be interesting.

Right. That's something I haven't thought about, but thanks for the input; that's great.

Do we have any questions from the IRC? No. Then, one last time, the microphone on the left, please.

You mentioned using technology for looking at the media, and it seemed like a lot of it was leveraging what has already been done by different organizations involved in preventing atrocities.
And it seemed like you thought eyeWitness was an example of an app that may not be a very effective direction for technology investment. But the appeal of an app like that is that it's about directly impacting the atrocity, whereas analyzing the media and leveraging other people's abilities seems more indirect. Image recognition also seems like a way of using technology to impact things more directly. I wonder whether other directions come to your mind as far as having a direct impact on an atrocity or a potential atrocity.

Well, actually, just to correct you: I was talking about apps in general. I've seen a lot of apps which don't really make much sense; they're basically just a website repackaged as an app, just repackaging content.
The app I mentioned, eyeWitness, we actually found comparatively very smart in the way they thought about it. As for the other part of your question, what I think about other ways of —

Yeah. What we've been talking about so far, besides eyeWitness, is image recognition, where the technology itself clearly plays an integral role in preventing or documenting an atrocity. So I was wondering whether there are other ways technology can be used to directly impact either the prevention or the results of an atrocity.

Mm hmm. Well, to be honest, I'm not really sure we have thought about image recognition in the sense you describe. But that's part of why we're here: we want to get input on a lot of different things. Obviously you can research a lot, but just getting human feedback on things would be very valuable.
So if you can stick around a couple of minutes afterwards, I would be happy to talk more. — OK.

Thank you very much, Nikolai. Please give him a warm round of applause.