*36C3 preroll music*

Herald: The two speakers that are on stage today are both anthropologists, and they are both experts on hacking culture. Today they also launched a website, HackCur.io, which is also the name of the talk: 'hack_curio: Decoding the Cultures of Hacking, One Video at a Time'. I welcome Gabriella, a.k.a. Biella, Coleman and Paula Bialski.

*Applause*

Paula Bialski: Hello. Hello. Yes, good evening, CCC, it's so lovely to be here. We are super excited to stand before you here today and present a project we've been working on for the past year or so.

Biella: Which would not have been finished if it were not for this talk.

Paula: Exactly.

Biella: So thank you.

Paula: Exactly. Thanks for forcing us to stand before you and get away from our desks, have a drink, some wine, have some 11:30 PM discussion with you. And there's no better place to launch the project that we're gonna show you than at the CCC. So we're super excited to be here. Let's start with the very basics. What is hack_curio? What is it that you guys are gonna see in the next hour or so? Hack_curio is a website featuring short video clips all related to computer hackers. Now, a bit of background. My name is Paula Bialski and I am a sociologist. I'm an ethnographer of hacker cultures. I study corporate hacker developers. And for those of you who don't know Biella Coleman...

Biella: I'm an anthropologist. I also study computer hackers. And we, along with Chris Kelty, have helped to put this website together.

Paula: Exactly. And in the past year, we've decided to come together and bring all sorts of clips from public talks, from documentaries, from Hollywood films, memes, advertising, all sorts of sources. We've brought together these videos, which come together with short descriptions by authors, by scholars, by journalists, by people who know something about hacker cultures. And we brought that all together in one place. So call it a museum, call it a compendium, call it a website. And it's a place to really pay homage to you guys, because hackers come in all shapes and sizes. What it means to hack might mean something to you, but might mean something very different to someone else. And we decided, as anthropologists, that it's very important to represent a certain culture in a certain way. It's not just hackers in hoodies. It's a really diverse culture.
So we're going to talk about that today.

Biella: All right. So, like, how did this project come into being? Like, why are we here? Why did we spend the last year doing this? Well, you know, first of all, I didn't create it because I had this idea in mind. It was created because I started to collect videos for a reason. I'm a professor, and twice a week I stand in front of students who are on the Internet, on Facebook, maybe buying shoes. And it's really hard to get their attention. And you know, what I found was that using videos in class was an amazing way to get them off Facebook and paying attention to me. Right. So over the years, I just collected a lot of videos. Right. Video after video after video after video. And at a certain point I was like, you know, I have this private, semi-private collection that I use in class. Why don't I transform it into a public resource? And more so, as someone who has studied hackers for many years, why don't I kind of make it into a collaborative project? Why don't I tap into the kind of expertise that exists among hackers and journalists and researchers and academics and draw them in? And so I decided to do that, right. So about a year and a half ago, I brought together a couple of other people, like Paula and Chris Kelty, who's another curator, and I said, like, let's get this going. So when we were kind of fashioning the project, we were also thinking, like, what are we trying to do with this project? Right. You're not my students. I don't see you twice a week. And so we came up with some goals, and we don't know if we're gonna achieve these goals. The site literally is going live, like, right now. But this is what we're trying to do with the project. We're trying to chip away at simplistic conceptions and stereotypes of hackers. We know these exist. Can we chip away at them? Right. We want to offer new perspectives on what hackers have actually done and what they do. A really important thing, which Paula has already kind of mentioned, is to showcase the diversity of hacking. Right. People who do blockchain and free software and security — there are similarities, but there are also differences, like, let's try to show this. And while this is not an archive — this is not the Internet Archive — we are trying to kind of preserve bits and bytes of hacker history. So these are the four goals. And we do feel that video, right, is a nice medium, a mechanism to achieve these four goals. It's persuasive, it's compelling, it's memorable.
It's fun. Like, we like to waste time at work on video, right? So we're like, hey, let's add a little persuasive punch to text. And this is why we decided to do it this way.

Paula: Exactly. So what happens when you click on the site today, and how is it organized? We want to show you a little bit of the actual architecture of the site itself. So when you click on the website, you see certain categories. We've grouped the videos into different categories because, as we said, there's a huge diversity. So you can see here — Biella is lovely here, pointing out the beautiful categories — we've got anti-security hackers, blockchain hackers, we've got free and open source software, we've got phreaking, we've got hacker depictions. You can look at all sorts of different categories. You go onto a category page and then you have a blurb about what this subculture of hacking is all about, or what the aim is, what the theme is. And then you have all sorts of little videos that last maybe 30 seconds, maybe a few minutes. And under these videos, you would look at the video and then you would have a little bit of a blurb. It's not an essay. It's not a book. It's not some boring academic text. It's supposed to be funny. It's supposed to be for your grandmother to read. It's supposed to be actually accessible and understandable. Right. So you have the video and the actual text itself. Right. So this is what it looks like. And this is maybe a sample of our content itself. What do we have? We've got 42 entries at the moment, which we've collected, as I said, from various academics and authors. And by the end of 2020, we would love to have around 100 entries, and we'd try to publish around 15 or 20 entries...

Biella: ...after that. Because it's really brutally hard to edit academics.

Paula: Exactly. Exactly. And so, what you'll find — these are just some examples; we'll get into some of the videos in just a moment. For example, you would look at hackers and engineers humming at the Internet Engineering Task Force. Or you'd look at an entry about the programming legend, of course, Grace Hopper being interviewed by a clueless David Letterman — maybe you guys have seen this video. A blockchain ad where people who see it ask: is this real? Is it a kind of wacky ad, or is it parody? And when you watch it, you have to know that this is actually real.
Or the actor Robert Redford showing off his mad social engineering skills with the help of cakes and balloons. Or how to make sense of why Algerian hacker Hamza Bendelladj, arrested by the U.S. government, smiles, and why so many people from Algeria understand his grin. So this is the kind of diversity of what hacking is really all about.

Biella: But we're here to get the video party started.

Paula: Exactly right.

*From audience*: Exactly! Finally!

Paula: Fine. Let's get started. Yeah.

Biella: With a little background.

Paula: Exactly. Exactly. OK. So who's going to start? You start. You start.

Biella: All right. So we thought it would be a good idea to start with phone phreaking, because phone phreaking really developed at the same time as, if not kind of before, computer hacking. And we're going to show Joybubbles, Joe Engressia, who is, you know, often considered to be the grandfather of phone phreaking. So let's go to a video.

*Video starts*

Speaker 1: In the days when calls went through the operators, phreaking wasn't possible. But as human switchboards were replaced by mechanical systems, different noises were used to trigger the switches.

*Whistling*

If you had perfect pitch like blind phone phreak Joe Engressia, you could whistle calls through the network.

Joe: Let's see if I make it this time. This is really hard to do. It sounded like all the tones were present, so the phone should be ringing the bell now. Okay, I'll get the phone, it just takes a little while...

Speaker 1: He even showed off his skills to the local media.

Speaker 2: From his one phone to a town in Illinois and back to his other phone, a thousand-mile phone call by whistling. Joe Engressia...

Biella: Right? Very cool, right? So Joe Engressia is featured, and Joan Donovan, who is, like, a mad researcher at Harvard University, wrote a really awesome entry about that. Now, of course, she emphasizes things like, you know, while hacking is often tied to computers, it's really tied to any system that you could understand, improve, fix, undermine. And the phreakers really showed that. Right. And of course, the history of phone phreaking is about blind kids. Not everyone who was a phreak was blind, but many of them were. They met each other at camp and kind of exchanged information. And that was one of the ways in which phone phreaking grew. Phone phreaking really grew as well
when a big article was published in 1971 by Ron Rosenbaum in Esquire magazine. Who has read that article? Anyone? It's incredible. We mention it, I think, in a piece. Check it out. Phreaking really exploded after that article. The spelling changed from freak with a capital F to phreak with a ph because of that article. Phreaking also grew when blue boxes were created. Right. This is also something that Joan writes about in her entry. One of the cool things that Joan writes about — and then I'm going to turn it over to Paula again — is that some phreaks trained birds, OK, to freaking phreak. Let's just leave it at that, because that's pretty cool. All right.

Paula: OK. Are you guys ready now to cringe? We need a little bit of a cringing moment as well. So without further ado, this is Steve Ballmer, who would like to do some dancing.

Biella: From Microsoft. You just don't know.

*Music*

Paula: OK. Yeah, that's right.

Biella: I just want to say one little thing. Of course there's a remix of this with goats screaming — like, look it up. It's awesome.

Paula: Exactly. But why do we show Steve Ballmer, the sort of, like, godfather — exactly — kind of an anti-hacker of sorts? I myself have worked among a corporate culture of software developers, who aren't hackers per se. But if you think of a figure like Steve Ballmer: a lot of you guys who perhaps identify yourselves as hackers, you have day jobs, you go to work and you have to make some money in order to live and work on your own projects. And you often have to face sort of mini Steve Ballmers at work. And this is a quote from my own entry, which sits right next to this video: even if Ballmer's unbridled display of exuberance is exceptional, many software developers will have to deal with mini Steve Ballmers every day.

Biella: We are sorry that you do. But if you do — you do.

Paula: Exactly. And this exuberance is all about these sorts of slogans: win big, save the world while building technology, be awesome, be true, whatever your corporate slogan is. And there is, I think, a way in which the software developers and sort of the hackers that work in their day jobs challenge this really intense exuberance of wearing your corporate T-shirt and smiling every day — in the way in which you hack your daily projects, you work on your own private projects on the side.
You actually do have many acts of resistance, in a way, to this kind of loud, massive exuberance. And I talk about these sort of sideline mini-hacks that happen in everyday corporate culture.

Biella: Check out her entry. It's really funny. All right. So now we're going to hacktivists. So who here has heard of Phineas Fisher? All right. Awesome. Just in case, for those who are watching the video now or later, I'm going to give a little bit of background. But I love this video about Phineas Fisher, because he explains what he or the group has done, but he also does kind of a very clever media hack. So for those that don't know who Phineas Fisher is: he or the group is a hacktivist that claims to be inspired by Anonymous and Jeremy Hammond. He's hacked into various corporations, from FinFisher to Hacking Team. And what he did was take documents, take e-mail and then publish them. And these were important in ways that I'll talk about in a moment. He's donated, I think, stolen bitcoins to Rojava. And this fall he published a manifesto kind of calling for public interest hacking, and claims he would give one hundred thousand dollars to anyone who does this. So now I'm going to show the first and, I believe, only interview that he has done. And he did this with Vice News a couple of years ago.

*Video starts*

Reporter: Let's do this. These are the exact words from our live text exchange, voiced by one of my colleagues. So why did you hack Hacking Team?

Phineas Fisher (puppet): Well, I just read the Citizen Lab reports on FinFisher and Hacking Team and thought, that's fucked up. And I hacked them.

Reporter: What was the goal in hacking the Hacking Team data? Were you trying to stop them?

Phineas Fisher (puppet): Well, I don't really expect leaking data to stop a company, but hopefully it can at least set them back a bit and give some breathing room to the people being targeted with their software.

*Video ends*

Biella: OK, so this does not yet exist on hack_curio. I have to write the entry, but because I was so busy getting the rest of the site ready, I haven't done it — but it will happen in the next few weeks. But what I love about this video is, first of all, he's, like, hacking media representations. Right? I mean, even when awesome journalists like Motherboard publish on hackers or other kinds of entities, they still kind of use a masked hacker — even once, when they published about Phineas Fisher, they put, like, a mask on him.
And it's like, hackers hate that — like, they don't need a mask. Right. And there is this sense where there's always a kind of demonic, masked figure. And he was like, OK, I'll do this interview, but you have to represent me as, like, a lovable Muppet-like figure. Right? So there he is hacking the media. But what's also really interesting — and if you watch the full video, it's kind of amazing — is that, you know, he kind of claims, oh, I didn't have much of an effect, I don't think I could do anything. But in fact, first of all, the information that was released really reaffirms what people suspected. For example, in the case of Hacking Team, which was selling problematic exploits and spyware to dictatorial regimes, we really got a confirmation that this was happening. And in fact, eventually Hacking Team even lost its license. Right. This was, like, a direct effect of what Phineas Fisher did. So really, it's a kind of amazing video that showcases what he was doing, his reasoning, and then there was the performance — literally, a puppet that hacked the media. OK, so now we're going to rewind a little bit and go back in time. So a lot of hackers care about cryptography. Right? Ever since the cypherpunks, and since that period, there have been projects from Tor to Signal that have enabled cryptography, and that has been really important for human rights activists and others. But one of the great, great kind of encryption projects came from this fellow, Tim Jenkin. Who here in the room has heard of Tim Jenkin? OK. This is amazing. This is why we're doing kind of hack_curio. So Tim Jenkin is from South Africa. And beginning in 1988, secret messages were sent and received regularly across South Africa's borders using an encrypted telematics system assembled during the final years of the South African liberation struggle, and Tim Jenkin, along with Ronnie Press, who has since passed away, created the system. And Tim Jenkin was kind of like a phone phreak, and that was one of the reasons — like, he was good at working with phones. And what was amazing about this system, which was part of Operation Vula, was that it allowed people in South Africa to communicate with leaders in exile — in London. Right? And Tim Jenkin created this system. I'm going to show a video about it in a moment. And Sophie Toupin has written a terrific entry.
The reason why we have him with the key there is that, like, you know, the South African apartheid government did not really like Tim Jenkin, so they threw him in jail. While a lot of hackers lockpick, he actually created ten wooden keys secretly in the wood shop and broke out of jail. I mean, talk about taking lockpicking to, like, another sort of level. All right. So let's listen and see the video about this incredible program.

*Video starts*

Tim Jenkin: After we sent in the first computer, we expected things to start immediately, but it actually took a couple of weeks. And then suddenly one day I was sitting at my desk and the telephone answering machine suddenly started roaring, and I thought, this must be the wrong number or something. But then, sure enough, I heard the distinctive tone of the messages and I could hear this thing coming through the tape.

*Modem sound*

And then it stopped and I loaded the message onto my computer. In fact, it was a report from Mac. And sure enough, there was our first message. Absolutely perfect.

*Sound of a printer working*

*Video ends*

Biella: Ah, fax machine. OK. So this is from the entry by Sophie Toupin, who is writing a dissertation on this topic: the international hacker community has since taken notice of Tim Jenkin and the Vula encrypted communication system, which embodies so many qualities often associated with an exceptional hack: elegant, clever, usable and pragmatic. Right? Jenkin has been invited to speak at the Berlin Logan Symposium in 2016 and to lockpicking communities in the Netherlands and the United States. In 2018, the RSA security conference gave Jenkin its first award for excellence in humanitarian service. So just one last thing: this is a good reminder that histories of computer hacking are often skewed. They often start with the United States, when, for example, in Europe with the CCC, that story has been told in bits and pieces but deserves a much larger showcase. And actually this example also shows that, for example, the history of encryption, when it comes to communication, didn't even necessarily start in the United States. Right? And so it's really, really important to kind of showcase these histories that haven't been told elsewhere.

Paula: So maybe by now you're kind of getting the sense that we see hacking as a diverse practice,
hackers as a diverse group of people who do different things. And at this moment I want to come back to the ways in which hackers challenge power by challenging the very stereotypes of what gender means, really challenging gender politics. And we'll start to turn to this topic by looking at an entry that a woman named Christina Dunbar-Hester has written on a woman named Naomi Ceder. And some of you probably know Naomi Ceder. This is part of her entry, and she wrote: Naomi Ceder is a programmer and core participant in the Python programming language community. As a trans-identified person, Ceder grappled with whether she would have to give up everything in order to transition and whether the community would accept her for doing so. So let's watch a clip of the video and let's see how Naomi Ceder challenged that.

Biella: I think she gave this talk at PyCon, the Python open source developer conference, and it's a really incredible talk. I really encourage you to watch the whole talk. But this is the question — this is the moment where she's like, do I have to leave the community or can I transition in the community?

Paula: Exactly. So let's watch a tiny clip.

*Clip starts*

Naomi Ceder: I decided that to do that would probably mean giving up everything. Remember, back at 13, I had absorbed this into my brain: that the only way you were going to get out of this was to basically leave everything. And this was a very painful thing to think about. But like a lot of trans people, I had come to the point where even if I lost everything, that was fine. So I started to think about other alternatives here. I had toyed with the idea of doing the education summit as a farewell thing to the community. I would do it and then disappear, go into the witness protection program. The only problem was I actually started accelerating the pace of my transition because, well, it was just such a freaking relief to start moving in that direction — so that wouldn't work. So I actually thought about what was, for me — harking back to Laverne Cox — a very revolutionary idea: what if I just did it and was open about it? First thing, I looked at codes of conduct. I looked for specifics. What happens to me if there is a problem? If I am harassed? This was important to me. The other thing I did was I started telling a few people — Jesse Noller and some other people I would work with on PyCon — and they were all pretty cool with the idea.
And the more I talked about it, the more I decided that I would go ahead and take that chance. So I did. I started by teaching at some Python workshops for women. I spoke at some conferences. We went to PyCon. It was good. The education summit was fine. Okay, some of the people I worked with in organizing it were a little bit confused when the names on the emails changed — I apologize — but in general it went pretty well. In fact, the more open I was, the easier it was: easier for me, because I didn't have to worry about being outed, and easier for other people, because they certainly knew what to expect. The other interesting sidelight is that when I told people, they sometimes felt an obligation to share some deep, dark secret about themselves, like I kind of trumped them and they had to answer back. So my takeaway here is that we talk a lot about diversity, and that's real. So we should be ending on this point, except that I'm a contrarian in my old age. So it is not quite all rainbows and unicorns, or, as you might put it — this is kind of common in social justice circles right now — we don't get a cookie.

*Video ends*

Paula: All right. Yeah, yeah.

*Paula and Biella applaud*

Biella: She's a very powerful speaker.

Paula: Exactly. And I guess we should also say that after the entry by Christina Dunbar-Hester, Naomi Ceder actually gave a response to this entry, which we've also published — which is something we also want to do: we want to have a discussion between the entries and the responses to them.

Biella: So we actually wanted to quote it in full.

Paula: Yeah, exactly. So perhaps let's read this section from the response of Naomi Ceder: PyCon itself has continued to evolve into an ever more diverse place, with an ever stronger representation of queer folks, people of color, people who speak different languages, etc. Codes of conduct are nearly universal these days, and more often than not, communities insist that they be well crafted and meaningful and backed up by real enforcement. Even in these retrograde times of official attacks on the rights of so many groups, we have come a long way. But just as I said five years ago, it's still not all rainbows and unicorns. Too many groups throughout the open source world globally are making only token efforts to foster inclusion.
And in my opinion, too many members of privileged groups tend to focus on superficial or cosmetic changes rather than addressing the underlying fundamental issues marginalized groups face. It doesn't take a bit away from how far we've come to also acknowledge how much we still have to do. — Naomi Ceder. So with this part, we wanted to discuss the way in which hacking is also a practice of challenging power, challenging stereotypes and really challenging gender norms in many ways. All right, let's move on.

Biella: All right. So, the final frontier. We have three more videos to show before we get to the Q&A, and all of them relate to geopolitics and hacking. You know, hacking has always been political in some fashion, if for no other reason than sometimes laws are challenged — you're doing something that someone doesn't want you to do. Right. But there have only been certain moments where nation states have been interested in hacking, or there have been sort of ways in which nation states have used hacking, for example recently, in order to kind of engage in international politics. So the last three videos will focus on these issues. We're at the CCC, so of course I wanted to show a video related to the CCC. Unfortunately, I don't have one related to the German CCC — please do send good videos related to the CCC to me. But I am going to show one related to the FCCC, established in Lyon by Jean-Bernard Condat. So, do people know what the F stands for? All right. What does it stand for?

Audience member: French?

Biella: French. OK. Once you see the video — oh no, hold on — you will also see that it stands for fake and fuck as well, because basically the French chapter of the CCC was established in part to try to entrap hackers into kind of working for the French government. It's a fascinating story that's been told in bits and pieces, and I'm going to say a little bit more about it. But now I'm going to show a clip from a French documentary that kind of, you know, charts a little bit of that history. It's in French with subtitles.

*Video plays*

Biella: OK. So pretty incredible, right? And this story has been told in bits and pieces by French journalists.
I'm working with another French journalist to try to kind of uncover the fuller history, as well as tell the story of the kind of American and European hackers who did not get recruited by intelligence, but who nevertheless came from the underground — because they were breaking into systems, not maliciously, but they learned a lot and they had really valuable knowledge that no one else had. I mean, it's kind of really incredible, right? And, you know, this history — whether it's just the transformation of the underground into security hackers, or, in the case of France, where some portion of people were tapped to work for intelligence, informally or formally, with pressure, right — has yet to be written. And there are many remarkable elements to this. But basically, I do think it's remarkable that it was a bunch of kind of amateurs, who just were obsessed with networks, who were the ones holding the special knowledge that was needed by corporations and intelligence in order to start securing systems. Right. The other kind of really interesting thing is that some of the best underground, non-malicious hacker crews were European: TESO, which had a lot of Austrian and German members, and ADM, which is from France, were considered to be the best at exploit writing. Right. So the entry, which I'm going to write with a French journalist, is going to reflect on this. And this is actually a big project that I'm working on as well, so I'll have more to say about it later. All right. So, going from the past to the present.

Paula: Exactly. And I guess we couldn't talk about politics and hacking without talking about Trump, talking about Putin — a slew of politicians that we know in recent years have used the hacker for their own political discourse, for some kind of political gain. And this next video will show us just that. This is from our hacker depictions section. It was posted by a scholar named Marietta Brezovich. So without further ado, let's listen to the way in which Putin sees the hacker.

*Video plays*

Paula: So, I don't know if Putin was reading up on Russian hackers for the night...

Biella: Best image of the night. Possibly. I don't know.

Paula: We weren't sure if Putin was reading Paul Graham's Hackers & Painters on the toilet or some other hacker culture literature. But it seems like he's getting something right. Right? We kind of think, hey, you kind of got it — isn't that hackers, actually?
Biella: Well, except for one part.

Paula: Exactly. That's what we want to say. In some ways, yes, it's true: hackers are artistic and creative, etc.

Biella: They just don't wake up early in the morning.

Paula: Exactly. Maybe they don't wake up early in the morning. But what's important here, I think — and this is also what Brezovich points out in her entry — is that he uses this, of course, for his political gain: to show that he is not influencing any hackers, or any technologists who maybe identify as hackers or not. He's not influencing them, because they are so free and artistic and sort of living in their creative world that they're beyond his control. Right? So partially it's true, but partially he's employing this to make a political statement about his non-involvement with any of it.

Biella: And what's interesting is all evidence points to the fact that the technologists who did the hacking just work at intelligence organizations.

Paula: Exactly.

Biella: All right. So we just have one more video, and we'll end on a positive note. Right? A lot of stuff around hackers is sometimes depressing, especially when it comes to the law. They get arrested, they get thrown in jail. They commit suicide.

Paula: True.

Biella: And so we want to showcase a video that covers British-Finnish hacker Lauri Love, who has presented here at the CCC. Some of you may know that he faced extradition to the United States due to his alleged involvement with an Anonymous operation called #OpLastResort, which was kind of in support of Aaron Swartz, who had committed suicide when he was facing many criminal charges. And we'll watch a clip where parliamentarians and others debate his case.

*Video plays*

MP: A young man with Asperger's syndrome awaits extradition to the United States, facing charges of computer hacking, and is then likely to kill himself. It sounds familiar. He's not, of course, Gary McKinnon, who was saved by the prime minister, but Lauri Love, who faces, in effect, a death sentence. So when the prime minister introduced the forum bar to, in her words, provide greater safeguards for individuals, surely she expected it to protect the vulnerable, like Gary McKinnon, like Lauri Love.

Theresa May: The honorable gentleman — my honorable friend — obviously campaigned long and hard for Gary McKinnon.
And obviously I took that decision because at that time it was a decision for the home secretary to decide whether there was a human rights case for an individual not to be extradited. We subsequently changed the legal position on that, so this is now a matter for the courts. There are certain parameters that the courts look at in terms of the extradition decision, and that is then passed to the home secretary. But it is for the courts to determine the human rights aspects of any case that comes forward. It was right, I think, to introduce the forum bar, to make sure that there was that challenge for cases here in the United Kingdom, as to whether they should be heard here in the United Kingdom. But the legal process is very clear, and the home secretary is part of that legal process.

Biella: OK, so the author of the entry, Naomi Colvin, is right there in front. And she has a great sentence which says: in Lauri Love, the U.S. had definitively chosen the wrong target — principled, passionate and articulate, certainly more articulate than Theresa May herself in the clip which accompanies this article; Love versus USA would be one for the underdog. And it was: Love won. He's not being extradited. And in part it was also because Naomi Colvin was part of the team that stopped it. So let's thank Naomi as well.

*Applause*

Biella: And it's just really important to document some of the wins every once in a while. So do check that out. So we are now going to wrap up so that there's going to be 10 minutes for Q&A, but a few final reflections about this project.

Paula: So I think these videos show actual hackers and hacking, and at a more meta level demonstrate how hackers have become central to our popular imagination — how hackers and hacking are one medium to think through digital cultures, to think through politics. I mean, we care about culture. We care about representing, digging deep, looking at various angles of a certain culture. And I think that's the purpose — where I see the purpose of Biella's and mine and Chris's and our friends' project — is that we really want to take the work that we've been doing and really pay tribute to this really huge, diverse community that you are.

Biella: On a more practical level, being a little less meta: we do hope that people assign hack_curio entries in their courses. You could use them in high school. You can use them in college classes.
You know, heck, you know, maybe you could even use them in middle school, elementary — I don't know if that will work. But get it out there. And also, for some of you, I think it will be fun to look at different tidbits of hacker history. And when you're at home for the holidays, before you come to the CCC, and you're like, man, my parents don't really understand what I do — you could fire up a video that kind of represents what you do, and fire up another video that represents what you don't do.

Paula: And have a discussion.

Biella: And so this is our last slide. What next? The site is live. Share it. Our Twitter address is up there. We consider this a soft launch. We have 42 entries, but we'd love to get some feedback — tweet things at us, send video suggestions, spread the word. And to end, before the Q&A, we just really want to thank the CCC. We want to thank Lisa for having us here. This is really an amazing place to launch. And we also want to thank everyone who made this possible, from the funding to the authors to the entire hack_curio team. So thank you so much. And we're here for a little Q&A.

*Applause*

Herald: Thanks a lot for this beautiful talk. We are now open for questions at the mics. If there are any questions from the audience, please just step up to one of the mics.

Paula: Don't be shy.

Herald: Nobody is more interested in hacking culture? Are you overwhelmed?

Paula: Someone.

Herald: Yeah. There's someone on mic 1. Please.

Mic 1: Thank you for this talk and for the energy that was in your talk. It was just amazing! I have one question to ask: what was, like, the most surprising moment for you in this, like, research journey?

Paula: OK, that's a good question.

Biella: I mean, in terms of the project, you know, collaborating with others and building a website is very different than what academics often do, where we do often have to rely on ourselves and we get feedback, you know what I mean? And I think it does give a sense of the really beautiful relations that form, where you go back and forth with an author, with a web developer. You know, it really does give you a sense of the deep social ties that we do have as academics — but I think it's much, much deeper with hackers. That's one thing. But I do think — I mean, I am frustrated as an academic, where a lot of people do have very, very, very narrow conceptions of hackers. It's not a perfect world.
And there's a lot which, you know, we can change. It was very clear also that as academics, we weren't necessarily changing perceptions so much. And this project was an effort to finally do that. It's like, stop just listening to or reading my words, because obviously that's not really changing that, you know — so come see some of the videos. Yeah.

Paula: Yeah. And I guess for me — I mean, if you work in your own little bubble and you work in your own little corner, just in any type of science, you don't see as much of what's going on out there. And for me, the whole definition of what it is to hack, what a hacker actually is — you start really opening your eyes when you see, wow, there are so many other scholars out there who actually think that a hacker is this, or hackers are that. And I think for me, that opened my eyes up, really saying, hey, well, this is what you think it means. So interesting, you know.

Herald: Thank you. Now a question from mic 2, please.

Mic 2: Hi, thank you for the talk. It was very enlightening. I have two questions. The first one would be: could you tell us maybe a bit more about the server and infrastructure you're using, or are you just linking YouTube videos? And the second one would be: how would you envision future engagement with students? Because I'm teaching a course for computer science undergrads, and we did something similar around movies and descriptions that they have to make around hacker movies. And they don't really learn how to reflect on social issues a lot in their studies. So I wonder how this could be integrated into the platform and how you could engage students further?

Biella: Great questions. I mean, first of all, the website runs on WordPress. It just seemed like an easy way to, like, hack it up for this sort of thing. And we actually hired a master's student from my department at McGill University — thanks, you're awesome. And then we're hosting the videos on Vimeo, and they come from all sorts of different places. That's actually not the best or the most ideal solution, insofar as, like, you know, who knows if Vimeo is going to exist in 15 years? Right. The Internet Archive — we looked into them and they were kind of, like, psyched about it, but it was going to be slower to deliver the video. Right? So maybe if the project grows, we can at a certain point host our own videos. Right? But, like, we'll have to sort of graduate to that next level.
The entries are all going to be Creative Commons, and for the clips we cite the entire clip and where it came from. We consider this fair use, for those that may be wondering. And so we'll see how that goes.

Paula: And I guess I could take the second question. I mean, my students are digital media students, they're not from computing science. But if you ever even try to touch on something around culture, or something maybe more like real social science, always, I think, ask: how do these people relate to power? How do they relate to critique? How do they use these tools to critique something? And I think with all of these videos, and maybe even the videos that your students chose — if they just asked that question, whether they're studying computing science, whether they're studying geography or whatever it is, if they look at it from the angle of power and how it's contested, I think that's a way in which they really can engage with a certain topic really deeply. That's cool. There's a nice little text by Foucault called "What is Critique?" — that's it. I use it for my students that are maybe not cultural studies students or whatever. It's a nice little text that could help.

Herald: Thank you. One more question from mic 2, please.

Mic 2: So thank you again. I wanted to ask you, because I looked at the videos on the site and I see a lot of stories of single people, and I'm quite surprised to find very few stories of communities and showcases of hackerspaces. And a lot of researchers I've spoken with are actually focusing on, like, how communities work. So was there any conscious decision that you want to tell single-person stories instead of, like, community stories?

Biella: First of all, it's a great piece of feedback, because, I mean, one of the things as an anthropologist that I've always loved about the hacker world is, on the one hand, you know, people often talk about rights that are tied to notions of individualism, but hacking is so collectivist. Right. I mean, look at the CCC — you can't have a better example of a kind of ritual, collective, effervescent experience. Hackerspaces, right. So I do think it's really important to try to showcase that. And we do have videos around hackerspaces and they're being written up — the authors
But if that's not coming through on the site, that's because we actually still 536 00:48:33,806 --> 00:48:39,869 need to write them. But it does show, I mean, one of the problems with video, and 537 00:48:39,869 --> 00:48:44,104 we will reflect on this, is that on the one hand, while you can put a face to 538 00:48:44,104 --> 00:48:48,980 hacking, which is great, it's not the hooded person, video has its own 539 00:48:48,980 --> 00:48:54,308 limits. Right? Often it's an individual. It's often what journalists are interested 540 00:48:54,308 --> 00:48:59,113 in. And we also have to make sure that this isn't the whole of hacking, and also 541 00:48:59,113 --> 00:49:05,194 at times use the video to tell a different story than what the video is showing. So I 542 00:49:05,194 --> 00:49:08,632 think that's a great comment. And we're going to keep that in mind, because to me, 543 00:49:08,632 --> 00:49:13,026 the collectivist, community part of hacking is one of the most amazing parts 544 00:49:13,026 --> 00:49:16,315 that never makes it into mainstream representation. 545 00:49:16,315 --> 00:49:18,591 Paula: That's right. Thank you. Herald: Thank you. 546 00:49:18,591 --> 00:49:21,458 Herald: Then we have a question from the 547 00:49:21,458 --> 00:49:29,491 Internet. First, Internet. Biella: Internet. Tell us. Talk to us. 548 00:49:29,491 --> 00:49:35,746 Signal Angel: The question from the Internet is: when covering international scenes, will 549 00:49:35,746 --> 00:49:42,125 zines like Phrack magazine be used as source material? Biella: Is Phrack magazine a source? 550 00:49:42,125 --> 00:49:45,105 Signal Angel: Yeah. Biella: Yeah. I mean, Phrack magazine. 551 00:49:45,105 --> 00:49:52,341 Remember the video that I showed around the fake French CCC? That is a larger 552 00:49:52,341 --> 00:49:59,292 project around how parts of the underground went pro and started doing 553 00:49:59,292 --> 00:50:06,196 security work. And Phrack is amazing. I mean, Phrack tells so much of that story. 554 00:50:06,196 --> 00:50:10,495 And what is also so interesting, because I've done almost 26 interviews, in- 555 00:50:10,495 --> 00:50:15,752 depth interviews, around this. And as you'd expect in many hacker circles, 556 00:50:15,752 --> 00:50:20,514 there's a lot of diversity of opinions. And the one thing that people agreed on was 557 00:50:20,514 --> 00:50:25,349 that Phrack was awesome, technically. And it brought very different types of 558 00:50:25,349 --> 00:50:32,171 people together. You know, Phrack hasn't come up in the videos because it's one of 559 00:50:32,171 --> 00:50:37,487 these things that hasn't been documented, right, so much in documentaries or film. 560 00:50:37,487 --> 00:50:41,158 And again, it points to that problem, which is, on the one hand, we're trying to 561 00:50:41,158 --> 00:50:45,888 show the faces of hacking. But we also have to make very, very clear that 562 00:50:45,888 --> 00:50:50,816 there are certain parts of hacker history that don't exist on video, so don't take 563 00:50:50,816 --> 00:50:56,683 this as the definitive sort of word or record. 564 00:50:56,683 --> 00:50:59,682 Herald: Now a question from microphone 2, please. 565 00:50:59,682 --> 00:51:06,119 Mic2: Hi, I was wondering whether you plan to expand your 566 00:51:06,119 --> 00:51:15,106 categories, if I didn't miss anything, to something, for example, as in my PhD:
567 00:51:15,106 --> 00:51:21,581 examples of hacking connected with biology, genetics and digital fabrication, 568 00:51:21,581 --> 00:51:23,716 neuro-hacking and so on. Biella: Yeah. 569 00:51:23,716 --> 00:51:30,873 Mic2: So here the CCC has a track dedicated to science, which I think is 570 00:51:30,873 --> 00:51:35,343 somehow related. Thanks. Biella: Great. Yeah. So if I recall 571 00:51:35,343 --> 00:51:40,945 correctly, I think we have 11 categories and we absolutely are expanding, and 572 00:51:40,945 --> 00:51:45,209 biohacking is one that we want to include, because actually, you know, hackers are 573 00:51:45,209 --> 00:51:49,767 creating insulin in the context of the United States, where insulin is 574 00:51:49,767 --> 00:51:54,607 ridiculously expensive. Some of the most important hacking, I think, is 575 00:51:54,607 --> 00:52:00,291 happening there. So we're absolutely going to expand by a handful. We also don't want to 576 00:52:00,291 --> 00:52:07,341 go much beyond 15 or 18. And one of the ways that we're also handling 577 00:52:07,341 --> 00:52:14,114 that is that each entry comes with tags, and then there are going to be other groupings 578 00:52:14,114 --> 00:52:20,697 around tags. But certainly, I mean, what you've seen is alive. It's alive, it's 579 00:52:20,697 --> 00:52:25,223 alive, but it's also very much beta, you know. 580 00:52:25,223 --> 00:52:29,114 Paula: And if you've also written on this topic and you have an interesting 581 00:52:29,114 --> 00:52:33,970 video, please email us, send it over. We'd be really interested to hear about your 582 00:52:33,970 --> 00:52:38,705 research. Yeah. Yeah. Herald: And then we have another question 583 00:52:38,705 --> 00:52:40,955 on mic 1, please. Mic1: Thank you. Thank you. 584 00:52:40,955 --> 00:52:47,085 My question is for Biella, and it's about: would you say that the 585 00:52:47,085 --> 00:52:55,361 work you've done on Anonymous affected the way you engage with working with video, 586 00:52:55,361 --> 00:53:03,689 after going deep into seeing how Anonymous uses video as a medium to engage 587 00:53:03,689 --> 00:53:07,417 with the public, as compared to other activist groups who are way less 588 00:53:07,417 --> 00:53:10,549 successful at that? Biella: That's great. I mean, that is 589 00:53:10,549 --> 00:53:15,838 definitely, you know... On the one hand, I always use video in my class. And it's not 590 00:53:15,838 --> 00:53:20,010 just hackers. You know, if I'm talking about Martin Luther King and 591 00:53:20,010 --> 00:53:26,819 something he said, I will show a video of what he said. Because having me repeat it 592 00:53:26,819 --> 00:53:32,661 versus having MLK on the screen, it's a lot more persuasive. And we are in a moment 593 00:53:32,661 --> 00:53:37,045 where truth is not winning the game and we have to think about our game of 594 00:53:37,045 --> 00:53:41,142 persuasion. Right? But that's a kind of side project. You're 595 00:53:41,142 --> 00:53:46,154 absolutely right, it was also Anonymous who used so many videos. Right. In a 596 00:53:46,154 --> 00:53:50,510 period where, sure, others did use videos. But it was groups like, for 597 00:53:50,510 --> 00:53:57,325 example, Indymedia, which turned 20 this year, who took videos of the world around 598 00:53:57,325 --> 00:54:02,524 us, whereas Anonymous created videos as a means of persuasion. And it was very 599 00:54:02,524 --> 00:54:08,526 powerful at the time. And I am...
I am inspired to think about how we can think 600 00:54:08,526 --> 00:54:15,762 about persuasive mediums in all contexts in order to get our message out. Because, 601 00:54:15,762 --> 00:54:21,937 again, we're not always winning in this regard. Truth can never speak on its own, 602 00:54:21,937 --> 00:54:28,733 right? And we always need adjuncts and adjuvants in order to get truth's message 603 00:54:28,733 --> 00:54:33,294 out there. And certainly it was Anonymous in part that helped me see the 604 00:54:33,294 --> 00:54:40,001 importance of video in a new way. So I'm really glad you mentioned that. 605 00:54:40,001 --> 00:54:44,435 Herald: Thank you. And then we have another question from the Internet. 606 00:54:44,435 --> 00:54:48,311 Signal Angel: Yeah, and the next question from the Internet is: how will you select 607 00:54:48,311 --> 00:54:53,514 the right curators for the entries, and how do they decide how the entries are 608 00:54:53,514 --> 00:54:58,851 presented and contextualized? Biella: All right. So, I mean, I've been 609 00:54:58,851 --> 00:55:05,350 working on hacker cultures since 1998. Paula: My journey has been a 610 00:55:05,350 --> 00:55:08,703 little bit shorter, but also about 10 years or so. 611 00:55:08,703 --> 00:55:14,396 Biella: Yeah. And so I do know a lot of people working on different topics. And 612 00:55:14,396 --> 00:55:20,281 for the first round, we invited people. And it wasn't just academics. I've 613 00:55:20,281 --> 00:55:24,270 gotten journalists, and hackers are writing some entries as well. It's just 614 00:55:24,270 --> 00:55:28,885 a little bit harder to get them to turn in their entries. But 615 00:55:28,885 --> 00:55:33,221 hopefully they will, because, again, it's not just who's been credentialed to 616 00:55:33,221 --> 00:55:39,864 talk about a topic. It's who knows about a topic, who has something to say, and who's 617 00:55:39,864 --> 00:55:45,037 willing to go through the editing process. Because while journalists generally don't 618 00:55:45,037 --> 00:55:48,813 have to go through multiple edits, because you all just really know how to write for 619 00:55:48,813 --> 00:55:55,084 the public, everyone else actually does struggle a little bit. And we do really 620 00:55:55,084 --> 00:56:00,119 try to get the entries written in such a way where we're presuming, you know, 621 00:56:00,119 --> 00:56:06,370 nothing about hackers or the video. It's not always easy, then, to write an entry 622 00:56:06,370 --> 00:56:12,045 that kind of starts from that low level. And then in terms of the 623 00:56:12,045 --> 00:56:19,431 contextualization, that's where we have three editors and curators. And I would 624 00:56:19,431 --> 00:56:26,010 actually even say four, because of our final editor, Matt Goerzen. He was an M.A. 625 00:56:26,010 --> 00:56:30,554 student under me. He's doing a big project on security hacking with me at Data & 626 00:56:30,554 --> 00:56:35,760 Society. He knows a ton. And it's precisely having many eyeballs on one 627 00:56:35,760 --> 00:56:43,047 entry that allows us to hopefully contextualize it properly. But, you know, 628 00:56:43,047 --> 00:56:47,816 again, if something seems off, people should email us. And we're also 629 00:56:47,816 --> 00:56:53,271 open to responses from the community as well; we have one response from 630 00:56:53,271 --> 00:56:58,305 Naomi. But, you know, perhaps that will kind of grow into something larger.
631 00:56:58,305 --> 00:57:02,528 Paula: So when you ask why it's us who are curating, who's curating: really, 632 00:57:02,528 --> 00:57:07,161 it's just the three of us doing this. And what kind of speaking position are 633 00:57:07,161 --> 00:57:10,398 we coming from? I mean, we're anthropologists of hacker cultures. What 634 00:57:10,398 --> 00:57:14,706 does that mean? Maybe for you guys it doesn't mean much, or it means a lot. But 635 00:57:14,706 --> 00:57:18,607 really, we've studied you guys for a long time. 636 00:57:18,607 --> 00:57:22,993 Biella: Yes. But it's also cool because, well, except for Paula, 637 00:57:22,993 --> 00:57:29,213 I mean, Chris and I have tenure, and that may mean nothing to you all. But, 638 00:57:29,213 --> 00:57:34,937 you know, hackers care about freedom and free speech, and tenure allows you to be 639 00:57:34,937 --> 00:57:36,956 free. Paula: I have tenure now. 640 00:57:36,956 --> 00:57:41,247 Biella: Oh, you do? Sweet. We are all free to kind of do what we want in interesting 641 00:57:41,247 --> 00:57:45,642 ways. And again, we're trying to experiment with mediums that go a little 642 00:57:45,642 --> 00:57:50,402 bit beyond the academic journal, which I'm totally behind. I think there are really 643 00:57:50,402 --> 00:57:53,165 good things about the academic journal. I think there are really good things about the 644 00:57:53,165 --> 00:58:00,216 book. But we have the freedom to experiment with new mediums. And so 645 00:58:00,216 --> 00:58:04,688 hopefully this new medium will kind of reach different types of publics in a 646 00:58:04,688 --> 00:58:12,429 way that academic journal articles never will. 647 00:58:12,429 --> 00:58:17,909 Herald: Are there any more questions? Paula: Party. Party. 648 00:58:17,909 --> 00:58:21,475 Herald: It doesn't look like it. So I would like to invite you for another round 649 00:58:21,475 --> 00:58:22,664 of applause for Biella and Paula. 650 00:58:22,664 --> 00:58:23,523 *Applause* 651 00:58:23,523 --> 00:58:26,075 Biella and Paula: Thank you guys, thank you so much. 652 00:58:26,075 --> 00:58:34,648 *36C3 Postroll music* 653 00:58:34,648 --> 00:58:44,259 Subtitles created by c3subtitles.de in the year 2020. Join, and help us!