*35C3 preroll music*

Herald Angel: And now to announce the speakers. These are Diego Naranjo, who is a senior policy advisor at EDRi, and Andreea Belu, who is a campaigns and communications manager, also at EDRi. EDRi stands for European Digital Rights, which is an umbrella organization of European NGOs active in the field of freedom, rights and the digital sphere. The CCC is actually a founding member of it. They will be talking about "Citizens or subjects? The battle to control our bodies, speech and communications". The floor is yours.

*applause*

Andreea Belu: This one, this one here. This is my phone. There are many like it, but this one is mine. My phone is my best friend. It is my life, and I should master it as I master my life. This is my phone. But what makes it mine? It might be quite obvious right now that I'm holding it for all of you. What is not that obvious, though, is that my phone is also holding me. On the one hand, we use our phones: we use them to connect to the Internet, get online with our friends, exchange opinions, coordinate actions. On the other hand, we are used. We are used by third parties, governmental and private, who through our phones, through our devices, monitor. They monitor our location, our bodies, our speech, the content we share. At EU level right now there is a sort of pattern, a tendency, almost a trend. Certain laws like the ePrivacy Regulation, the Copyright Directive and the Terrorist Regulation have this very central core that we call body and speech control. It looks like it is really the driving force at the moment. So in the next 40 minutes or so, we will give you short updates about these laws, talk a bit about what their impact is on us and what they mean beyond Article X and Y, and hopefully convince you to get involved in changing how they look right now.

Diego Naranjo: We represent European Digital Rights; as Walter was mentioning before, we are 39 human rights organizations from all across Europe. We work on all sorts of human rights in the online environment, so-called digital rights: data protection, net neutrality, privacy, freedom of expression online and so on. Andreea and I are glad to be here at 35C3 for the very first time.

Andreea: Now to continue that quote, that adapted quote from Full Metal Jacket: my phone without me is useless. Without my phone, I am useless.
We spend most of the seconds of our lifetime around devices that are connected to the internet, whether a phone, a computer, a fridge or whatnot. This means that these devices pretty much become attached to our bodies, especially a phone. Tracking these devices is therefore equal to tracking our bodies. Controlling our bodies. For the purpose of this presentation, we will talk about online tracking in terms of location tracking, the tracking of our devices; behaviour tracking, the tracking of users on websites - how much time they spend on which part of a website, where they navigate next, how many clicks they make; and the tracking of communications sent between two devices.

Diego: First, location tracking. On average there are more screens in most households than there are people. We carry some of these devices in our pockets, and they hold more personal information than most diaries used to. Our phones need to be tracked because they need to be able to receive and send calls, messages and data. But this of course opens new ways to use location data for commercial purposes, but also for state surveillance.

Andreea: When it comes to behavioural tracking, tracking our behaviour online provides a lot more information than just location; it adds on top of it, right? A user can then be targeted according to that tracking. This tracking-and-targeting process basically represents the business model of the Internet nowadays. For this reason, the more complex and detailed someone's profile is, the more accurate the targeting can be, the more effective and efficient most of the time, and therefore the more valuable the data about the profile is. You can see here a really cool infographic from Cracked Labs about Acxiom's and Oracle's profiling of populations. You see the number of variables, the amount of information and the depth it goes to, and you get that business model, a cash flow.

Diego: And this business model is quite interesting. I wouldn't imagine a postman going to my physical mailbox at home, going through my letters and then pouring in some advertising leaflets according to what he reads. But right now that is what Gmail and many other services do: they live off, as you well know, reading your emails to sell you stuff that you don't really need.
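As an illustrative aside, here is a minimal sketch of the kind of behavioural-tracking record Andreea describes above - time spent on a page, clicks, where the user navigates next - and of how such events accumulate into an ever more detailed, and therefore more valuable, profile. The class and field names are hypothetical stand-ins, not any real tracker's schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PageEvent:
    # One observed interaction: which page, for how long, how many clicks,
    # and where the user went next.
    device_id: str          # ties the behaviour back to a device, i.e. a body
    url: str
    seconds_on_page: float
    clicks: int
    navigated_to: str

@dataclass
class Profile:
    device_id: str
    events: List[PageEvent] = field(default_factory=list)

    def add(self, event: PageEvent) -> None:
        self.events.append(event)

    def detail(self) -> int:
        # Crude proxy for how detailed - and hence how valuable for
        # targeting - the accumulated profile has become.
        return len(self.events)

profile = Profile(device_id="device-1234")
profile.add(PageEvent("device-1234", "https://example.com/news", 42.0, 3,
                      "https://example.com/shop"))
print(profile.detail())  # 1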
Reading Facebook conversations through the API is now an option: they want to read those conversations in order to find patterns, for example intellectual property infringements, especially counterfeiting but not only that, also copyright. WhatsApp metadata is now also used by Facebook in order to know who your friends are, who you contact, who your family is, so that those social media services gain more and more power, more and more data, and of course more and more profit. The Lives of Others, right? A good movie. I guess everyone has seen it; if not, you should. It is basically about a Stasi agent who follows the life of an individual over a period of time. This movie has in a way changed from drama to soft comedy, because the capability of surveillance services to surveil all of us and gather so much data, and of companies to get so much intimate information from us, has doubled, tripled, or perhaps grown exponentially compared to what the Stasi could do back then. And all of this, to some extent, is going to be regulated by a regulation with a very, very long name. I guess half of you would fall asleep just reading through it, so I'll go through it quickly to let you know what it is about: why we need to know about the ePrivacy Regulation and why it is important for the control of our bodies and our devices. ePrivacy is about online tracking. You might have heard of the cookie directive, the one that brought all those cookie banners to your screens, in part due to a bad implementation of that directive. It is also about your emails: who is going to be able to read your emails or use the data from your emails to sell you advertising, and how confidential that information can be. It is also about your chats: how you communicate nowadays with WhatsApp, Signal, Wire or any other service. And finally it is also about location data: who can track you, why they can track you, and what safeguards need to be put in place to protect your privacy. I can imagine many of you saying, well, don't we already have this GDPR thingy from the emails I received in May? Yes, we do have that. But the GDPR was not enough. After more than four years of discussions to achieve this General Data Protection Regulation we have achieved a lot, and the GDPR was our best possible outcome in the political scenario at the time.
There is a lot to do regarding the implementation, though. We have seen problems in Romania and in Spain, and we expect that to happen in many other places. But we still need a specific instrument to cover the right to privacy in electronic communications, including everything we mentioned before: metadata, chats, location data, the content of your communications and so on.

Andreea: So ePrivacy is basically meant to complement the GDPR and focus on exactly the topics that Diego mentioned. What did we advocate for, and still do? Privacy by design and privacy by default should be the core principles, the pillars of this regulation. Moreover, politicians need to recognize the value of maintaining and enhancing secure encryption. Cookie walls: we should be able to visit a website without having to agree to being tracked by cookies; this is another topic that we strongly advocated for. Finally, content should be protected together with metadata, in storage and in transit. And we actually succeeded, at the end of 2017. The Parliament adopted a very good, very strong text. It addressed most of the problems that we pointed out and supported the values we were pushing for. But this has been quite a ride; it wasn't easy. As Diego said, we are a network of 39 organizations. We are not just legal people or tech people, it's a combination of both. So when we provided our input in the shape of analyses or recommendations, some of them bulleted there, all sorts of skills were combined. And this played a big part in our success: the fact that we were able to provide a comprehensive yet complex analysis of what encryption should look like and how cookies should behave, and also a legal analysis of existing legislation. The diversity of our skills became productive.

Diego: Did we win? Well, we are on our way. After the EU Parliament adopted its position, it now needs to enter into discussion with the member states in what is called the Council of the EU. The Parliament, with a strong position, now has to talk with the rest of the member states. Currently the negotiations around ePrivacy are not really moving forward. They are being delayed by the national governments, who claim that there are issues that need to be tackled:
that it is very technical, that we already have the GDPR and need to see how it is implemented. Member states also fear that another layer of protection might keep some businesses from growing in the European Union. And if that were not enough, they are also afraid of getting bad press from the press, which right now depends to a high extent on behavioural advertising. The press says that without tracking all over the internet it is unable to sustain its business model. And of course, since the national governments, the politicians, are afraid of that bad press, they are quite cautious about moving forward.

Online we exercise our free speech in many ways, and one of those ways is the way we produce, share or enjoy content online. Our opinions, and the people with whom we communicate, can at a given time be seen as a threat by certain governments. We have seen that trend in governments such as those of Poland, Hungary and, to a certain extent, Spain. All of this information can also be very profitable, as we see with the mainstream social media platforms we mentioned before. So there are political and economic reasons to control speech, and the best way to control speech is to control the way content is shared online. Right now there are two proposals that pose a huge threat to freedom of expression online. Both propose upload filters, by increasing liability for platforms, that is, by making platform companies responsible for the content they host. One of them is the famous, or infamous, Article 13 of the Copyright Directive proposal. The second is the regulation to prevent the dissemination of terrorist content online. Both of them, as you will see, are just another way to make private companies the police and the dad of the Internet.

This is the first one: a proposal, again with a long name. Let's just stick to the short name, the Copyright Directive. This Copyright Directive is based on a fable, and the fable goes like this. There is a wide range of lonely, poor songwriters in their attics trying to produce songs for their audience. Then there are these big platforms, mainly YouTube but also others, that allow these uploads and make a profit. And these platforms give some pennies, some small amount of money, to these authors. The difference between what the authors earn and what they should be earning is what they call the value gap.
The fable, though, conveniently hides the fact that the music industry keeps saying, year after year after year, that its revenues are increasing by a high percentage every year, and that this keeps growing, especially in the online world. What is the solution to this problem? Well, as you can imagine, it's a magical algorithm. This algorithm will filter each and every file that you upload to these platforms, identify it, match it against a database, and block or allow the content depending on whether it is licensed or not, on whether they like you or not, and in the end according to their terms of service. As we will show, there are some technical and legal problems with upload filters. In essence, if they are implemented, YouTube and Facebook will officially become the police and the dad of the internet.

The other big fight we have is around terrorism, or to be specific, terrorist content online. After the Cold War, once communism fell, we needed a new official enemy. Terrorism is the new threat. It is very real to some extent; we lived through it in Brussels recently. But it has also been exaggerated and inserted into our daily lives. We see that in airport controls, in surveillance online and offline, in restrictions on freedom of assembly and expression all over Europe. And whenever a terrorist attack occurs we see pushes for legislation and measures that restrict our freedoms. Usually those restrictions stay even after the risk or the threat has disappeared or been reduced. Again, here we go with a long name. Let's stick to the short name: TERREG, the regulation to prevent the dissemination of terrorist content online. This proposal allegedly aims at reducing terrorist content online - note: not illegal content, terrorist content - in order to reduce the risk of radicalization. Yet what we have seen through experience is that a lot of radicalization happens outside the online world, and that radicalization has other causes which are not online content. It seems the politicians need to send a strong signal before the EU elections: we need to do something strong against terrorism, and the way to do that is through three measures, as we will see in a minute.

First, Speedy González takedowns, my favorite. Platforms will need to remove content which has been declared terrorist content by some competent authorities.
And this definition of terrorist content is of course vague, and also incoherent with other relevant pieces of legislation which are already in place but not yet implemented all across the EU. The removal needs to happen within one hour; this is fast-food principles applied to the online world, to audiovisual material. They give you some sort of complaint mechanism, so if you have a problem with your content being taken down, you can go and say "this content is legal, please put it back". But in practice, as you read it, you will see that it is likely to be quite ineffective. First of all, overblocking will not be penalized: if they overblock legal content, nothing will happen; if they leave one piece of illegal content on their platform, they will face a sanction.

The second issue is that of measures for voluntary consideration. According to this second measure, states will be able to tell platforms: "I have seen this terrorist content on your platform. This looks bad, really really bad. So I really felt I had to ask you: could you be so kind as to have a look, just if you wish? Of course, no worries." And the platform will then decide, according to its own priorities, how to deal with this voluntary request.

Third, good old upload filters, the third measure they are proposing. Upload filters, or general monitoring obligations in legal jargon, are prohibited under EU legislation. But anyway, let's propose them and see what happens. And in order to be able to push them into the legislation, let's give our filters an Orwellian twist and call them something different: proactive measures. Platforms will need to proactively prevent certain content from being uploaded. How will they prevent this? Upload filters, of course. I mean, proactive measures. And whether it is copyright or terrorist content, we see the same trend, the same one-size-fits-all solution: a filter, an algorithm that will compare all the content that is uploaded, match it against a certain database, and block it or not. We will need many filters, not only one: filters for audio, for images, for text, and also one specifically for terrorist content, however that is defined. So this is basically the principle of law-making today: we really want filters, what can we invent to get them?

Andreea: So we've got an issue with filters. Well, quite a few issues, but by and large one big issue.
First of all, they are illegal. The European Court of Justice put it like this: "A social network cannot be obliged to install a general filtering system, covering all of its users, in order to prevent the unlawful use of musical and audio-visual work", in the case SABAM versus Netlog. Despite this, automated content filters are apparently fine, because they are supposedly not "general filtering covering all of its users". And of course there are the technical issues.

Diego: Yeah, there are some technical issues. One of the best, most magnificent examples of how filters do not work: James Rhodes, the pianist, a few weeks ago tried to upload a video of himself playing Bach in his living room. The algorithm detected copyrighted content owned by Sony Music and automatically took the video down. Of course he complained and got the content back, but it is a good example of how filters do not work, because a piece by Bach, who died almost 270 years ago, is of course out of copyright. And if the video of a famous artist is taken down, we can imagine the same happening to much of your content.

Andreea: So not only do filters not recognize what is actually copyrighted and what is not, they also don't recognize exceptions such as remixes, caricatures or parodies. When it comes to copyright, filters can't tell, and this is why memes were a central part of the protest against Article 13. And this is why, as we will show soon, this kind of filter has huge potential as a political tool. Another issue with automated content filters is that they don't recognize context either. When it comes to hate speech or terrorist content, they can't tell nuances. A girl decided to share her traumatic experience of receiving a lot of insults in her inbox from a person who hated and threatened her, so she copy-pasted them into a post on her Facebook account, and her profile was taken down. Why? Because the automated solutions can't tell that she was the victim, not the perpetrator. And this is very likely to keep happening if this is the "solution" put forward.

Diego: That is also a problem for SMEs, of course, because these tools, these filters, are very expensive. YouTube spent around $100,000,000 to develop Content ID, which is the best - or worst - filter we have online now.
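Another illustrative aside: a minimal sketch of the matching step such filters rely on, assuming a purely hypothetical fingerprint database. Real systems like Content ID use perceptual fingerprints rather than a plain hash, but the structural point is the same: the only question the filter answers is "does this match a registered work?", so licence status, parody and context never enter the decision.

import hashlib

def fingerprint(data: bytes) -> str:
    # Reduce an upload to a fingerprint; everything else about the file is
    # discarded. (A plain cryptographic hash stands in here for the
    # perceptual audio/video fingerprints real filters use.)
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints registered by rightsholders.
CLAIMED_WORKS = {
    fingerprint(b"<bytes of a claimed recording>"): "claimed recording (hypothetical)",
}

def filter_upload(data: bytes) -> str:
    # Block or allow based only on whether the fingerprint matches a claimed
    # work. Licence status, parody, quotation and context are not inputs.
    match = CLAIMED_WORKS.get(fingerprint(data))
    return f"BLOCKED ({match})" if match else "ALLOWED"

# With this plain hash only a byte-identical re-upload matches; a real
# perceptual fingerprint also matches re-recordings and excerpts, which is
# how a licensed performance or a parody gets blocked just like an
# infringing copy: the lookup result is the same, context never enters it.
print(filter_upload(b"<bytes of a claimed recording>"))    # BLOCKED (...)
print(filter_upload(b"<bytes of an original recording>"))  # ALLOWED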
So we can imagine how this is going to go for European SMEs that will need to copy that model, probably by licensing those filters, I imagine, in order to implement them online. In the end this will just empower the big companies, who already have their filters in place and will keep doing business as usual, while new companies that would like to develop a different business model will be prevented from doing so, because they will need to spend a lot of money on these filters.

Andreea: Then there is the issue of privatized law enforcement, the privatization of law enforcement. Attributes change: tasks that used to belong to the state are shifted over - "Can you take care of it?" - to entities that are not driven by the same values that a state should at least be driven by. I'll give you one example, from a project called the Demon Dollar Project, a study commissioned by the Parliament to look at the definition of hate speech in different EU member states. Their conclusion: there are huge disparities between what hate speech means in Germany, what it means in Romania, and what it means in the UK. So in this context, how can we ask a company like Google or Facebook to find *the* definition? Are their terms and conditions the standard that should shape our legal definitions? Am I the only one seeing a conflict of interest here?

Diego: And there is a problem there: once we have these filters for copyright infringement or any other purpose, like terrorist content, we will of course have them as a political tool. Once you have them for copyright, why would you not use them to look for dissidents in every country? These things change very often; I see that in Spain, but I see it all across the EU nowadays. So once we have them in place for one small thing like copyright, why not for something else, something more political?

Andreea: There is a really interesting example from Denmark, about a year or a year and a half ago. The Social Democrats announced their immigration plan. They made a video in which Mette Frederiksen talked about how great the plan is, and so on. Some people were happy, some sad. Some of the sad ones decided to criticize the plan and made a video about it, a critique in which they caricatured her, using two audio bits from the announcement video.
The Social Democrats sent a letter to the NGO accusing it of copyright infringement and threatening a lawsuit. Obviously the NGO thought: "Yeah, we don't really have enough money to go through a big court case, so we're just going to take the video down." And they took it down. Now, why is this case important? If an automated content filter for copyrighted material had been in place, the Social Democrats wouldn't have had to lift a finger; the job would have been done automatically. Why? Automated content filters can't recognize exceptions such as parody. And this is a very clear case of how copyright claims can be strategically used to silence critical voices in the political sphere.

Diego: So we see a few threats to fundamental rights. First, privacy: every piece of content has to be scanned so that it can be allowed or discarded. Then we will live in a sort of black-box society, and that will affect freedom of speech. We will also face overcensoring, overblocking and chilling effects, and these tools are going to be repurposed as a political tool. In a nutshell, rights can only be restricted when there is a proven necessity, when the measure is proportionate, and when the measure is also effective. These filters are not necessary for the ends they want to achieve, they are not proportionate, as we have seen, and they are not effective, as we have also seen. So, in effect, they are an unlawful restriction of freedom of expression and privacy rights.

Andreea: Now, obviously we were also unhappy about all this, and I mentioned before how we organized within our network to fight for a strong ePrivacy Regulation. When it comes to copyright, this fight went beyond our network. It got a lot of people mad: librarians, startups, the UN Special Rapporteur, all of those there, basically, and more. Even YouTube, in the end, thought about endorsing our campaign. What we learned from these fights is that we really need to share knowledge among ourselves. We need to team up, coordinate actions, be patient with each other. When it comes to different skills, it is important to unite them. When it comes to different perspectives, it is important to acknowledge them. If we are separate individuals, by ourselves, we are just many; but if we are together, we are one big giant. That is where the impact lies. Now, this is basically a call to you.
If you are worried about anything we've told you today, if you want to support our fight, if you think that laws aimed at controlling our bodies and our speech should not be the ones ruling us and our internet, then I think it's time to get involved. Whether you're a journalist writing about privacy or other topics, whether you're a lawyer working in a human rights organization, whether you have a technical mindset, or whether you have no clue about laws or anything like that: come talk to us. We will have two workshops, one on ePrivacy and one on upload filters, where we will answer any further questions you have that you can't ask today and try to put together an action plan. We also have a cluster called "about:freedom" - you can see it there; it is right by the info point in SSL. Do you have any questions or comments? Thank you.

*applause*

Angel: There is ample time for Q and A, so fire away if you have questions. Go to a microphone, wave your hands. Signal Angel, are there any questions from the internet? Nope. Uh, microphone number one!

*inaudible question from the audience*

Diego: So the question is: if the content is encrypted, how will companies be obliged to implement these filters?

Microphone 1: Yes.

Diego: Good question, I don't know. I don't think that's going to be possible. They would have to find a way to do it, because either they ban encryption in the channels or - it doesn't matter, because they will make you liable. If you run a platform with encrypted channels and for whatever reason they find copyrighted content which is not licensed, which you are not paying money for, that will make you liable. Perhaps in practice they will not be able to find you and make you liable, because they will not be able to access the content, but if they find a way to do so, they will make you pay.

M1: Thank you.

Angel: Okay, microphone number two.

Microphone 2: Thank you very much for the presentation. You have been talking a lot about upload filters. A lot of the telcos and lobbyists are saying that upload filters don't exist, that the trilogue on the copyright reform is, as I've heard, ending in January, and that there will be a solution in the European legislative process. How will we be able to inform this process and influence it, to try and make it better before the European elections?
Diego: Well, we still have time; that's why we are here. One of our main goals in coming to 35C3, apart from enjoying the conference for the very first time, is to mobilize all of those who have not been mobilized yet. Thousands of people have been active: they have been tweeting, they have been calling their MEPs, the members of the European Parliament, they have been contacting the national governments. We still have time. The vote will be some time around January or February; we still don't know, and we are afraid it is going to be sooner than expected. But this is the last push, the last push to say no to the entire directive and no to upload filters. And that's why we are here, because we still have time. Worst-case scenario: we go to the implementation phase, of course, we go to a national member state and say "Do not do this, this goes against the Charter of Fundamental Rights", and we stop it there. Better now, which is my hope, but in the worst case we will stop it for sure in the member states.

*applause*

Angel: Microphone number one.

Microphone 1: You talked about voluntary measures that companies are somehow asked to implement. What are the incentives for the companies to do so?

Diego: What do the companies have to do with that?

M1: Why should they do that? Because it's voluntary, you said.

Diego: Well, they could do it for different reasons, because they could get bad PR. Imagine you are a small company in Hungary and Orbán comes and tells you: "You need to block this because I think this is terrorism. It comes from a human rights organization." What would you do, if you understand me? That depends perhaps not on the government but on the general structure. You could get bad PR from the government, perhaps also because you are not acting promptly on this serious content, but it's true that it is only for your voluntary consideration.

Angel: Again microphone number one.

Microphone 1: Thanks for the talk. So, when I see a problem I also think: oh, there is a technical solution. It's hard for me to admit that maybe there isn't. But it does look like that's the case here. Also, you mentioned the workshops - I mean, anybody can come, but they are maybe more for people with a legal background. I don't have one.
I'm a developer, and I want to understand how the system works. I understand a little bit about the European process and the regulatory process, but not much. So what is the most efficient way for me, as a developer, to get a better grasp of how this system works, how all those laws and regulations get implemented, and all the different steps?

Diego: Well, yeah. We didn't come to the Lawyers Computer Congress, we came to the Chaos Computer Congress, so we can make chaos out of it. We need developers, we need lawyers, we need journalists, we need graphic designers, we need people with all sorts of skills, as Andreea was saying before. And we need developers to build tools that work, so we are capable of putting together any calling tool, tweeting tool, or any other sort of tool we can use to carry our message and take it to Brussels, to the members of the European Parliament, to the national member states. We really need you; if we need something, it's developers. We have enough lawyers in this world - I think we have too many, myself included - so we need you tomorrow and the day after tomorrow.

Angel: Okay. Any other questions? In that case, I'll ask one myself. Andreea, what would be a good start at the member state level, if you've never campaigned before?

Andreea: What, what? Can you please repeat?

Angel: What would be a good start, if you wanted to campaign at the member state level...?

Andreea: ...and never campaigned before.

Angel: Yes. Campaigning for Dummies.

Andreea: Well, we've got a lot of organizations in EU member states. As a person who had never campaigned before and was looking for someone to campaign with two years ago in Denmark, I was advised to look for the Danish EDRi member. So I did, and we managed to organize a lot of great workshops in Denmark where nothing had existed before, because IT-Pol, the Danish member, had a very deep grasp of the political environment. Most EDRi members understand how the dynamics work, politically but also with journalists, and what the interests in their particular country are. So I would say: find your nearest EDRi organization, that is the first step, and then unite with the rest.

Diego: And there is another way. Remember you can always contact consumer organizations. You can contact your members of parliament directly.
You can organize yourselves, two or three friends, and make a few phone calls; that alone is already something you can do. There are many ways for you to help out.

Andreea: Of course, make sure you contact your country's MEPs at the European level. We are being represented, and we actually get to elect the parliamentarians; they are the only ones who are elected by us and not just proposed by governments or other politicians. So if we want to stay connected to our own member state but influence a law at the European level, like the ones we talked about, it is very important to let our MEPs know that we are here, that we hear them, and that they came from our country to represent us at EU level.

Angel: Thank you. Any other questions? Signal Angel, do we have questions from the Internet?

Signal Angel: Unfortunately not.

Angel: Well, in that case we're finished. Thank you all for your attention.

*applause*

Subtitles created by c3subtitles.de in the year 2020. Join, and help us!