Dear viewer, these subtitles were generated by a machine via the service Trint and are therefore (very) buggy. If you are capable, please help us create good-quality subtitles: https://c3subtitles.de/talk/376 Thanks!

Yes, so today my talk is about hacking ethics in education, and I deliberately gave the title several different meanings. First I'll talk about hacking in education: a little bit of background on why we do hacking and what kind of hacking we do. Then I'll talk about hacking ethics in education: how people feel about the hacking and how they work on it. And then I'll talk about hacking ethics into education, where I'll cover the different viewpoints: those of the students, the teachers and, most importantly, the administration of the university.
Hacking in education: we have a System and Network Engineering master's, where I'm a teacher, and this started at the University of Amsterdam in 2003, so we have been doing this for eleven years. It deals with various subjects in system and network engineering: we educate networking experts, but also forensics experts, and we have various security-related courses throughout the year. There are many very intensive research projects where the students work on a single topic for one month, and they really delve deep into a particular subject. Some examples of what they do within a month or six weeks, during a course: one of the first hacks of chip cards, the Dutch public transport chip card, was done by students in our education.
We did a hack on Tinder, which I may talk about if I have some time. We found vulnerabilities in mobile banking apps that students worked on, and the Dutch passport chip was cloned by a student of ours.

Now, if we talk about hacking ethics in education: what do students think they do when they're hacking? They feel very powerful and think they can hack anything. They want to break stuff, they think they have the knowledge, they're really into hacking, and they want to scrutinize the security of products. Their first aim is to break stuff; they don't want to see whether something is secure or not, they want to break it. If you look at what the teachers think the students do: they're shooting at these products, but most of the attempts aren't very convincing yet, because most of these things are reasonably secure and well thought out.
It's interesting that the students learn about examining the security of different products, and it's good that they start shooting at them and see what that feels like, but most of the attacks aren't really effective. That doesn't mean nothing is gained: the hacking itself may not be effective, but the experience for the students is very valuable.

What the administration thinks the students do is different: they're very scared. They think these students are all elite hackers who break stuff in very nasty ways that the teachers don't see.

Now, the problem with this is that even if you're just shooting at things, sometimes you hit the right spot and you break it. This is what happened with the mobile banking security. We thought: we can let the students look at mobile banking security; the banks probably have their act together.
The mobile banking app had even been audited by a very professional company. But the students looked at it for a day or two and it was completely broken. They scared the banking official during the demonstration by showing him his own PIN number.

But even if you're not breaking stuff, the people you're shooting at can get very annoyed. And when they get annoyed, very often the old boys' network kicks in: the upper staff of the company contacts the upper staff of the administration, and they panic and freak out.

And that's when we started with the hacking ethics in education.
We wanted to show the administration that the students really are ethical hackers, that they think about what they're doing. The administration freaked out and did what university administrations always do when they freak out: they formed a committee. They formed an ethical committee for the computer science faculty. This was for the whole faculty, and it covered all non-standard research. The standard research that each of the research groups does doesn't have to go through the whole ethical evaluation procedure, but if you do anything out of the ordinary, you should go through the ethical committee. The problem with this ethical committee is that it's formed by group leaders of the computer science faculty, and the problem with group leaders is that they have very busy agendas.
So the ethics committee doesn't meet very often, and they don't have a lot of time to review projects. And like I mentioned earlier in my talk, we have a one-year programme, and it's a very intensive programme. When students are doing projects, we don't have a lot of time to review those projects; review times measured in months don't work with our education. Finally, there was also a lawyer on the ethics committee, but there is no ethicist on it, which I find a bit strange, because I wonder what makes it an ethical committee. But they think it is one. So we had to deal with this, because this was the ethical committee for our computer science faculty, and they imposed on us that all of our projects had to go through them.
And then we said: well, maybe we can hack the process, and form a committee ourselves. So we formed an ethical committee for the master education, and we wrote a procedure for making the students think about the ethical aspects of their security research. When they do a research project or a security project, the first thing they have to do is write a proposal for what they're planning to do. This is so that we have some idea that they know what they're doing, that we think it's a valuable project, and that it's interesting enough for the course that they can get a passing grade at the end. But we changed this so that they now also have to write an ethical paragraph in that project plan. So at the end of the project plan, they have to state: what are the ethical impacts of this project?
Who is going to get harmed if you actually break stuff? Are you impacting the privacy of user data? Are you getting access to user data? These kinds of things they have to write down and defend to the teacher of the course. Then the teacher, together with the ethical committee of the master education, which is a very, very small ethical committee, meets and evaluates all of the project proposals the students handed in. First of all, we check whether the ethical paragraph they handed in is actually complete: whether they identified all the major issues, all the things that could go wrong, all the actors that may get involved.
And then we give it a traffic-light status, going from green to red. Green means there are basically no ethical considerations involved: an evaluation of a security tool, or an offline analysis of something, where they don't come into contact with any user data and there is no impact on user privacy. Yellow is when they may come into contact with personal data, but in a very confined way, or we don't think they will break stuff, and if they do, it will not have a very high impact on society. Orange is where things get a little bit serious: they possibly have access to a lot of privacy-impacting user data, or if they actually manage to break things, it is going to have a very high security impact.
And red is a status that we use for projects that actually cross the line but may be important to do anyway. Sometimes you have a project where you're going to cross an ethical line, but the problem is so important or so politically relevant that you think the project has to be done anyway. We agreed with the ethical committee of the faculty that any project that is rated red, we report to them before we start the project, so that we get permission to do it.

And this has worked very well over the last year. From the students' standpoint, we've now forced the students to think about ethics while they're doing the research, and this has led to very valuable insights for them.
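As an aside, the traffic-light procedure just described can be summarized as a small decision rule. This is my own sketch of the criteria from the talk, with invented attribute names, not code the master programme actually runs:

```python
from enum import Enum

class Status(Enum):
    GREEN = "no ethical considerations; offline analysis, no user data"
    YELLOW = "confined contact with personal data, low societal impact"
    ORANGE = "broad access to privacy-impacting data, or high impact if broken"
    RED = "crosses an ethical line; faculty-committee permission needed first"

def classify(touches_personal_data: bool, broad_data_access: bool,
             high_impact_if_broken: bool, crosses_ethical_line: bool) -> Status:
    """Map a project's ethical-paragraph answers to a traffic-light status.

    Checks the most severe condition first, so a project gets the
    highest (most cautious) status that applies to it.
    """
    if crosses_ethical_line:
        return Status.RED
    if broad_data_access or high_impact_if_broken:
        return Status.ORANGE
    if touches_personal_data:
        return Status.YELLOW
    return Status.GREEN
```

In practice, of course, the assessment is a human discussion between the students and the committee, not a mechanical rule; the sketch only captures the ordering of the four statuses.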
And it's been a very good experience for them, too. It also means we force them to think about the ethics before they start doing the research: they do ethics by design in their research plan and in how they set up experiments, and they think about privacy when they're setting up experiments with user data.

This is something that wasn't really the case before. We had some ethics courses and ethics classes during the year, but they were mostly abstract. We would start discussions like: can you take an old computer home from your company? Can you look into users' email boxes? You have the discussion, but it stays at a somewhat abstract level that doesn't really touch the students at that moment, and then it's hard to see whether you really made an impression about the ethics of that issue.
But now, by incorporating it completely into the research, they're forced to think about it, and this impacts them in a very profound way, because they have to think about their own research, the research they're doing right now: how do I do this the right way, and what is the right way?

Of course, they're still students, so we work with them to help them see these kinds of issues. And, like I mentioned, the traffic-light status also means that the orange projects get more supervision by the ethics adviser: a little bit closer interaction, we monitor what they do, and we take care that they don't cross the line.

Finally, at the end of the line, if they do find issues, we do a coordinated disclosure, or responsible disclosure as it's also called, and this is done by the teachers of the education.
This is a conscious choice, because we find that coordinated or responsible disclosure is not really that widely accepted yet. It's getting better, but it's not there yet, and it usually takes a very long time. We had some projects before the summer, and it took more than two months to finish the whole coordinated disclosure process. That is simply time the students don't have, and if they're doing a project at the end of the year, they're basically gone, so we have to make sure that the procedure is still finished.

I have some time for the extra slides. Like I said, Tinder: this was very interesting research, and it showed very well how privacy by design and ethics by design worked in this research. A little bit of background, starting in July 2013. Well, everybody knows Tinder, right?
Who doesn't know Tinder?

OK. Tinder is a dating app, and a very shallow dating app. This is the interface, and the way you use it is you swipe either left or right, saying you like this person or you don't. If you both like each other, you get into contact with each other, you can start chatting, and maybe go on a date or something.

The power of Tinder is that it uses GPS to find matches near you. It doesn't really make sense to get a match that's in America; the chance that you actually meet is very slim. So they need the GPS coordinates of the users in order to find matches near you.

The problem was that in July 2013, somebody looked at the traffic the app was sending and found that Tinder was actually sending the GPS coordinates of other users to your phone, in order to measure the distance between the two users.
This is, of course, a bit of a privacy problem. So they fixed it. Then in February 2014, somebody started looking at this again. Tinder had fixed it by sending the exact distance to the other person instead. But the problem with this is that the API Tinder exposes is effectively open: you can just write scripts that query the API. So you take three different measurement points, do trilateration, and you still find the exact location of the user. This was all external research. Then in May 2014 our students started looking into it, because the way Tinder then tried to fix it was by implementing rounding: if a match is very close to you, they say this user is within one kilometer.
But the API actually exposes a lower boundary that you can set much lower than one kilometer, and if you do so, you get different matches. So the students started looking into this, and it turns out that Tinder may say a match is within one kilometer, but by using the lower bound you learn it's actually within 500 meters, or within 300 meters, or something like that. You need a little bit more querying, but you still find the user's location.
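To make the trilateration step concrete, here is a minimal sketch of the geometric core of such an attack: given three probe positions and the distance reported from each, solve for the target's position. This is my own illustration, not the students' actual code; it assumes exact distances on a flat plane, whereas the real attack had to narrow down the rounded distances the API reported before doing this step.

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Find the (x, y) point at distance r_i from each probe point p_i.

    Uses a local flat-plane approximation, which is fine at city scale.
    Subtracting the circle equations pairwise cancels the quadratic
    terms and leaves a 2x2 linear system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the three probes are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With rounded distances you would instead get a small region per probe rather than a point, and intersect those regions; more probe positions shrink the region further, which is the "little bit more querying" mentioned above.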
515 00:19:01,510 --> 00:19:02,949 They were at the university, and at the 516 00:19:02,950 --> 00:19:04,569 university, students are very, 517 00:19:06,160 --> 00:19:08,739 very enthusiastic users of Tinder. 518 00:19:08,740 --> 00:19:10,329 So it was very hard to find their own 519 00:19:10,330 --> 00:19:11,330 test user. 520 00:19:13,900 --> 00:19:15,459 There were too many users. 521 00:19:15,460 --> 00:19:17,529 So what we 522 00:19:17,530 --> 00:19:19,779 ended up doing — and, uh, I worked with 523 00:19:19,780 --> 00:19:21,489 the students to think about the problem 524 00:19:21,490 --> 00:19:23,619 and about how to solve this 525 00:19:23,620 --> 00:19:25,719 in a privacy-friendly way — 526 00:19:25,720 --> 00:19:28,569 we finally resolved it 527 00:19:28,570 --> 00:19:30,309 by only storing the hashed ID of 528 00:19:30,310 --> 00:19:32,409 the users that 529 00:19:32,410 --> 00:19:34,599 they found at a location, and no other 530 00:19:34,600 --> 00:19:35,600 data. 531 00:19:39,100 --> 00:19:41,679 And in this way, you can find 532 00:19:41,680 --> 00:19:44,109 the matches and you can 533 00:19:44,110 --> 00:19:45,519 verify the results 534 00:19:46,690 --> 00:19:48,759 without infringing on 535 00:19:48,760 --> 00:19:49,930 the privacy of the users. 536 00:19:51,910 --> 00:19:53,889 So this was a very good example of doing 537 00:19:53,890 --> 00:19:55,750 privacy by design and 538 00:19:56,890 --> 00:19:58,989 ethics by design in 539 00:19:58,990 --> 00:20:00,579 security research. 540 00:20:02,890 --> 00:20:05,199 And that is actually what I wanted 541 00:20:05,200 --> 00:20:07,449 to say and what I wanted 542 00:20:07,450 --> 00:20:09,159 to tell you about our ethics 543 00:20:09,160 --> 00:20:10,160 committee. 544 00:20:22,040 --> 00:20:24,139 Thank you very, very much for this 545 00:20:24,140 --> 00:20:26,179 interesting talk. 546 00:20:26,180 --> 00:20:28,199 Do we have some questions? 
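[Editor's note: the privacy-preserving storage the speaker just described — keeping only a hashed user ID per observed location, and nothing else — might look like the sketch below. The keyed-hash choice, the `pseudonymize` helper, and the example IDs are illustrative assumptions, not the students' actual implementation.]

```python
import hashlib
import hmac
import os

# A per-experiment secret key: raw IDs cannot be recovered from the
# stored digests without it, but the same ID always maps to the same
# digest, so results from repeated queries remain comparable.
SECRET = os.urandom(16)

def pseudonymize(user_id: str) -> str:
    # A keyed hash (HMAC-SHA256) acts as a one-way pseudonym for the ID.
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

# Store only pseudonyms per query location, never raw profiles.
seen_at_location = {pseudonymize(u) for u in ("alice", "bob")}

# A later query returning "alice" can be matched against the stored
# set, while the stored data itself does not reveal who "alice" is.
print(pseudonymize("alice") in seen_at_location)
```

This lets the researchers verify that two queries returned the same nearby users without ever keeping a dataset of identifiable Tinder profiles.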
547 00:20:28,200 --> 00:20:30,259 So please line up at the 548 00:20:30,260 --> 00:20:31,279 microphones now. 549 00:20:31,280 --> 00:20:33,559 We have some time for a Q&A. 550 00:20:34,850 --> 00:20:35,900 Microphone two, please. 551 00:20:36,980 --> 00:20:39,109 Did your institution, I guess in this 552 00:20:39,110 --> 00:20:41,239 case school, did they see any value 553 00:20:41,240 --> 00:20:42,739 in what you were doing, or did they only 554 00:20:42,740 --> 00:20:43,880 perceive you to be a threat? 555 00:20:47,060 --> 00:20:49,189 It took a while. So we started — this 556 00:20:49,190 --> 00:20:51,349 was last year — and it took 557 00:20:51,350 --> 00:20:53,149 a while for them to see the value of this 558 00:20:53,150 --> 00:20:54,150 approach. 559 00:20:56,060 --> 00:20:58,379 But now that 560 00:20:58,380 --> 00:21:00,049 I was enthusiastic and that we are 561 00:21:00,050 --> 00:21:02,119 actually getting very good results, they see 562 00:21:02,120 --> 00:21:03,120 some value in this. 563 00:21:04,220 --> 00:21:05,659 But it took a very long time. 564 00:21:05,660 --> 00:21:07,059 Yes. 565 00:21:07,060 --> 00:21:08,060 Thank you. 566 00:21:08,480 --> 00:21:10,039 Microphone three, please. 567 00:21:11,810 --> 00:21:13,730 Do you have any tips for 568 00:21:14,960 --> 00:21:17,239 organizations that are refreshing their 569 00:21:17,240 --> 00:21:19,519 security curriculum to take 570 00:21:19,520 --> 00:21:20,520 this into account? 571 00:21:21,800 --> 00:21:24,229 I would — 572 00:21:24,230 --> 00:21:25,729 if you're refreshing your curriculum 573 00:21:26,900 --> 00:21:29,299 and you're doing projects, then I 574 00:21:29,300 --> 00:21:31,399 would very much recommend taking 575 00:21:31,400 --> 00:21:32,809 this procedure into account: 576 00:21:35,700 --> 00:21:37,289 that you start the students 577 00:21:38,460 --> 00:21:40,529 with ethics in their project 578 00:21:40,530 --> 00:21:41,530 immediately. 
579 00:21:43,230 --> 00:21:46,289 There is a question from the Internet. 580 00:21:46,290 --> 00:21:48,449 Yeah, I actually have 581 00:21:48,450 --> 00:21:50,909 two, sort of catching up. 582 00:21:50,910 --> 00:21:51,819 Both? Yeah. 583 00:21:51,820 --> 00:21:53,939 OK, so the 584 00:21:53,940 --> 00:21:55,769 first question is about ethics. 585 00:21:55,770 --> 00:21:58,019 To what extent does the ethics committee 586 00:21:58,020 --> 00:22:00,359 come into conflict with the norms or values 587 00:22:00,360 --> 00:22:01,889 of the students? 588 00:22:01,890 --> 00:22:04,169 And the second question is, 589 00:22:04,170 --> 00:22:06,689 to what extent can students 590 00:22:06,690 --> 00:22:08,819 act even if their norms and values 591 00:22:08,820 --> 00:22:10,919 are not in line with the committee? 592 00:22:13,050 --> 00:22:14,400 OK, um. 593 00:22:16,400 --> 00:22:18,529 Um, we try not to impose. 594 00:22:23,000 --> 00:22:25,159 Sorry. We try not to impose 595 00:22:25,160 --> 00:22:27,349 too much of our values onto the students, 596 00:22:30,050 --> 00:22:32,129 but, I mean, the 597 00:22:32,130 --> 00:22:34,520 university is a very public organization 598 00:22:35,990 --> 00:22:38,269 and has its reputation to think about, 599 00:22:38,270 --> 00:22:39,270 too. 600 00:22:40,160 --> 00:22:42,439 Um, so we have to be careful 601 00:22:42,440 --> 00:22:44,659 with that. But we do 602 00:22:44,660 --> 00:22:46,939 have 603 00:22:46,940 --> 00:22:49,399 a very open debate about 604 00:22:49,400 --> 00:22:51,109 the norms 605 00:22:52,820 --> 00:22:55,219 and the evaluations 606 00:22:55,220 --> 00:22:56,149 that the students do. 607 00:22:56,150 --> 00:22:57,470 And this happens in a discussion, 608 00:22:58,880 --> 00:23:01,429 um, so we try not to impose this. 609 00:23:01,430 --> 00:23:03,589 And the second question was? 
610 00:23:09,260 --> 00:23:11,379 Um, uh, 611 00:23:11,380 --> 00:23:13,869 to what extent can students act, even if 612 00:23:13,870 --> 00:23:15,879 their norms and values are not in line with 613 00:23:15,880 --> 00:23:16,880 the committee? 614 00:23:18,160 --> 00:23:19,160 Um, 615 00:23:22,210 --> 00:23:24,519 in the end, the ethics committee 616 00:23:25,570 --> 00:23:27,999 decides, and then the teacher 617 00:23:28,000 --> 00:23:30,069 decides what we can do. 618 00:23:31,750 --> 00:23:34,179 And if that doesn't align and 619 00:23:34,180 --> 00:23:36,279 we don't agree that this is an 620 00:23:36,280 --> 00:23:38,409 important subject to do, 621 00:23:38,410 --> 00:23:39,939 then the student is not allowed to do 622 00:23:39,940 --> 00:23:41,739 it during the curriculum. He is free to do 623 00:23:41,740 --> 00:23:43,389 it in his own time, 624 00:23:43,390 --> 00:23:45,309 but it won't be, uh, graded work. 625 00:23:48,100 --> 00:23:51,069 At microphone two, please. 626 00:23:51,070 --> 00:23:53,259 Um, are you considering 627 00:23:53,260 --> 00:23:55,769 an ethical code of conduct, 628 00:23:55,770 --> 00:23:57,849 an oath a student could, uh, take 629 00:23:57,850 --> 00:24:00,819 when they graduate from the university? 630 00:24:00,820 --> 00:24:02,619 A similar thing is happening at the 631 00:24:02,620 --> 00:24:04,689 University of Groningen, in the physics 632 00:24:04,690 --> 00:24:06,819 department: a student can take an 633 00:24:06,820 --> 00:24:08,649 oath that they won't work on the arms race, 634 00:24:08,650 --> 00:24:10,039 uh, and so on. 635 00:24:10,040 --> 00:24:12,529 Uh, well, you know, 636 00:24:12,530 --> 00:24:14,439 um, something similar. 637 00:24:14,440 --> 00:24:17,139 I think there is some code of ethics. 638 00:24:17,140 --> 00:24:19,269 Yeah. And, 639 00:24:19,270 --> 00:24:21,459 um, I 640 00:24:21,460 --> 00:24:23,589 forgot the name.
There's a code of 641 00:24:23,590 --> 00:24:25,239 ethics for system administrators. 642 00:24:26,830 --> 00:24:29,139 Um, and we try to, 643 00:24:29,140 --> 00:24:31,569 uh, we educate them about that one. 644 00:24:31,570 --> 00:24:33,939 Um, but we don't 645 00:24:33,940 --> 00:24:35,799 enforce this onto the 646 00:24:35,800 --> 00:24:37,179 students after the curriculum. 647 00:24:37,180 --> 00:24:39,669 No, I'm not saying you should enforce it, no. 648 00:24:39,670 --> 00:24:40,959 In Groningen this is just an 649 00:24:40,960 --> 00:24:42,969 optional thing you can do when you take 650 00:24:42,970 --> 00:24:43,859 your oath. 651 00:24:43,860 --> 00:24:44,529 OK, OK. 652 00:24:44,530 --> 00:24:47,169 No, 653 00:24:47,170 --> 00:24:48,559 we haven't really thought about it. 654 00:24:48,560 --> 00:24:50,829 It's something that we present, and that 655 00:24:50,830 --> 00:24:53,229 we present as a good thing, 656 00:24:53,230 --> 00:24:55,359 because it also helps you a little 657 00:24:55,360 --> 00:24:57,609 bit: if 658 00:24:57,610 --> 00:24:59,079 you were forced to make hard choices, 659 00:24:59,080 --> 00:25:00,789 then you can point to this public 660 00:25:00,790 --> 00:25:01,790 document and say, 661 00:25:03,280 --> 00:25:05,589 this is an oath that I abide by. 662 00:25:05,590 --> 00:25:07,869 Um, and 663 00:25:07,870 --> 00:25:09,219 this is not something that I thought of, 664 00:25:09,220 --> 00:25:11,589 but something that exists 665 00:25:11,590 --> 00:25:13,299 and that other people have thought about. 666 00:25:13,300 --> 00:25:15,639 Yeah. Um, 667 00:25:15,640 --> 00:25:16,599 so that's what we do. 668 00:25:16,600 --> 00:25:17,409 Yeah. 669 00:25:17,410 --> 00:25:19,599 Thank you. 670 00:25:19,600 --> 00:25:21,729 Microphone number 671 00:25:21,730 --> 00:25:22,959 three. 
672 00:25:22,960 --> 00:25:25,059 First, let me say, uh, it's really great 673 00:25:25,060 --> 00:25:26,319 that you're doing this. 674 00:25:26,320 --> 00:25:28,659 I was at the University of Amsterdam 675 00:25:28,660 --> 00:25:30,879 also, ages ago, and we had some 676 00:25:30,880 --> 00:25:33,039 extracurricular classes on ethics at the 677 00:25:33,040 --> 00:25:35,409 faculty of computer science. 678 00:25:35,410 --> 00:25:37,869 Um, but what I'm wondering 679 00:25:37,870 --> 00:25:39,999 is, why don't you, 680 00:25:40,000 --> 00:25:41,889 uh, expand the whole ethics 681 00:25:41,890 --> 00:25:44,469 question also into the design phase? 682 00:25:44,470 --> 00:25:46,899 This is just security research, 683 00:25:46,900 --> 00:25:48,639 but there are also ethics questions in 684 00:25:48,640 --> 00:25:51,219 picking libraries or making 685 00:25:51,220 --> 00:25:53,469 design decisions on networking equipment 686 00:25:53,470 --> 00:25:54,789 or software design. 687 00:25:54,790 --> 00:25:56,889 Um, I think it would be great if it 688 00:25:56,890 --> 00:25:58,149 could be expanded. 689 00:25:58,150 --> 00:26:00,309 A lot of flaws that we've seen presented 690 00:26:00,310 --> 00:26:02,709 here at the CCC come from poor design 691 00:26:02,710 --> 00:26:04,719 decisions, or because people think, well, 692 00:26:04,720 --> 00:26:05,720 I can get away with it. 693 00:26:07,420 --> 00:26:09,609 And that's, um, 694 00:26:09,610 --> 00:26:10,689 something that we've been 695 00:26:11,980 --> 00:26:14,589 subconsciously working with already. 696 00:26:14,590 --> 00:26:15,590 Um, 697 00:26:16,840 --> 00:26:18,279 uh, but it's not something that we 698 00:26:18,280 --> 00:26:20,379 have been doing explicitly, but 699 00:26:20,380 --> 00:26:21,700 maybe something for the future. 700 00:26:24,700 --> 00:26:25,700 Then, 701 00:26:26,860 --> 00:26:28,779 uh, it's working again. 
702 00:26:28,780 --> 00:26:30,759 So there are some more questions from the 703 00:26:30,760 --> 00:26:32,199 Internet. 704 00:26:32,200 --> 00:26:34,299 The first one, um: did 705 00:26:34,300 --> 00:26:36,399 the lawyer ever intervene or do 706 00:26:36,400 --> 00:26:37,599 anything like that? 707 00:26:37,600 --> 00:26:39,099 Did she understand the problems 708 00:26:39,100 --> 00:26:40,450 she was presented with? 709 00:26:41,470 --> 00:26:44,229 Um, the lawyer on the 710 00:26:44,230 --> 00:26:46,449 ethics committee for the faculty 711 00:26:46,450 --> 00:26:48,039 is actually very constructive. 712 00:26:48,040 --> 00:26:50,439 And, uh, 713 00:26:50,440 --> 00:26:52,729 she helps with, um, 714 00:26:53,740 --> 00:26:55,209 discussing the law and what the 715 00:26:55,210 --> 00:26:56,559 boundaries of the law are. 716 00:26:57,850 --> 00:26:58,850 Um, 717 00:27:00,340 --> 00:27:02,229 I'm not completely sure whether she 718 00:27:02,230 --> 00:27:03,549 has actually stopped anything 719 00:27:04,600 --> 00:27:06,879 yet, but it is sometimes 720 00:27:06,880 --> 00:27:08,859 helpful to understand what the 721 00:27:08,860 --> 00:27:09,880 actual boundaries are 722 00:27:11,020 --> 00:27:12,309 and what you have to think about. 723 00:27:13,600 --> 00:27:15,219 At microphone two, please. 724 00:27:16,600 --> 00:27:18,999 You talked about the traffic light: 725 00:27:19,000 --> 00:27:21,939 red, green, yellow. 726 00:27:21,940 --> 00:27:24,309 Um, the parameters 727 00:27:24,310 --> 00:27:26,439 you use to classify which one 728 00:27:26,440 --> 00:27:28,179 is red and yellow and green, 729 00:27:28,180 --> 00:27:29,829 do you have that public somewhere? 730 00:27:29,830 --> 00:27:31,719 Can I have a look at that? Because, 731 00:27:31,720 --> 00:27:34,059 yeah, it sounds interesting. 
732 00:27:34,060 --> 00:27:35,230 So this is still 733 00:27:36,700 --> 00:27:38,619 very much a work in progress, 734 00:27:39,820 --> 00:27:41,889 but the whole procedure is online at, uh, 735 00:27:41,890 --> 00:27:43,419 the website. 736 00:27:43,420 --> 00:27:45,549 If you click through there to the info and 737 00:27:45,550 --> 00:27:48,939 then ethics, you see the, uh, 738 00:27:48,940 --> 00:27:51,039 um, the evaluation procedure 739 00:27:51,040 --> 00:27:53,079 and some of the examples that we have 740 00:27:54,160 --> 00:27:55,869 for each of the different 741 00:27:57,040 --> 00:27:58,040 classes. Yeah. 742 00:27:59,980 --> 00:28:03,309 So, as I can see no more questions, 743 00:28:03,310 --> 00:28:05,549 thanks a lot again. Give 744 00:28:05,550 --> 00:28:06,750 them the applause.