All right, now let's listen to what Adam Harvey has to say. He's an artist living in Berlin, and he studied in New York City. Actually, hands up in the audience: who of you backed his Kickstarter campaign for RF-signal-blocking phone cases?

OK. So what we're going to talk about is retail surveillance and retail countersurveillance: the 50 most unwanted and most wanted things in surveillance and countersurveillance. All right.

Thank you, everyone, for coming here. This is my first CCC. As the title of this presentation says, what I wanted to talk about originally was 50 companies doing retail surveillance, and to come up with 50 ways of countering it: a retail model for countersurveillance. But what I found as I began working on that presentation is that it's relentless. There are literally thousands of companies, and it's a nonstop battle to keep up with all the different retail surveillance tactics and modify your life to adjust your privacy settings accordingly. So I've changed the presentation a little bit to focus on one aspect of that: I've narrowed it down to photography, or computer vision, and even within computer vision, narrowed it down to facial analysis.

For me, this started when I used to work as a photographer. When I moved to New York City about 12 years ago, I worked as a photographer, and I came across some quotes from Susan Sontag.
As I read her book On Photography, it really influenced my perspective on the power of the camera: its ability to capture, to possess, to turn people into objects that you can possess and control the narrative for. I think that's very clear when you look at aggressive paparazzi behavior, such as these photographers practically attacking Britney Spears. But I think it's a little bit less clear how that narrative unfolds over time online, for photos that are posted to the Internet, to social media and so on.

To talk about computer vision, I want to introduce a little bit of its history, because it's not entirely new. 1963 is the first recorded instance that I've come across, from a recently declassified CIA memo that was a proposal for a "simplified face recognition machine." Ultimately, in 1963, the technology was not quite where it needed to be to perform as a robust, accurate facial recognition system. But progress continued, and in 1969 three Japanese researchers were able to detect the first human face with computer vision. I like to think of these first human face detections, which look a little bit like a head of broccoli or a light bulb, as the first cave paintings: the first time that a computer was able to understand, in a very primitive way, what it is to appear as a human. Throughout the 70s and 80s, computer vision made only moderate gains.
Then, in the 1990s, came a program called FERET, funded, of course, by the military, the Department of Defense. It was a feasibility study to determine whether facial recognition could play a significant role in law enforcement, and in identifying enemy combatants at a distance. This set into motion what Paul Virilio calls the logistics of military perception.

In 2001, a breakthrough algorithm came out: the Viola-Jones algorithm. Viola-Jones was unique because it was very efficient and offered enough accuracy that the cost-benefit analysis worked out: a very lightweight computer vision system that could be put onto embedded hardware at very low cost, with a decent frame rate. What that set into motion is the ubiquity of computer vision, with face detection appearing on all sorts of different devices. That really changed the model for where you could put computer vision: you didn't need a giant computer to do it, you needed a very lightweight, small embedded system. And of course, that brought a lot of problems for privacy. Now you have computers that can recognize faces; you can extract the faces, you can begin to do facial analysis, and it doesn't cost you a lot of money. You can do it on very cheap, low-cost hardware. So OpenCV and this face detection algorithm began to propagate throughout culture.
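To make that concrete: OpenCV ships pretrained Viola-Jones detectors as Haar cascade files, and running one takes only a few lines. A minimal sketch (Python; the image path is a placeholder):

```python
import cv2

# Load one of OpenCV's pretrained Viola-Jones face profiles;
# the XML files ship with the opencv-python package.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("photo.jpg")                      # placeholder input
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# The detector slides a window over the image at multiple scales;
# minNeighbors acts roughly as a confidence threshold.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected.jpg", img)
```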
What was appearing around that time, around 2008? There was a real push towards computational photography: using cameras, almost without a human in the loop, to recognize people and extract knowledge about them, the same way that Susan Sontag talked about people using a camera to extract knowledge and possess people as objects. By around 2005 to 2010, it was very clear that this was going to be the future narrative: that computers would be extracting the narratives, labeling us and tagging us.

So in 2010, I worked on a project that's all about modulating your appearance to reduce your confidence score in a computer vision algorithm. The project is called CV Dazzle, and it takes the vulnerabilities in the face detection profiles and exploits them with hair and makeup. By doing hair and makeup in a certain location, you decrease the confidence score that a face will appear there. This is what it looks like when you run a test; I'll speed this up if I can. You can watch, in a very slowed-down version, how the detection works: it's really just reading the image from left to right, like a book. You can see the results of the algorithm on the left: a very high confidence score for the face. Now, if we fast-forward to the end, what we see is a zero confidence score. Well, there's one misplaced rectangle, compared to a very high confidence score on the left.

Another way to look at that is to use what's called a saliency map. It's kind of like a heat map for where a computer vision algorithm is looking: what it found interesting and salient in that image.
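For the curious, OpenCV's contrib build exposes a simple static saliency estimator; a rough sketch of producing such a heat map (assuming opencv-contrib-python is installed; this spectral-residual method is one of several, not necessarily the one shown on the slide):

```python
import cv2

img = cv2.imread("photo.jpg")                      # placeholder input
saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, sal_map = saliency.computeSaliency(img)        # float32 map in [0, 1]

# Scale to 8-bit and colorize so it reads like a heat map.
heat = cv2.applyColorMap((sal_map * 255).astype("uint8"), cv2.COLORMAP_JET)
cv2.imwrite("saliency.jpg", heat)
```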
So you can go back and see where the computer vision eyes have been looking in an image.

Recently, the researcher Vojtěch Frige at the University of West Bohemia ran a study on CV Dazzle to determine how effective it was, or whether it was effective at all. What he found is that it's not effective one hundred percent of the time, which I don't think should be a requirement for camouflage. Camouflage is often misunderstood as a Harry Potter invisibility cloak, when camouflage is actually about optimizing the way that you appear, reducing visibility, moving matter between different parts of the electromagnetic spectrum, possibly just for a brief moment, to evade observation. Achieving 100 percent would of course be great, but I don't think that should be a requirement for the way we think about camouflage. The results of his analysis show that the most effective pattern was when you cover the nose-bridge area (that's one of the biggest vulnerabilities of OpenCV's face detectors), and the result was about a 69 percent reduction in detection.

Now, compare that to World War One dazzle camouflage, the original dazzle. It has long been debated whether it was effective at all. But Roy Behrens, a camouflage historian, has said that dazzle was in fact evaluated, and it was about 50 percent effective. You could say 50 percent is not great for camouflage. But if you avoid one out of every two torpedoes that would explode your ship, that's pretty great.
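In the spirit of that study, here is a toy occlusion test you can run yourself: detect, paint a band over the nose-bridge region, and detect again. The patch coordinates are rough guesses, not the study's protocol:

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
gray = cv2.cvtColor(cv2.imread("portrait.jpg"),    # placeholder input
                    cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, 1.1, 5)
print("before occlusion:", len(faces))

# Black out a horizontal band across the nose bridge of each face.
for (x, y, w, h) in faces:
    cv2.rectangle(gray, (x + w // 4, y + int(0.35 * h)),
                  (x + 3 * w // 4, y + int(0.55 * h)), 0, -1)

print("after occlusion:", len(cascade.detectMultiScale(gray, 1.1, 5)))
```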
Since I worked on the project, it's kind of taken on a life of its own, appearing on the TV show Elementary. People have taken the hints from what I posted online and reinterpreted them in their own way, which sometimes turns out great, and sometimes turns out... very interesting. But overall I'm happy to see that people are experimenting with this idea of simply appearing in a new way, and I think you can also be very playful with it.

After that project, which was in 2010, I became very aware in 2013 of, and concerned about, a different type of imaging: thermal surveillance. It doesn't relate as much to retail yet, but thermal is becoming very cheap. Ten years ago, a 640-by-480 thermal sensor would cost you twenty thousand dollars; today, 320 by 240 costs two hundred dollars. That price drop has changed the way we use this technology, and thermal is becoming more and more of a consumer-level technology. What you see here is a way to block it; this is more of the Harry Potter invisibility cloak kind of technology. It's a silver-plated metal fabric, fashioned into Islamic dress as an anti-drone hijab and an anti-drone burqa. The idea of the burqa is that it reinterprets religious dress in an era of mass surveillance: instead of creating a separation between man and God, it creates a separation between man and drone. You can see that here.

Now, this is a test for you, like one of those hidden-figure games. There are four people, and you can see them very clearly by their heat signatures. But there's a fifth person wearing the anti-drone burqa.
It may be hard to see because of the projection, so I'll give you the benefit of the doubt here. I'm going to play the animation, and it will become very clear when there's motion in it. But even without motion, you can see that the visibility is near zero for the person wearing the anti-drone burqa. There, it's very clear now. You can see the legs, and that's intentional. Now what you'll see is somebody walk out of a store. This is in the winter, so there's actually quite a high temperature differential, and this person is just glowing.

I approach these projects in a playful way, but they also touch on some very serious issues of national security and surveillance. And what I can't predict is who will find them threatening, or interesting. After releasing these earlier projects, one of the people who found them interesting was the Air Force general counsel at the Pentagon, who tweeted the project. The other one, and it's kind of funny, was a request from a three-letter agency asking for permission to use it in an internal-use-only publication, which I'm never going to see, I guess.

So, working in this area, there's a lot of uncertainty about the way people will perceive these projects. Asked for a comment, the NSA declined, obviously, to say anything about it. But it always makes me wonder where the line is in doing this kind of artwork. How far can you really go before you've gone too far? And I think you should just go further.
So I've taken the idea of these garments and created a kind of shop called the Privacy Gift Shop, where I try to further these ideas, using the store to normalize them through commerce. I think Holly Herndon said it best in a talk she gave: using pop culture as a carrier signal to transmit these ideas. And I think commerce and the gift shop can have a friendly, normalizing effect on an otherwise kind of terrifying discourse around national security.

Originally, I did want to talk about all these topics, but I've noticed some great talks here that I can point you to instead, to expand beyond the facial recognition and computer vision part: Wolfie Christl's talk about corporate surveillance, and another interesting talk about the ultrasound tracking ecosystem. These are all part of the corporate retail surveillance infrastructure. I'll just add two things to that. With Wi-Fi, you can now detect motion with wireless signals. I don't think this is worth spending too much time on, because it's very easy to block: with a ten-dollar ESP8266 Wi-Fi module, you can create a lot of noise in the Wi-Fi spectrum and disrupt the motion detection signals. Another company to highlight is IndoorAtlas, which uses the geomagnetic signature, measuring the gauss on your phone's digital magnetometer, to get a position within an indoor retail environment at roughly two-meter resolution. But that's also very easy to block with a sheet magnet: put a sheet magnet or magnetic shielding material on your phone case, and even a small piece of metal will change that gauss signature enough to throw it off.
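As a toy numeric illustration of that last point (made-up numbers, not IndoorAtlas's actual matching algorithm): geomagnetic positioning compares the measured field vector against a stored fingerprint map, so a constant offset from a magnet pushes every measurement toward the wrong fingerprint.

```python
import numpy as np

# Hypothetical fingerprint map: field vectors (in microtesla) at known spots.
fingerprints = {
    "aisle 1": np.array([22.0, 5.0, -43.0]),
    "aisle 2": np.array([19.5, 7.5, -41.0]),
    "doorway": np.array([25.0, 3.0, -46.0]),
}

def locate(measured):
    # Nearest-neighbor match against the fingerprint map.
    return min(fingerprints,
               key=lambda k: np.linalg.norm(fingerprints[k] - measured))

true_field = np.array([19.4, 7.6, -41.2])        # phone is really in aisle 2
magnet_offset = np.array([30.0, -12.0, 8.0])     # sheet magnet on the case

print(locate(true_field))                  # -> aisle 2
print(locate(true_field + magnet_offset))  # -> doorway: fingerprint defeated
```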
So these are both very easy technologies to circumvent. What I want to talk about, then, is computer vision, and modulating your appearance to minimize the damage to privacy.

As I was preparing this, I came across this ad: as designers, we're influenced by everything, but what better element to pull from than something that's evolved over time? Birds have shorter wingspans today because they need to be more aerodynamic to avoid being hit by cars; that's something that's happened in the last hundred years. And I thought, this is a great metaphor for thinking through technology: we also need to evolve, like the birds. The problem is, that's not really true. Lockheed Martin based that on a 2013 study, about cliff swallows I think, and it's a bit misleading, to put it nicely, because the study is very short-sighted: it looked only at Nebraska, and at birds that lived near traffic in a bridge structure. To make that extrapolation from the data... I think that's actually the real metaphor: we're over-interpreting statistics to create a lot of hype and oversell technologies, in ways that mislead us about their true cost to society. So that's the metaphor.

And now we finally get to the more computer vision part. To put into scale how much data can be gleaned from a very small amount of visual information, we start at the scale of one pixel: a one-by-one transparent pixel. This is the most popular image in the world, in terms of the number of times it's been downloaded and displayed. Of course, you can't see it, because it's transparent.
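That tracking pixel fits in 43 bytes. A sketch reconstructing what is commonly cited as the minimal 1x1 transparent GIF, byte by byte:

```python
# A 1x1 transparent tracking pixel as a GIF89a (the widely cited 43-byte version).
pixel = (
    b"GIF89a"                                     # header
    b"\x01\x00\x01\x00\x80\x00\x00"               # 1x1 screen, 2-color palette
    b"\x00\x00\x00\xff\xff\xff"                   # palette: black, white
    b"\x21\xf9\x04\x01\x00\x00\x00\x00"           # graphic control: color 0 transparent
    b"\x2c\x00\x00\x00\x00\x01\x00\x01\x00\x00"   # image descriptor
    b"\x02\x02\x44\x01\x00"                       # minimal LZW-compressed pixel data
    b"\x3b"                                       # trailer
)
assert len(pixel) == 43
open("pixel.gif", "wb").write(pixel)
```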
The only purpose of this image is to collect information about you. The pixel lives on google.com; it lives all over the ad ecosphere. It can be represented in 43 bytes, so it's the most lightweight image. I think this might be a better metaphor for what an image is today: in some ways, the image has become a shell for collecting data, for surveillance.

When we move up and fill in that square, we have 256 different grayscale values. That number increases very quickly as we increase the size of the image: four billion possible combinations in a two-by-two grayscale image, and by four by four we're at 3.4 times 10 to the 38. At seven by six, we already have enough information to do facial recognition: seven by six, at 256 gray values, is enough to do 95-percent-accurate facial recognition on the AT&T faces dataset. Granted, the AT&T faces dataset is all white guys, and it's not a very large dataset. But even with a larger dataset, called FaceScrub, you only lose about 18 percent of performance, compared to the original faces, when you pixelate down to 14 by 14 pixels. And at only 12 by 16 pixels, you can build an encoder to do scene recognition and activity recognition. What you do in this case is train your neural network on this low-resolution data; then, instead of interpreting the 640-by-480 image directly, you scale it down and use the knowledge at 12 by 16 to interpret it.
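A sketch of that downscaling step (the trained model itself is out of scope here; classify_lowres is a dummy stand-in for a network trained on 12-by-16 grayscale frames):

```python
import cv2

def classify_lowres(thumb):
    # Dummy stand-in: a real network trained on 12x16 frames would go here.
    return "bright scene" if thumb.mean() > 0.5 else "dark scene"

frame = cv2.imread("frame_640x480.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder
# Average-pool the full frame down to the 12x16 training resolution.
thumb = cv2.resize(frame, (16, 12), interpolation=cv2.INTER_AREA) / 255.0
print(classify_lowres(thumb))
```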
So we have a very large amount of information in a very small image space. When we go out to 20 by 20, we have 256 to the 400th: a very large dimensional space. These next four images are the optimal activations of OpenCV's Haar cascades. If you were to ask the algorithm to describe the perfect face, these would be the most perfect faces the algorithm would want to see: they activate it maximally, with a very high confidence score. The different profiles are called alt, alt2, alt_tree, and the default frontal face.

Now we go up to one hundred by one hundred. What can we do at one hundred by one hundred? I have a feeling I'd run out of time talking about everything we can do. One hundred by one hundred is two point five percent of one Instagram photo. And as we've seen going up from one pixel, even at one pixel there's a lot of information: you have one hundred percent unique separability for the digits zero through nine in Times New Roman, and ninety-seven percent unique separability when you reduce each character to one pixel. What that means is that you can take redacted text, run it through a genetic algorithm or a Markov chain, and ascertain what that text was from a merely two-by-six pixelated image. Pixelation is not really redaction; it's simply a reduction, and sometimes that's enough information to tell you what you want to know.
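A quick sketch of the one-pixel-per-character idea: render each digit, shrink it to a single pixel, and check for collisions. This uses Pillow's built-in bitmap font rather than Times New Roman, so the exact separability will differ from the figures above:

```python
from PIL import Image, ImageDraw, ImageFont

font = ImageFont.load_default()
buckets = {}
for ch in "0123456789":
    img = Image.new("L", (12, 16), 255)            # small white canvas
    ImageDraw.Draw(img).text((2, 2), ch, fill=0, font=font)
    px = img.resize((1, 1), Image.LANCZOS).getpixel((0, 0))
    buckets.setdefault(px, []).append(ch)          # digits sharing a value collide

for px, chars in sorted(buckets.items()):
    print(px, chars)                               # unique value -> recoverable digit
```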
Every day, there are about three hundred and seventy million photos uploaded to the Facebook family, Instagram and Facebook. And what happens to all those photos, which contain a lot of faces, a lot of them larger than one hundred pixels? Every face that's uploaded to Instagram or Facebook is analyzed and possessed, and the knowledge is extracted not only from the face but also from the metadata around that region. Now I'd like to do a very quick survey of some of the companies and what they're doing with that data.

I'll focus on the most pernicious ones. Faception is a company out of Israel that's ranking your image: whether you look like a terrorist; whether you look like a poker player, or a bingo player, which is one of them; whether you could be an academic researcher; whether you have a high IQ. And they know all of this by looking at your face. The idea is that your face is somehow linked to your DNA, and that your physical traits can describe who you are, what you know, and your performance as a human. That sounds kind of crazy, like Francis Galton's eugenics: inferring someone's capabilities purely based on physical traits. But they're not alone. Some of these scores I mentioned: high IQ, extrovert. Another thing they can tell, purely by looking at your face, is whether you would be a good brand promoter.

Another research group is looking at predicting criminality using lip curvature, eye inner-corner distance and nose-mouth angle. What they find is that criminal and non-criminal face images "populate two quite distinctive manifolds."
Again, they're not alone: a lot of other researchers are looking at how to take that very small, one-hundred-by-one-hundred-pixel amount of data and turn it into insights which could be used for marketing. Here we have a long list, including how trustworthy you are, sociable, typical or weird. What all this reminds me of is Francis Galton and eugenics. And the real criminals in these cases would be the people perpetrating this idea, not the people being looked at.

There are some interesting things to learn from this, though, because by learning the ways that you're being looked at, you can begin to game the system. One of them: if you have a wider mouth, you're more likely to be chosen as leader or CEO of a company. Conversely, if you have a narrow mouth, you're going to do better at an NGO, says this paper. Just by looking at the spatial relationships between these 100-by-100-pixel face regions, you can determine who's the most important person in an image. And you can ascertain somebody's pulse, if you have video, by amplifying the green channel; again, still within one hundred by one hundred pixels (a sketch of that trick follows below).

I don't think I have time to tell the great story of Jetpac, the company that analyzed every public pixel on Instagram and then sold that technology to Google. Instagram is, of course, Facebook. Brilliant, if you can sell Facebook to Google. What they did is, again, take information from the facial region, and they built a guidebook from it. What's a cool place to go? Oh, the place with a lot of hipster mustaches. Where's a place to pick up girls? Wherever photos of people with lipstick are geotagged. That was the idea for their product, and that's what happens to photos that are uploaded to Instagram.
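The pulse trick mentioned above, as a bare-bones sketch: average the green channel over a fixed (here assumed) face region in each frame, then pick the dominant frequency in the plausible heart-rate band. Real systems track the face and amplify the signal, as in Eulerian video magnification; this shows only the core idea:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("face_clip.mp4")    # placeholder input video
x, y, w, h = 100, 100, 100, 100            # assumed fixed face region
signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    signal.append(frame[y:y + h, x:x + w, 1].mean())  # mean green value
cap.release()

fps = 30.0                                 # assumed frame rate
sig = np.asarray(signal) - np.mean(signal)
freqs = np.fft.rfftfreq(len(sig), d=1 / fps)
spectrum = np.abs(np.fft.rfft(sig))
band = (freqs > 0.7) & (freqs < 4.0)       # 42-240 bpm
print("estimated pulse: %.0f bpm" % (60 * freqs[band][spectrum[band].argmax()]))
```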
But beyond that, you can also begin to predict economic behavior purely from one photo. You can predict decision-making capability about 20 percent better than a human can, with computer vision algorithms.

I have to move on briefly, but here's an example of what it would look like to impose a lot of algorithms on top of one image. A selfie, I like to say, contains more information than a mug shot, because you have not only the face but all of the metadata and relational information around it. A few of the companies operating in this space (Kairos, Emotient, Clarifai, Affectiva) are using some of the attributes I mentioned in those earlier papers, as well as a list of about seventy-eight attributes that you can extract from a face, and 47 knowledge points that you can then infer based on those attributes.

So, what to do with all this information contained in a very small two point five percent of one Instagram photo? Well, as I looked at in an earlier project, you can change the way that you appear. But in camouflage, you can think about the figure-and-ground relationship, and I think there's also an opportunity to begin to modify the ground: the things that appear behind you, next to you. That can also interfere with a computer vision confidence score. This new project is called HyperFace, and what it's doing is taking those maximal activations,
either from a more traditional classifier, like the OpenCV Viola-Jones Haar cascades (here is a heat map of the most important areas of the face for two of the profiles), taking that information, and then giving the algorithm what it wants, overloading it: oversaturating an area with faces to divert the gaze of a computer vision algorithm. An early prototype for this looked something like this; it's a little bit spooky, maybe. What you see here is all the maximal activations overlaid, and when you put this through a computer vision face detector, you get about twelve hundred possible face detections from it. Not faces, but confidence mappings. You can refine that a little bit to create more of a CADPAT, pixelated-camouflage look. Then you can do something similar for a neural network, to activate the face neuron of a neural network. And then you can combine these to create, for this project, textile patterns that could be used, hopefully, to modify the environment around you, whether somebody next to you is wearing it, or you're wearing it, maybe around your head, in a new way. That project is a collaboration with Hyphen-Labs in New York City, for their new work, NeuroSpeculative AfroFeminism, or NSAF. That project will come out in January. So, I'm probably done.

But I like to end on this slide, which was introduced to me by my friend Richard Reeves, and which shows a scene from one hundred years ago in New York. Now, if you look at this photo, everybody's wearing a hat. If you look around this room, nobody's wearing a hat.
Right? So one hundred years from now, we're going to have a similar transformation of fashion and the way that we appear. And what will that look like? Hopefully it'll look like something where we appear in a way that optimizes our personal privacy, optimizes, you know, according to the settings of mass surveillance. And I'll end on that. Thank you.