*35C3 Intro music*

Herald Angel: We at the Congress, we not only talk about technology, we also talk about social and ethical responsibility. About how we can change the world for good. The Good Technology Collective supports the development guidelines... sorry, it supports the development process of new technology with ethical engineering guidelines that offer a practical way to take ethics and social impact into account. Yannick Leretaille - and I hope this was okay - will tell you more about it. Please welcome on stage with a very warm applause Yann Leretaille.

*applause*

Yannick Leretaille: Hi, thanks for the introduction. So before we start, can you kind of show me your hand if you, like, work in tech building products as designers, engineers, coders, product management? OK, so it's like 95 percent, 90 percent. Great. Yeah. So, today we kind of try to answer the question: What is good technology, and how can we build better technology? Before that, shortly something about me. So I am Yann. I'm French-German. Kind of a hacker, been around the CCC for a long time, an entrepreneur, like, co-founder of a startup in Berlin. And I'm also a founding member of the Good Technology Collective. The Good Technology Collective was founded about a year ago - or actually a bit over a year now - by a very diverse expert council, and we kind of have, like, three areas of work. The first one is trying to educate the public about current issues with technology, then to educate engineers to build better technology, and then, long-term, hopefully one day we'll be able to work in legislation as well.

Here is a bit of what we achieved so far. We have, like, 27 council members now. We have several media partnerships and published around 20 articles - that's kind of the public education part. Then we organized or participated in roughly 15 events already. And we are now publishing one standard, well, kind of today actually, and

*applause*

and if you're interested in what we do, then, yeah, sign up for the newsletter and we'll keep you up to date and you can join events. So as I said, the expert council is really, really diverse. We have everything from people in academia, to people in government, to technology makers, to philosophers, all sorts, journalists.
And the reason that is the case is that a year ago we kind of noticed that in our own circles, like, as technology makers or academics, we were all talking a lot about, kind of, worrisome developments in technology, but no one was really getting together and looking at it from all angles. And there have been a lot of very weird and troublesome developments in the last two years. I think we really finally feel, you know, the impact of filter bubbles. Something we have talked about for like five years, but now it's, like, really, you know, deciding over elections, and people become politically radicalized, and society is polarized more because they only see a certain opinion. And we have situations that we only knew from science fiction, you know, pre-crime: governments, kind of, overreaching and trying to use machine learning to make decisions on whether or not you should go to jail. We have more and more machine learning and big data and automation going into basically every single aspect of our lives, and not all of it has been positive. You know, like, literally everything from e-commerce to banking to navigating to moving to the vault now goes through these interfaces, which present us the data and a slice of the world at a time.

And then at the same time we have really positive developments, right? We have things like this, you know, like space travel, finally something's happening. And we have huge advances in medicine. Maybe soon we'll have, like, self-driving cars and great renewable technology. And it kind of begs the question: How can it be that good and bad uses of technology are showing up at such an increasing rate, at such extremes, right? And maybe the reason is just that everything got so complicated, right? Data is basically doubling every couple of years, so no human can possibly process it anymore. So we had to build more and more complex algorithms to process it, connecting more and more parts together. And no one really seems to understand it anymore, it seems. And that leads to unintended consequences. I have an example here: So, Google Photos - this is actually only two years ago - launched a classifier to automatically go through all of your pictures and tell you what is in them. You could say "Show me the picture of the bird in summer at this location" and it would find it for you.
Kind of really cool technology, and they released it to, like, a planetary user base, until someone figured out that people of color were always marked as gorillas. Of course it was a huge PR disaster, and somehow no one found out about this before it came out... But now the interesting thing is: in two years they didn't even manage to fix it! Their solution was to just block all kinds of apes, so they're just not found anymore. And that's how they solved it, right? But if even Google can't solve this... what does it mean?

And then, at the same time, you know, sometimes we seem to have, kind of, intended consequences. I have another example here: Uber Greyball. I don't know if anyone heard about it. So Uber was very eager to change regulation and push their services globally as much as possible, kind of starting a fight with, you know, all the taxi laws and regulations, and taxi drivers in the various countries around the world. And what they realized, of course, is that they didn't really want people to be able to, like, investigate what they were doing or, like, find individual drivers. So they built this absolutely massive operation which was, like, following data in social media profiles, linking, like, your credit card and location data to find out if you were working for the government. And if you did, you would just never find a car. It would just not show up, right? And that was clearly intentional, all right. So at the same time they were pushing, like, on the lobbyism, political side to change regulation, while heavily manipulating the people that were pushing to change the regulation, right? Which is really not a very nice thing to do, I would say.

And... the thing that I find kind of worrisome about this, no matter if it's intended or unintended, is that it actually gets worse, right? The more and more systems we interconnect, the worse these consequences can get. And I have an example here: So this is a screenshot I took of Google Maps yesterday, and you notice there are, like, certain locations... they're kind of highlighted on this map, and I don't know if you knew it, but this map and the locations that Google highlights look different for every single person. Actually, I went again and looked today and it looked different again. So, Google is already heavily filtering and kind of highlighting certain places, like, maybe this restaurant over there, if you can see it.
And I would say, like, from just opening the map, it's not obvious to you that it's doing that. Or that it's trying to decide for you which place is interesting for you. However, that's probably not such a big issue. But the same company, Google with Waymo, is also developing this - and they just started deploying them: self-driving cars. They're still a good couple of years away from actually making it reality, but they are really - in terms of, like, all the others trying it at the moment - the farthest, I would say, and in some cities they started deploying self-driving cars. So now, just think like 5, 10 years into the future, and you have signed up for your Google self-driving car. Probably you don't have your own car, right? And you get in the car and it's like: "Hey, Yann, where do you want to go? Do you want to go to work?" Because, I mean, obviously that's where I probably go most of the time. "Do you want to go to your favorite Asian restaurant, like the one we just saw on the map?" Which is actually not my favorite, but the first one I went to, so Google just assumed it was. "Do you want to go to another Asian restaurant?" Because, obviously, that's all I like. And then McDonald's, because everyone goes there. And maybe the fifth entry is an advertisement. And you would say: Well, Yann, you know, that's still kind of fine, because I can still click on: "No, I don't want these 5 options, give me, like, the full map." But now we're back here. So, even though you are seeing the map, you're actually not seeing all the choices, right? Google is actually filtering for you where it thinks you want to go. So now the car, you know, this symbol of mobility and freedom that enabled so much change in our society, is actually reducing the part of the world that you see. And because - I mean, these days they call it AI, I think it's just machine learning - because these machine learning algorithms all do pattern matching and basically can just recognize similarities, when you open the map and you zoom in and you select a random place, it would only suggest places to you where other people have been before. So now the restaurant that opened around the corner, you'll probably not even discover it anymore. And no one will. And it will probably close. And the only ones that will stay are the ones that are already established now. And all of that without it being really obvious to anyone who uses the technology, because it has become kind of a black box.
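To make that last point concrete: a recommender that ranks places purely by how often other users have visited them can never surface a place with no history. What follows is a minimal, self-contained sketch of that failure mode - a toy popularity ranker with invented place names and visit counts, not Google's actual system.

```python
from collections import Counter

# Toy visit log: which places other users have been to (invented data).
visit_log = [
    "Asian Restaurant A", "Asian Restaurant A", "McDonald's",
    "Asian Restaurant B", "McDonald's", "Asian Restaurant A",
]

# The brand-new restaurant around the corner has no visits yet,
# so it never even enters the statistics.
all_places = set(visit_log) | {"New Corner Restaurant"}

def recommend(top_n=3):
    """Rank places purely by how often others visited them."""
    popularity = Counter(visit_log)
    # Places with zero visits get a count of 0 and sort to the bottom.
    ranked = sorted(all_places, key=lambda p: popularity[p], reverse=True)
    return ranked[:top_n]

print(recommend())
# ['Asian Restaurant A', "McDonald's", 'Asian Restaurant B']
# "New Corner Restaurant" is never shown, so it never gets visits,
# so it never becomes recommendable: a self-reinforcing loop.
```

Breaking that loop, for instance by occasionally showing places with no history at all, is a deliberate design decision; nothing in the ranking logic itself forces it.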
So, I do want self-driving cars, I really do. But I don't want a future like this. Right. And if we want to prevent that future, I think we have to first ask a very simple question, which is: Who is responsible for designing these products? So, do you know the answer?

Audience: *inaudible*

Yann: Say it louder.

Audience: We are.

Yann: Yeah, we are. Right. That's the really frustrating thing about it: it's actually us, right, as engineers and developers. You know, we are always driven by perfection. We want to create, like, the perfect code, solve this one problem really, really nicely, you know, chasing the next challenge over and over, trying to be first. But we have to realize that at the same time we are kind of working on frontier technologies, right, on things, technology, that are really kind of on the edge of the values and norms we have in society. And if we are not careful and just, like, focus on our small problem and don't look at the big picture, then we have no say on which side of the coin the technology will fall. And probably it will take a couple of years, or by that time we've already moved on, I guess.

So. It's just that technology has become so powerful and interconnected and impactful - because we are now building stuff that is not affecting like 10 or 100 people or a city, but literally millions of people - that we really have to take a step back and not only look at the individual problem as the challenge, but also at the big picture. And I think if we want to do that, we have to start by asking the right questions. And the first question of course is: What is good technology? So, that's also the name of the talk. Unfortunately, I don't have a perfect answer for that. And probably we will never find a perfect answer for that. So, what I would like to propose is to establish some guidelines and engineering processes that help us to build better technology. To kind of ensure - the same way we have quality assurance and project management systems and processes in companies - that what we build actually has a net positive outcome for society. And we call it the Good Technology Standard. We've kind of been working on that over the last year, and we really wanted to make it really practical.
And what we kind of realized is that if you want to make it practical, you have to make it very easy to use and also - what was actually surprising - mostly just ask the right questions. What is important, though, is that if you adopt the standard, it has to be in all project phases. It has to involve everyone: from, like, the CTO, to the product managers, to actually legal. Today, legal has this interesting role where you develop something and then you're like: Okay, now, legal, make sure that we can actually ship it. And that's what usually happens. And, yeah, down to the individual engineer. And if it's not applied globally and people start making exceptions, then of course it won't be worth very much.

Generally, we identified four main areas that we think are important for defining, in kind of an abstract way, if a product is good. And the first one is empowerment. A good product should empower its users. And that's kind of a tricky thing. So, as humans we have very limited decision power, right? And we are faced with, as I said before, this huge amount of data and choices. So it seems very natural to build machines and interfaces that try to make a lot of decisions for us, like the Google Maps one we saw before. But we have to be careful, because if we do that too much, then the machine ends up making all decisions for us. So often, when you develop something, you should really ask yourself: in the end, if I take everything together, am I actually empowering users, or am I taking responsibility away from them? Do I respect the individual choice? When they say "I don't want this", or they give you their preference, do we actually respect it, or do we still try to, you know, just figure out what is better for them? Do my users actually feel like they benefit from using the product? Actually, not a lot of people ask themselves that, because usually you think in terms of: Are you benefiting your company? And I think what's really pressing in that aspect is: does it help the users, the humans behind it, to grow in any way? If it helps them to be more effective, or faster, or do more things, or be more relaxed, or more healthy, right, then it's probably positive. But if you can't identify any of these, then you really have to think about it. And then, in terms of AI and machine learning: are we actually impacting their own reasoning so that they can't make proper decisions anymore?
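One way to read the "do I respect the individual choice" question in code: when a user has stated a preference explicitly, it should win over whatever the system has inferred about them. Here is a minimal sketch under invented names (`inferred_score`, `explicit_prefs` and the category labels are illustrative, not from any real product or from the standard itself).

```python
# Hypothetical ranking step: explicit user choices override inferred ones.

# What the model thinks the user likes (inferred from behaviour).
inferred_score = {"asian": 0.9, "burgers": 0.7, "vegan": 0.2}

# What the user actually told us, e.g. via a settings screen.
explicit_prefs = {"vegan": "love", "burgers": "never"}

def rank_categories():
    """Rank food categories, letting explicit choices dominate."""
    scores = dict(inferred_score)
    for category, choice in explicit_prefs.items():
        if choice == "never":
            scores[category] = float("-inf")   # hard exclusion, not just a penalty
        elif choice == "love":
            scores[category] = 1.0             # explicit preference beats inference
    return sorted(scores, key=scores.get, reverse=True)

print(rank_categories())  # ['vegan', 'asian', 'burgers']
```

The point is not the few lines of logic, but that an explicit "never" is honoured as a hard constraint instead of being quietly re-weighted away by the model.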
The second one is Purposeful Product Design. That's one that has been kind of a pet peeve for me for a really long time. So these days we have a lot of products that are kind of like this. I don't have something specifically against Philips Hue, but there seems to be this trend of, kind of, making smart things, right? You take a product, put a Wi-Fi chip on it, just slap it on there, label it "smart", and then you make tons of profit, right? And with a lot of these new products we've been seeing around us, everyone is saying, like, oh yeah, we will have this great interconnected future, but most of them are actually not changing the actual product, right? Like, the Wi-Fi-connected washing machine today is still a boring washing machine that breaks down after two years. But it has Wi-Fi, so you can see what it's doing when you're in the park. And we think we should really think more in terms of intelligent design. How can we design it in the first place so it's intelligent, not smart? So that the different components interact in a way that serves the purpose well. And the kind of intelligent-by-design philosophy is: when you start building your product, you try to identify the core purpose of it. And based on that, you just use all the technologies available to rebuild it from scratch. So, instead of building a Wi-Fi-connected washing machine, you would actually try to build a better washing machine. And if it ends up having Wi-Fi, then that's good, but it doesn't have to. And along each step you actually try to ask yourself: Am I actually improving washing machines here? Or am I just creating another data point?

And yeah, a good example for that is a watch. Of course it's very old technology, it was invented a long time ago. But back when it was invented, it was something you could have on your arm, or in your pocket in the beginning, and it was kind of a natural extension of yourself that kind of enhances your senses, because it's never in the way, you don't really feel it. But when you need it, it's always there, and you can just look at it and you know the time. And that profoundly changed how we humans actually worked in society, because we could now meet in the same place at the same time. So, when you build a new product, try to ask yourself: what is the purpose of the product, who is it for?
Often I talk to people and they talk to me for an hour about, like, literally the details of how they solved the problem, but they can't tell me who their customer is. Then, does this product actually make sense? Do I have features that distract my users and that I maybe just don't need? And can I find more intelligent solutions by kind of thinking outside of the box and focusing on the purpose of it? And then of course, what is the long-term product vision: where do we want this kind of technology I'm developing to go in the next years?

The next one is Societal Impact. That goes into what I talked about in the beginning, with all the negative consequences we have seen. A lot of people these days don't realize that even if you're, like, in a small startup and you're working on, I don't know, a technology, or robots, or whatever, you don't know if your algorithm, or your mechanism, or whatever you build, will be used by 100 million people in five years. Because this has happened a lot, right? So, already when starting to build it you have to think: if this product were used by 10 million, 100 million, maybe even a billion people, like Facebook, would it have negative consequences? Right, because then you get completely different effects in society, completely different engagement cycles and so on. Then: are we taking advantage of human weaknesses? This is arguably something that comes with today's technology. A lot of products these days kind of try to hack your brain, because we understand really well how, like, engagement and addiction work. So a lot of things, like social networks, have actually been focusing on that - also built by engineers, you know, trying to get a little number from 0.1% to 0.2% - which can mean that you just do extensive A/B testing and create an interface that no one can stop looking at. You just continue scrolling, right? You just continue, and two hours have passed and you haven't actually talked to anyone. And this attention grabbing is kind of an issue, and we can see that Apple actually now implemented Screen Time, and they actually tell you how much time you spend on your phone. So there are definitely ways to build technology that even helps you to get away from these. And then, for everything that involves AI and machine learning, you really have to take a really deep look at your data sets and your algorithms, because it's very, very easy to build in biases and discrimination.
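One practical way to act on that last point, sketched here under assumptions (the groups, labels and predictions below are invented toy data, and this is not a procedure prescribed by the standard): before shipping a classifier, compare its error rates per group rather than looking only at a single overall accuracy number.

```python
# Minimal bias check: compare error rates per group instead of one global accuracy.
# In practice these rows would be your model's predictions on a held-out set,
# with a sensitive attribute recorded per row.

samples = [
    # (group, true_label, predicted_label)
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 1),
]

def error_rates_by_group(rows):
    """Return per-group false-positive and false-negative rates."""
    stats = {}
    for group, truth, pred in rows:
        s = stats.setdefault(group, {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
        if truth == 1:
            s["pos"] += 1
            s["fn"] += (pred == 0)
        else:
            s["neg"] += 1
            s["fp"] += (pred == 1)
    return {
        g: {
            "false_negative_rate": s["fn"] / s["pos"] if s["pos"] else 0.0,
            "false_positive_rate": s["fp"] / s["neg"] if s["neg"] else 0.0,
        }
        for g, s in stats.items()
    }

for group, rates in error_rates_by_group(samples).items():
    print(group, rates)
# group_a: both rates 0.0; group_b: false_negative_rate is about 0.67.
# Overall accuracy (6 of 8 correct) hides that group_b is clearly worse off.
```

The numbers are made up; the point is that the per-group breakdown, not the aggregate metric, is what surfaces the kind of problem the Google Photos example above showed.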
And again, if it's applied to all of society, many people who are less fortunate, or more fortunate, or who are just different - you know, they just do different things - kind of fall out of the grid, and now suddenly they can't [unintelligible] anymore. Or use Uber, or Airbnb, or just live a normal life, or do financial transactions. And then, kind of what I said in the beginning: not only look at your product, but also, if you combine it with other technologies that are upcoming, are there certain combinations that are dangerous? And for that I kind of recommend doing, like, some sort of litmus test and just trying to come up with the craziest scenario that your technology could entail. And if it's not too bad, then it's probably good.

The next thing is sustainability. I think in today's world it really should be part of a good product, right? The first question is of course kind of obvious: Are we limiting product lifetime? Do we maybe have planned obsolescence? Or do we build something that is so dependent on so many services - which we're only going to support for one year anyway - that basically it will have to be thrown in the trash afterwards? Maybe it would be possible to add a standalone mode or a very basic fallback feature, so that at least the product continues to work. Especially if you talk about things like home appliances. Then, what is the environmental impact? A good example here would be, you know, cryptocurrencies, which are now using as much energy as certain countries. And when you consider that, just think: is there maybe an alternative solution that doesn't have such a big impact? And of course we still live in capitalism, it has to be economically viable, but often there are, often it's again just really small tweaks. Then of course: Which other services are you working with? For example I would say, like, as European companies - we're in Europe here - maybe try to work mostly with suppliers from Europe, right, because you know they follow GDPR and strict rules, instead of the US. Or check your supply chain if you build hardware. And then for hardware specifically - that's because we also do hardware in my company - I always found this interesting: we're kind of in a world where everyone tries to squeeze, like, the last little bit of money out of every device that is built, and often the difference between plastic and metal screws is like half a cent, right?
And at that point it doesn't really change your margins much. And maybe as an engineer, you know, just say no: you know, we don't have to do that, the savings are too small to redesign everything, and it will impact our quality so much that it just breaks earlier.

These are kind of the main four points. I hope that makes sense. Then we have two more, kind of, additional checklists. The first one is data collection. So really, especially in terms of, like, IoT, everyone focuses on collecting as much data as possible without actually having an application. And I think we really have to start seeing that as a liability. And instead try to really define the application first, define which data we need for it, and then really just collect that. And we can always start collecting more data later on. And that can really prevent a lot of these negative cycles we have seen, where you just have machine learning algorithms run on top of it, kind of unsupervised, and see what comes out. Then - and I also found this really interesting - many times a lot of people are so fascinated by the amount of data, right, they just try to have as many data points as possible. But very often you can realize exactly the same application with a fraction of the data points, because what you really need is, like, trends. And that usually also makes the product more efficient. Then: how privacy-intrusive is the data we collect? Right, there's a big difference between, let's say, the temperature in this building and everyone's individual movements here. And if it is privacy-intrusive, then we should really, really think hard about whether we want to collect it, because we don't know how it might be used at a later point. And then: are we actually collecting data without people realizing that they share it? Right, especially if we look at Facebook and Google: they're collecting a lot of data without really explicit consent. Of course at some point you, like, all agreed to the privacy policy, but it's often not clear to you when and which data is collected. And that's kind of dangerous. And kind of in the same way, if you build dark patterns into your app, they kind of fool you into sharing even more data.
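Going back to the "define the application first, then collect only what it needs" idea: a small sketch of what that can look like in practice. The policy structure, purposes and field names here are invented for illustration and are not part of the standard.

```python
# Hypothetical collection policy: every field must be tied to a declared purpose.

COLLECTION_POLICY = {
    # application / purpose  -> fields it actually needs
    "billing":                 {"user_id", "plan", "invoice_email"},
    "heating_optimization":    {"building_temperature", "timestamp"},
}

def collect(purpose: str, raw_event: dict) -> dict:
    """Keep only the fields declared for this purpose; drop everything else."""
    allowed = COLLECTION_POLICY.get(purpose)
    if allowed is None:
        raise ValueError(f"No declared application for purpose '{purpose}'")
    dropped = set(raw_event) - allowed
    if dropped:
        # Everything not declared up front is treated as a liability, not an asset.
        print(f"dropping undeclared fields: {sorted(dropped)}")
    return {k: v for k, v in raw_event.items() if k in allowed}

event = {
    "building_temperature": 21.5,
    "timestamp": "2018-12-27T12:00:00Z",
    "individual_location": (52.52, 13.40),   # privacy-intrusive and not needed
}
print(collect("heating_optimization", event))
# dropping undeclared fields: ['individual_location']
# {'building_temperature': 21.5, 'timestamp': '2018-12-27T12:00:00Z'}
```

Whether this lives in code, in a schema, or in a review checklist matters less than the effect: an undeclared field cannot silently start flowing.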
I have an example that someone told me yesterday. I don't know if you know Venmo, which is this American system where you pay each other with your smartphone, basically to split the bill in a restaurant. By default, all transactions are public. So, like, 200 million public transactions which everyone can see, including the description. So for some of the more, maybe, not-so-legal payments, that was also very obvious, right? And it's totally not obvious when you use the app that this is happening. So that's definitely a dark pattern that they're employing here.

And then the next point is User Product Education and Transparency. Is a user able to understand how the product works? Of course, we can't really ever have a perfect explanation of all the intricacies of the technology. But these days, for most people, almost all of the apps, the interfaces, the technology around them are a complete black box, and no one is really making an effort to explain it to them, while most companies advertise it like this magical thing. But that just leads to kind of this immunization, where you just look at it and you don't even try to understand it. I'm pretty sure that almost no one these days is still opening up a PC and looking at the components, right, because everything is a tablet, and it's integrated, and it's sold to us like this magical media consumption machine. Then: are users informed when decisions are made for them? So we had that in Empowerment, that we should try to reduce the amount of decisions we make for the user. But sometimes making a decision for them is a good thing to do. But then, is it transparently communicated? I would be totally fine with Google Maps filtering the points of interest for me if it would actually tell me that it's doing that, and if I could understand why it made that decision and why it showed me this place. And maybe also have a way to switch it off if I want. But today we seem to kind of assume that we know better for people, that we found the perfect algorithm that has the perfect answer, so we don't even have to explain how it works, right? We just do it and people will be happy. But then what we end up with is very negative consequences. And then - that's more like a marketing thing - how is it actually advertised? I find it, for example, quite worrisome that things like Siri and Alexa and Google Home are sold as these magical AI machines that make your life better and are your personal assistant, when in reality they are actually still pretty dumb pattern matching. And that also creates a big disconnect. Because now we have children growing up who actually think that Alexa is a person. And that's kind of dangerous.
And I think we should try to prevent that, because for these children it basically creates this veil, and the machine is humanized. And that's especially dangerous if the machine then starts to make decisions and suggestions for them, because they will take them as if a human made them.

So, these are kind of the main areas. Of course it's a bit more complicated. So we just published the standard today, in a first draft version. And it's basically three parts: a concise introduction, then the questions and checklists that you just saw, and then actually how to implement it in your company - which processes to have, at which point you basically should have kind of a feature gate. And I would kind of ask everyone to go there, look at it, contribute, share it with people. We hope that we'll have a final version ready in Q1, and that by then people can start to implement it.

Oh, yeah. So, even though we have this standard, I want to make it clear: having such a standard and implementing it in your organization, or for yourself or your product, is great, but it doesn't remove your responsibility, right? This can only be successful if we actually all accept that we are responsible. Right? If today I build a bridge as a structural engineer and the bridge breaks down because I miscalculated, I am responsible. And I think, equally, we have to accept that if we build technology like this, we also have to assume that responsibility.

And before we move to Q&A, I'd like to share this quote. This is Chamath Palihapitiya, former Facebook executive from the really early times. Around a year ago, when we actually started the GTC, he said this at a conference: "I feel tremendous guilt. I think in the back, deep recesses of our minds, we knew something bad could happen. But I think the way we defined it was not like this. It is now literally at a point where I think we have created tools that are ripping apart the social fabric of how society works." And personally - and I hope the same for you - I do not want to be that person who, five years down the line, realizes that they built that technology. So if there is one take-away that you can take home from this talk, it is to just start asking yourself: What is good technology, what does it mean for you? What does it mean for the products you build, and what does it mean for your organization? Thanks.
*applause*

Herald: Thank you, Yann Leretaille. Do we have questions in the room? There are microphones, microphones number 1, 2, 3, 4, 5. If you have a question, please speak loudly into the microphone, as the people in the stream want to hear you as well. I think microphone number 1 was the fastest. So please.

Question: Thank you for your talk. I just want to make a short comment first and then ask a question. I think with this last thing you mentioned, about offering users the option to have more control of the interface, there is also the problem that users don't want it. Because when you look at the statistics of how people use online web tools, only maybe 5 percent of them actually use that option. So companies remove them, because for them it seems like something that is not so efficient for the user experience. This was just one thing to mention, and maybe you can respond to that. But what I wanted to ask you was: all these principles that you presented, they seem to be very sound and interesting and good. We can all accept them as developers. But how would you propose to actually sell them to companies? Because if you adopt a principle like this as an individual, based on your ideology or the way that you think, okay, it's great, it will work. But how would you convince a company which is driven by profits to adopt these practices? Have you thought of this, and what's your idea about this? Thank you.

Yann: Yeah. Maybe to the first part first: that giving people choice is something that people do not want, and that's why companies removed it. I think if you look at the development process, it's basically like a huge cycle of optimization and user testing geared towards a very specific goal, right, which is usually set by leadership - like bringing engagement up, or increasing the user count by 200 percent. So I would say the goals were, or are today, mostly misaligned. And that's why we end up with interfaces that are a very certain way, right? If we set the goals differently - and I mean, that's why we have, like, UI and UX research - I'm very sure we can find ways to build interfaces that are just different, and still engaging, but also give that choice. To the second question: I mean, it's kind of interesting. So I wouldn't expect a company like Google to implement something like this, because it's a bit against that.
That is maybe more beside the point. But I've met a lot of, like, also high-level executives already who were actually very aware of the issues of the technology that they built. And there is definitely interest there, also more on the industrial side and so on - especially, it seems, around self-driving cars - to actually adopt that. And in the end I think, you know, if everyone actually demands it, then there's a pretty high probability that it might actually happen. Especially as workers in the tech field, we are quite flexible in the selection of our employer. So I think if you give it some time, that's definitely something that's very possible. The second aspect is that, if we look at something like Facebook, I think they overdid it. They optimized so far, and pushed the engagement machine - kind of triggering, like, your brain cells to never stop being on the site and keep scrolling - that people got too much of it. And now they're leaving the platform in droves. And of course Facebook will not go down, they own all these other social networks, but for the product itself, as you can see, long term it's not even necessarily a positive business outcome. And with everything we are advertising here, you can still have very profitable businesses, right, just by tweaking the right screws.

Herald: Thank you. We have a question from the interweb.

Signal Angel: Yes. Fly asks a question that goes into a similar direction. In recent months we had numerous reports about social media executives forbidding their children to use the products they create at work. I think these people know that their products are made addictive deliberately. Do you think your work is somewhat superfluous because big companies are doing the opposite on purpose?

Yann: Right. I think that's why you have to draw the line between intentional and unintentional. If we go to intentional things, like what Uber did and so on, at some point it should probably become a legal issue. Unfortunately we are not there yet, and usually regulation is kind of lagging way behind. So I think for now we should focus on, you know, the more unintentional consequences, of which there are plenty, and kind of appeal to the good in humans.

Herald: Okay. Microphone number 2, please.

Q: Thank you for sharing your ideas about educating the engineer. What about educating the customer, the consumer who purchases the product?

Yann: Yeah. So that's a really valid point. Right.
As I said, I think [unintelligible] like part of your product development, and the way you build a product should also be how you educate your users on how it works. Generally, we have a really big kind of technology-illiteracy problem. Things have been moving so fast in the last years that most people haven't really caught up, and they just don't understand things anymore. And I think, again, that's a shared responsibility, right? You can't just do that in the tech field. You have to talk to your relatives, to people. That's why we're doing, like, this series of articles and media partnerships, to kind of explain and make these things transparent. One thing we just started working on is a children's book. Because for children, like, the entire world just consists of these shiny glass surfaces, and they don't understand at all what is happening. But it's also the prime time to explain to them, like, really simple machine learning algorithms: how they work, how, like, filter bubbles work, how decisions are made. And if you understand that from an early age on, then maybe you'll be able to deal with what is happening in a better, more educated way. But I do think that is a very long process, and so the earlier we start and the more work we invest in that, the earlier people will be better educated.

Herald: Thank you. Microphone number 1, please.

Q: Thanks for sharing your insights. I feel like, while you presented these rules along with their meaning, the specific selection might seem a bit arbitrary. And for my personal acceptance and willingness to implement them, it would be interesting to know the reasoning, besides common sense, that justifies this specific selection of rules. So, it would be interesting to know if you looked at examples from history, or if you just sat down and discussed things, or if you just grabbed some rules out of the air. And so my question is: What influenced you in the development of these specific rules?

Yann: It's a very complicated question. So, how did we come up with this specific selection of rules and also, like, the main building blocks of what we think good technology should be? Well, let's say first what we didn't want to do, right. We didn't want to create, like, a value framework and say: this is good, this is bad, don't do this kind of research or technology. Because this would also become outdated, and it doesn't apply to everyone.
We probably couldn't even agree on that in the expert council, because it's very diverse. Generally, we tried to get everyone at the table. And we talked about issues we had - like, for example, me as an entrepreneur, when I was developing products with our own engineers; issues we've seen in terms of public perception; issues we've seen, like, on a more governmental level. Then we also have, like, cryptologists in there, so we looked at that as well. And then we made a really, really long list and kind of started clustering it. And a couple of things did get cut, but generally, based on the clustering, these were kind of the main themes that we saw. And again, it's really more of a tool for yourself as a company, for developers, designers and engineers, to really understand the impact and evaluate it. Right, that is what these questions are aimed at. And we think that for that they do a very good job.

From microphone 1: Thank you.

Herald: Thank you. And I think microphone number 2 has a question again.

Q: Hi. I was just wondering how you've gone about engaging with other standards bodies that perhaps have a wider representation. It looks largely, from your team or the council currently, like there's not necessarily a lot of engagement outside of Europe. So how do you go about getting representation from Asia, for example?

Yann: No, at the moment you're correct: the GTC is mostly a European initiative. We are in talks with other organizations who work on similar issues and regularly exchange ideas. But, yeah, we thought we should probably start somewhere, and Europe is actually a really good place to start a societal discourse about technology and the impact it has, and also to have change. I think, compared to, for example, Asia or the US, where there is a very different perception of privacy and technology and progress and, like, the rights of the individual, Europe is actually a really good place to do that. And we can also see things like the GDPR regulation, which actually... it's kind of complicated, but it's also kind of a big step forward in terms of protecting the individual from exactly these kinds of consequences. Of course, though, long term we would like to expand this globally.

Herald: Thank you. Microphone number 1 again.

Q: Hello. Just a short question. I couldn't find a donate button on your website. Do you accept donations? Is money a problem? Like, do you need it?
Yann: Yes, we do need money. However, it's a bit complicated, because we want to stay as independent as possible. So we are not accepting project-related money. So you can't, like, say "we want to do a certain research project with you"; it has to be unconditional. And the second thing we do is, for the events we organize, we usually have sponsors that provide, like, venue and food and logistics and things like that. But that's, ... for the event, and again, they can't, like, change the program of it. So if you want to do that, you can get in contact with us. We don't have a mechanism yet for individuals to donate. We might add that.

Herald: Thank you. Did you think about Patreon or something like that?

Yann: We thought about quite a few options, yeah. But it's actually not so easy to not fall into the trap that, like, organizations in this space have fallen into, where, like, Google at some point sweeps in and it's like: Hey, do you want all this cash? And then very quickly you have a big conflict of interest. Even if you don't want that to happen, it starts happening.

Herald: Yeah, right. Number 1 please.

Q: I was wondering how you unite the second and third points in your checklist. Because the second one is intelligence by design, and the third one is to take into account future technologies. But companies do not want to push back their technologies endlessly to take into account future technologies. And on the other hand they don't want to compromise their own design too much.

Yann: Yeah. Okay. Okay. Got it. So you were saying that if we always have to stop at these, like, future scenarios and the worst case and everything, and incorporate every possible thing that might happen in the future, we might end up doing nothing, because everything looks horrible. For that I would say: we are not, like, technology haters, we are all from areas working in tech. So of course the idea is that you just take a look at what is there today and try to make an assessment based on that. And the idea is, if you take up and implement the standard, that over time, when you add new major features, you look back at your assessment from before and see if it changed. So the idea is you kind of create a snapshot of how it is now, and this document that you end up with as part of your documentation kind of evolves over time, as your product changes and the technology around it changes as well.
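That "snapshot that evolves with the product" idea can be made quite mechanical. Here is a hypothetical sketch - the file name, fields and threshold are invented, not prescribed by the standard - of keeping the assessment next to the code and letting a small check act as the kind of feature gate mentioned in the talk, refusing a release when major features shipped without an updated assessment.

```python
import json, sys
from datetime import date

# Hypothetical assessment snapshot kept in the repository, e.g. assessment.json:
# {
#   "last_reviewed": "2018-10-01",
#   "open_questions": ["Does the new recommender respect explicit user choices?"],
#   "major_features_since_review": ["voice assistant"]
# }

MAX_AGE_DAYS = 180  # invented threshold: re-review at least twice a year

def feature_gate(path="assessment.json"):
    """Fail the release pipeline if the impact assessment is stale or incomplete."""
    with open(path) as f:
        a = json.load(f)
    age = (date.today() - date.fromisoformat(a["last_reviewed"])).days
    problems = []
    if age > MAX_AGE_DAYS:
        problems.append(f"assessment is {age} days old")
    if a.get("major_features_since_review"):
        problems.append("major features shipped without a new assessment: "
                        + ", ".join(a["major_features_since_review"]))
    if a.get("open_questions"):
        problems.append(f'{len(a["open_questions"])} checklist questions still open')
    if problems:
        print("feature gate failed:\n  - " + "\n  - ".join(problems))
        sys.exit(1)
    print("feature gate passed")

if __name__ == "__main__":
    feature_gate()
```

Run in CI, something like this forces the conversation to happen before a release rather than after, which is the point of having the gate at all.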
525 00:47:57,390 --> 00:48:02,789 Q: So thanks for the talk and especially the effort. Just to echo back the 526 00:48:02,789 --> 00:48:07,430 question that was asked a bit before on starting with Europe. I do think it's a 527 00:48:07,430 --> 00:48:14,010 good option. What I'm a little bit worried about is that it might be the only option. It might 528 00:48:14,010 --> 00:48:19,569 become irrelevant rather quickly, because it's easy to do, it's less hard to 529 00:48:19,569 --> 00:48:26,220 implement, maybe, in Europe now. Okay, the question is: it might work in Europe now, 530 00:48:26,220 --> 00:48:30,960 but if Europe doesn't have the same economic power, it cannot weigh in as much 531 00:48:30,960 --> 00:48:36,549 politically with, let's say, China or the US and Silicon Valley. So will it still be 532 00:48:36,549 --> 00:48:41,390 possible and relevant if the economic balance shifts? 533 00:48:41,390 --> 00:48:52,329 Yann: Yes, I mean, we have to start somewhere, right? Just saying "Oh, the 534 00:48:52,329 --> 00:48:59,040 economic balance will shift anyway, Google will invent the singularity, and that's 535 00:48:59,040 --> 00:49:02,039 why we shouldn't do anything" is, I think, one of the reasons why we actually got 536 00:49:02,039 --> 00:49:07,730 here. It's this assumption that there is, like, this really big picture 537 00:49:07,730 --> 00:49:14,490 that is kind of working against us, so we all do our small part to fulfill that 538 00:49:14,490 --> 00:49:20,779 kind of evil vision by not doing anything. I think we have to start somewhere, and I 539 00:49:20,779 --> 00:49:26,780 think, having operated for one year, we have been actually quite successful so far 540 00:49:26,780 --> 00:49:31,690 and we have made good progress. And I'm totally looking forward to making it a bit 541 00:49:31,690 --> 00:49:35,769 more global and to start traveling more. I think we had, like, one event outside Europe 542 00:49:35,769 --> 00:49:40,330 last year, in the US, and that will definitely increase over time, and we're 543 00:49:40,330 --> 00:49:46,450 also working on making our ambassadors more mobile and kind of expanding 544 00:49:46,450 --> 00:49:50,310 to other locations. So it's definitely on the roadmap; it's not like, yeah, we're just 545 00:49:50,310 --> 00:49:54,030 staying here. But yeah, you have to start somewhere and that's what we did. 546 00:49:54,030 --> 00:50:01,809 Herald: Nice, thank you. Number 1 please. Mic 1: Yeah. One thing I haven't found was 547 00:50:01,809 --> 00:50:08,420 how all those general rules you formulated fit into the more general rules of 548 00:50:08,420 --> 00:50:16,390 society, like the constitutional rules. Have you considered that and it's just not 549 00:50:16,390 --> 00:50:25,319 clearly stated yet and will be stated, or did you develop them more from the bottom up? 550 00:50:25,319 --> 00:50:33,470 Yann: Yes, you are completely right. So we are defining the process and the questions 551 00:50:33,470 --> 00:50:39,330 to ask yourself, but we are actually not defining a value framework. The reason for 552 00:50:39,330 --> 00:50:42,809 that is that societies are different; as I said, there are widely different 553 00:50:42,809 --> 00:50:48,260 expectations towards technology, privacy, how society should work, and all 554 00:50:48,260 --> 00:50:53,799 of that.
The second one is that every company is also different, right; every 555 00:50:53,799 --> 00:50:58,240 company has its own culture and things they want to do and things they don't want 556 00:50:58,240 --> 00:51:04,640 to do. If, for example, we had put in there "You should not 557 00:51:04,640 --> 00:51:08,220 build weapons" or something like that, right, that would mean that all the 558 00:51:08,220 --> 00:51:12,950 companies that work in that field couldn't even try to adopt it. And while I don't want 559 00:51:12,950 --> 00:51:17,029 them to build weapons, maybe in their value framework that's OK, and we don't want to 560 00:51:17,029 --> 00:51:21,069 impose that, right. That's why I said in the beginning: we're called 561 00:51:21,069 --> 00:51:24,730 the Good Technology Collective, but we are not defining what "good" is, and I think that's 562 00:51:24,730 --> 00:51:28,780 really important. We are not trying to impose our opinion here. We want others to 563 00:51:28,780 --> 00:51:33,750 decide for themselves what is good, and we can support them and guide them in 564 00:51:33,750 --> 00:51:36,299 building products that they believe are good. 565 00:51:36,299 --> 00:51:44,599 Herald: Thank you. Number two. Mic 2: Hello, thanks for sharing. As 566 00:51:44,599 --> 00:51:51,710 engineers we always want users to spend more time using our product, right? But 567 00:51:51,710 --> 00:51:58,990 I'm working at a mobile game company. Yep. We are making a world where 568 00:51:58,990 --> 00:52:05,539 users love our product. So we want users to spend more time in our game, so we may 569 00:52:05,539 --> 00:52:13,510 make a lot of money, yeah, but when users spend time playing our game they may lose 570 00:52:13,510 --> 00:52:19,549 something. Yeah. You know. So how do we think about the balance in a game, a mobile 571 00:52:19,549 --> 00:52:24,910 game? Yeah. Yann: Hmm. It's a really difficult 572 00:52:24,910 --> 00:52:32,470 question. So the question was, like, specifically for mobile gaming: where's 573 00:52:32,470 --> 00:52:38,490 kind of the balance between trying to engage people more and, yeah, basically 574 00:52:38,490 --> 00:52:44,510 making them addicted and having them spend all their money, I guess. I personally 575 00:52:44,510 --> 00:52:53,880 would say it's about intent, right? It's totally fine to have a business model where 576 00:52:53,880 --> 00:52:58,119 you make money with a game. I mean, that's kind of good, and people do want 577 00:52:58,119 --> 00:53:08,750 entertainment. But if you actively use, like, research into how, you know, 578 00:53:08,750 --> 00:53:14,750 the brain actually works and how it gets super engaged, and if you basically 579 00:53:14,750 --> 00:53:18,540 build in, like, gamification and lotteries, which a lot of them, I think, have 580 00:53:18,540 --> 00:53:21,829 done, where basically your game becomes a slot machine, right, you always want to 581 00:53:21,829 --> 00:53:28,270 see the next crate opening and see what you got, kind of making it a 582 00:53:28,270 --> 00:53:32,651 luck-based game, actually. I think if you go too far in that direction, at some 583 00:53:32,651 --> 00:53:36,280 point you cross the line.
Where that line is you have to decide yourself, right; 584 00:53:36,280 --> 00:53:40,060 some of it could be a good game dynamic, but there are definitely some games 585 00:53:40,060 --> 00:53:44,700 out there where, I would say, there is quite good reason to say that they pushed it quite 586 00:53:44,700 --> 00:53:48,099 a bit too far. And if you actually look at how they did it (because they wrote about 587 00:53:48,099 --> 00:53:52,730 it), they actually did use very modern research and very extensive testing to 588 00:53:52,730 --> 00:53:58,180 really find out all these patterns that make you addicted. And then it's not 589 00:53:58,180 --> 00:54:02,260 much better than an actual slot machine. And that is probably something we don't want. 590 00:54:02,260 --> 00:54:08,140 Herald: So it's also an ethical question for each and every one of us, right? 591 00:54:08,140 --> 00:54:10,750 Yann: Yes. Herald: I think there is a light and I 592 00:54:10,750 --> 00:54:13,500 think this light means the interwebs has a question. 593 00:54:13,500 --> 00:54:21,589 Signal angel: There's another question, from ploy, about practical usage, I guess. 594 00:54:21,589 --> 00:54:25,199 Are you putting your guidelines to work in your company? You said you're an 595 00:54:25,199 --> 00:54:29,880 entrepreneur. Yann: That's a great question. Yes, we 596 00:54:29,880 --> 00:54:37,569 will. So we kind of just completed them and there was kind of a lot of work to get 597 00:54:37,569 --> 00:54:41,740 there. Once they are finished and released, we will definitely be one of the first 598 00:54:41,740 --> 00:54:47,910 adopters. Herald: Nice. And with this I think we're 599 00:54:47,910 --> 00:54:50,440 done for today. Yann: Perfect. 600 00:54:50,440 --> 00:54:54,049 Herald: Yann, people, warm applause! 601 00:54:54,049 --> 00:54:55,549 *applause* 602 00:54:55,549 --> 00:54:57,049 *postroll music* 603 00:54:57,049 --> 00:55:19,000 subtitles created by c3subtitles.de in the year 2020. Join, and help us!