Privacy for App Design – Seneca FNT

We tend to think of privacy for app design as something scary. But it’s actually a great opportunity.

Privacy is an opportunity for us to help people feel safety and trust.

Privacy for App Design: Focusing on Fintech

Seneca College FNT – Privacy Management & Identity Theft
Pre-Recorded for 23 March 2022, 18:00 UTC-4

This guest lecture turns Privacy by Design principles into practical techniques for app designers in fintech. We discover what opportunities arise when we embrace privacy by design.

What We Cover

  • Key privacy mindsets to shift
  • Privacy rights we all enjoy 
  • Design patterns for privacy rights
  • Privacy principles for digital apps & products

Video


Downloads


👉 Download the slides here (PDF)

Ethical Design Checklist (PDF)

References

Special thanks to Saskia van Manen 🙏🏼

Transcript

Good evening, everyone. Hi. Michelle, thank you so much for inviting me to this guest lecture for the Financial Technology Program at Seneca College. I’m incredibly honored to be here.

So, without further ado, let’s just dive right in, shall we? The first thing I’d like to do is invite us all to take a breath and take a moment… to slow down and reflect on a time when we felt perfectly safe and in complete trust. Feels nice, right?

Privacy by design, designing for privacy, is an opportunity for us to help the people using our apps, our services, our products, to actually experience that feeling of safety and trust.

This is especially important now that technology has become so intimately interwoven in our daily lives, especially FinTech, because giving people this sense of security and safety and trust, and actually providing it for real, makes our positive impact on their lives that much more meaningful.

So, my name’s Brian Pagán. It’s wonderful to meet you.

Hello. I’ve been in UX for about 20 years now. Apparently not long enough to learn how to work PowerPoint, but whatever.

I got into privacy for mobile apps around 2013 at Philips, where I worked together with an IT business developer and one of our legal consultants to create the company’s first privacy guidelines for mobile app design. Beyond that, I have a psychology background, and I founded the Greatness Studio in 2016, which is my platform not only for providing UX workshops, consultancy, and coaching, but also for doing advocacy work like this around ethical design and privacy.

As part of that, I’m also working together with Michelle to update Dr. Cavoukian’s Privacy by Design paper. So, what are we gonna talk about today? You’re already familiar with the paper and the principles and everything, so I’m not gonna get into that.

We’re going to talk today about very practical things. First, we’re gonna start with a mindset. So, I would like to try to inspire two major mindset shifts around privacy with you. After that, we’re gonna get into the main part, let’s say the main course of our presentation today, and we’re gonna talk about the different rights around privacy that people enjoy and the design patterns that we can use in our mobile apps or other apps to accommodate those rights. And as a dessert, we will talk about further principles, so practical principles that we can apply in our work with Privacy by Design.

Okay, let’s get started. The first mindset shift I’d like to talk about is that privacy isn’t scary. We don’t have to think about privacy as some kind of obligation or something that’s negative. Really, we can think about it as an opportunity. It’s an opportunity to help people feel that sense of safety and trust that I mentioned before, but on another note, it’s also a way for us to help, for example, the companies that we work with to avoid fines and bad press, or to help our products or services differentiate from the competition, like Apple’s doing now, for example.

And from an ethical standpoint, it’s good for us to be able to help protect people’s human rights. Now, this is one that I wanna talk about a little bit more, because let’s not forget that as creators, as designers, as people who build things for other people, we are part of the world too.

So, we might be the designers or the creators of whatever we’re working on, but we’re the consumers of everything else around us in the world. So, the more privacy-forward things we put into the world, the more privacy-forward our world becomes.

So beyond that, another mindset shift I’d like us to think about today is this: instead of thinking of privacy in terms of how much data we can possibly get from people, let’s start thinking in terms of how much we really need. It’s almost like asking how much we can get away with not collecting.

This is the principle of data minimization, and I really like it because it’s a super simple way to solve a lot of the problems we have around data collection. Another thing about data minimization is that it’s important for us to stay critical of the project we’re working on: not just think about minimization at the beginning, but keep thinking about it at every stage and iteration of the development process, and even after we’ve launched, of course.
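To make that concrete, here is a minimal sketch in TypeScript of what minimization can look like at the schema level. The field names and endpoint are hypothetical; the point is that the sign-up payload carries only what the service genuinely needs, and everything else is never collected in the first place.

```typescript
// Data minimization at the schema level: the sign-up payload declares only
// the fields the service actually needs. (Endpoint and names are illustrative.)
interface SignUpPayload {
  email: string;       // needed: account identifier and recovery
  displayName: string; // needed: shown in the app
  // NOT collected: date of birth, phone, address, gender. No feature needs them yet.
}

async function signUp(payload: SignUpPayload): Promise<void> {
  // Only the declared fields ever leave the device.
  await fetch("https://api.example.com/signup", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```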

For example, let’s look at all these instant messenger data-collection disclosures. Why does Facebook Messenger need health and fitness info? I don’t understand. And what does “other data” even mean? For me, that’s a little bit scary.

So, that was brief about mindsets. Let’s get into the main course, rights and patterns. So, as you know, legally and ethically, all of us have rights under privacy law. I’m not gonna talk about all nine of these right now, or one, two, three, four, five, sorry, six. I’m also not so good at math lately.

But in any case, I’m not going to talk about all six of these at once, because I’m going to handle them one by one and talk about design patterns that can help us accommodate each of these rights. So, the first one is obviously the right to object.

And y’all are familiar with consent, I understand that, but what I’d like to cover here are the different ways we can actually design for consent. Because consent sounds very simple, but it’s a little more nuanced than we typically get taught.

So for example, you’re all already familiar with the concepts of opting out and opting in. But when we think about what implicit and explicit consent mean, or how we design for them, I’d like to look at these examples from the web. As you can see on the left side, implicit consent is basically when you’re already at a place, or already doing a thing, and it lets you know that data are being collected in some way, but you don’t get an explicit option.

Explicit consent, on the other hand, is where we need to complete an extra step: an explicit action we have to take in order to provide that consent. This is especially important when a form is created for one purpose, like sign-up, but also wants you to agree to something else, like subscribing to a newsletter.

And I also wanna highlight one crucial thing here: consent does not count unless it is informed. A person cannot agree to something if they do not understand what they’re agreeing to. This is on us. As the creators of whatever app or service we might be building, it’s our responsibility to make sure people understand what they’re agreeing to. But don’t worry, there are plenty of ways to do this, which we’ll cover shortly.

But before we do that, I just want to make one small but crucial distinction in the kinds of data we collect and how we need to deal with them. So, we’re all familiar with personal data, right? That’s just data that can identify who we are: an email address, your name, or an ID number you’ve been assigned to identify you within a service, or maybe across the web.

These things aren’t necessarily super sensitive, because basically all these bits of data tell a person, a company, or an organization is who you are: that you are you, and that someone else with the same or a similar name is someone else.

Sensitive data, however, make up their own category. And the reason sensitive data are considered sensitive is because they are fuel for discrimination. Sensitive data are any kinds of data someone can use to figure out your sexual orientation, your gender, your sex, your age, your religion, your political affiliations, or whether you’re a member of a union, for example.

All of these kinds of information or data that lead to these kinds of information can be used to discriminate or to segment unfairly, or can be used for political ends. And when I spoke about human rights earlier, sensitive data are key to that kind of stuff. So, let’s think about really quickly what that means.

If we take all that stuff together, thinking about the concepts of opt-in, opt-out, explicit and implicit consent, sensitive data and personal data, I do wanna say that legally speaking, and ethically as well, personal data are okay to be collected in a scenario with implicit consent, so opt-out.

For example, this cookie banner, right? If we think about a website that shows us a cookie banner, and in this case it’s a cookie banner with just one little button that says okay, I don’t have any choices here. Basically, this website is already tracking me by the time that I actually get to see this cookie banner.

And the reason that’s okay is because the cookies this website is using, I hope (I didn’t actually research this particular website), are most likely tracking cookies that allow the website, or advertisers, to track me and my identity across the different pages of the site itself, and hopefully not around the whole web.

But they wanna see where I as one individual navigate around their website, so they save a little identifier on my computer. That doesn’t tell them anything about my gender, or my sex, or my sexual orientation, or my ethnicity, my race, or anything like that. It just tells them that the person who clicked through this series of pages is me and not someone else.

One place where this becomes a bit tricky is camera surveillance. It might not be super relevant for you as a fintech designer, but it’s something I want us to think about, because the physical world and the digital world are becoming very, very intertwined. And I’m not sure if you heard about this, but the United States Internal Revenue Service, our tax service, recently piloted a program using facial identification on their website, even if you just wanted to do very simple tasks regarding your own taxes. And of course, that’s very bad. It’s sensitive data, and it can be very, very badly misused.

But I don’t wanna make it all doom and gloom and very negative here, so let’s try to keep it light. What struck me is this sign from the Stockholm Museum of Modern Art. A friend of mine and I went there a few weekends ago, and this sign basically tells you that there are cameras surveilling you.

The issue with this is, the reason why it’s an opt-out sign, the reason why it’s implicit consent and not explicit, is because this sign appears after you’ve actually bought your ticket and entered the museum itself. So, like the cookie banner on the website, the tracking is already happening by the time I have reached the point where I can see the notification of that tracking.

That distinction is the key. Let’s look at a different example. This isn’t a privacy example per se, but it gives a great illustration of what opt-in, explicit consent looks like. This modal overlay actually asks permission before the action it wants to perform is performed. And what I also really like is that it tells me the action it wants to take, installing a driver, but it also tells me why: it wants to install a driver in order to share sound.

And it gives me a little bit of extra context. It’s only gonna take a few seconds and I only have to do it once. I really like the way that this is presented, because not only do I have an escape hatch, I can click on cancel to dismiss it and not install the driver, but it also allows me to make a more informed decision. I know that it’s only gonna take a few seconds, so I’m not gonna have to sit here for hours in front of my computer waiting for this thing to install. You know what I mean?

So keeping that in mind, this is the kind of affordance and feed-forward we should be giving people whenever we ask them to choose whether or not they consent. Another pattern is these toggles; checkboxes work here too. We see these a lot in forms.

Like I mentioned earlier with the sign-up form and the newsletter subscription, or the terms-of-use consent. The thing about opting in versus opting out, explicit versus implicit, is this: if the toggle is already set to on, to the “I accept” position, before I put it there, it becomes implicit consent, because it would take an extra action for me to turn it off. That’s why we call it opt-out.

Whereas the opt-in version of the same pattern is simply when the switch starts turned off. If I have to actually click to turn it on and activate it, that makes it opt-in, which makes it explicit consent.
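To make the difference concrete, here is a minimal sketch of an opt-in toggle in TypeScript, assuming a plain web form; the purpose string and copy are placeholders. The whole pattern hinges on one line: the control is never pre-checked.

```typescript
// Opt-in (explicit) consent: the control defaults to OFF, so consent only
// exists after the person takes an explicit action.
interface ConsentRecord {
  purpose: string;   // what the person is consenting to
  granted: boolean;
  timestamp: string; // when the choice was made, for your records
}

const consentLog: ConsentRecord[] = [];

function renderOptInToggle(container: HTMLElement, purpose: string): void {
  const checkbox = document.createElement("input");
  checkbox.type = "checkbox";
  checkbox.checked = false; // the crucial line: never pre-checked

  checkbox.addEventListener("change", () => {
    consentLog.push({
      purpose,
      granted: checkbox.checked,
      timestamp: new Date().toISOString(),
    });
  });

  const label = document.createElement("label");
  label.append(checkbox, ` I consent to ${purpose}`);
  container.append(label);
}
```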

So, pop quiz. Let’s say we have the sign-up screen for some kind of piggy-bank app, and this is what I see: the switch is already turned on, saying that I consent to my transaction history being analyzed. Is this okay?

It actually isn’t. Here we see the first appearance of our friend Deceptive Devil. This little devil icon is gonna appear on screen whenever a deceptive pattern is shown. And this is a deceptive pattern, because transaction histories can tell a huge amount about a person: where they pay dues for a club, maybe, or what kind of medicines they’re buying.

You can tell if someone has some kind of chronic disease, you can tell what kind of sex toys they might be buying, or for whom or with which counterparties they might be having a lot of transactions. So you can see some relationships there.

Transaction history is extremely rich in information about a person, which is why transaction history data are sensitive data and we need explicit consent here. That’s why this is not okay.

On the other hand, what about this? It’s a newsletter form and there’s no checkbox, or a switch. Is this all right?

In this case, it actually is. The reason why is because the form itself has only one purpose, and that is to subscribe to the newsletter.

So in that case, there is no need for a separate opt-in, because filling in the form itself is the explicit action that shows a person consents to subscribing to that newsletter. One small caveat: if you’re going to do something like this, make sure you show a link to the privacy notice. Here’s another one.

Another pattern I really like to use, in connection with a short form privacy notice which we’ll talk about later, is a privacy options screen. So basically, any time we allow a person to go through an onboarding flow and they start using an app that we’re creating for them, whether it’s a web app or a mobile app, legally, and also ethically, we have to give them the option to be able to retract their consent if they need to, or if they want to.

And you can see that one of the items here, basic information, isn’t selectable, because the basic information is needed for the app to work at all. So if someone doesn’t wanna share those data, then they really have to either delete their account or just stop using the app.

But what’s important here is that we give people options for the extra things. And it sounds very easy: just add some checkboxes and switches, some toggles, to let someone consent or withdraw consent from a thing. But it also means we need to design the app in such a way that it still works if someone opts out of those extra, additional types of data collection, if that makes sense.

So the app needs to be able to work with basic information alone. Some features might be locked or unavailable if a person doesn’t want to share certain kinds of data, and we need to keep that in mind whenever we’re structuring the app. Try to make it modular, so that someone who either never gives consent, or first gives and then retracts consent for optional things, can still have a meaningful experience that delivers on the value proposition of that app or service.
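As a rough sketch of that modularity, assuming hypothetical purpose and feature names, consent gating might look something like this:

```typescript
// Consent-gated features: "basic" is required for the app to function;
// optional purposes degrade gracefully when consent is absent or retracted.
type DataPurpose = "basic" | "transactionHistory" | "preciseLocation";

const consents = new Map<DataPurpose, boolean>([
  ["basic", true],               // required for the app to work; not toggleable
  ["transactionHistory", false], // optional, off until explicitly granted
  ["preciseLocation", false],    // optional, off until explicitly granted
]);

function hasConsent(purpose: DataPurpose): boolean {
  return consents.get(purpose) ?? false;
}

function spendingInsights(): string {
  // The feature locks itself instead of silently collecting data anyway.
  if (!hasConsent("transactionHistory")) {
    return "Spending insights are off. You can enable transaction analysis in Privacy Options.";
  }
  return "…insights rendered from transaction data…";
}
```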

Actually, I just ran into an example of this with PowerPoint. Microsoft Office does this because certain features require it to download things from the internet. There’s a new feature where you can insert stock icons, and those come from a Microsoft online service, which means my local PowerPoint copy needs to make a connection with a server on the internet, and it needs my permission and consent to do that.

But if I take away that consent, that feature just isn’t available, and whenever I click the button to call it up, I get a prompt saying, essentially: we understand you wanna do this, but we need your consent to be able to do it. Does that make sense?

So how do we ask for those kinds of consent? I will talk about that in a second, but first I want to disambiguate something. We all are probably familiar with these kinds of permission dialogues. These are not the same as collecting consent, and here’s why. A consent question in an app is a conversation between your app and the person who’s using it. It’s direct between you and them.

But a permissions dialogue like this is actually a conversation between the person and their device about you, or about your app in this case. So it’s definitely not enough to incorporate this kind of alert into your app’s flow and consider it consent, from a legal standpoint or, really, an ethical one either. And there are much richer, nicer, more fun ways to ask people for this kind of consent, if you really want to.
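One of those richer ways is sometimes called a pre-permission prompt: have the consent conversation in your own words first, and only then trigger the OS dialogue. Here is a sketch, where showInAppExplanation is a stand-in for a properly designed in-app modal:

```typescript
// Ask in the app's own voice first; only trigger the device's permission
// dialogue after the person has agreed in-app.
async function showInAppExplanation(message: string): Promise<boolean> {
  // Placeholder: a real app would render a designed modal, not window.confirm.
  return window.confirm(message);
}

async function requestCameraWithConsent(): Promise<boolean> {
  const agreed = await showInAppExplanation(
    "We use your camera to scan receipts. Images are processed on your device."
  );
  if (!agreed) return false; // respect the in-app "no"; the OS dialogue never appears

  try {
    // Now the person/device conversation happens, with context already given.
    await navigator.mediaDevices.getUserMedia({ video: true });
    return true;
  } catch {
    return false; // permission denied at the device level
  }
}
```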

So, we’ve covered the right to object. Let’s talk now about the right to notice. This is what I meant when I said we can ask people for consent at different parts of the journey: it helps if we can tell them why we need their data. The right to notice entitles people, the data subjects, to know which data we collect from them, why we need them, who can access those data, and how they can have their data changed or deleted.

And what I was kinda building up to the whole time is this thing called a short form privacy notice. Now, I didn’t invent this. It comes from years and years of legal precedent in data protection law, but this is, let’s say, my version of the design pattern. A short form privacy notice shows all of those interesting types of information about the relevant data being collected, or that we want to collect.

It tells us what we’re trying to collect, why we wanna collect it, who can access it, how people can have it changed or deleted, and it also has a link to the full privacy notice. And the reason why a short form privacy notice is so nice is because it gives us the opportunity to actually speak with someone about the data that we wanna collect in a human, chill, less formal, non-legalese way, if that makes sense.
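As a rough sketch, you can even model the short form privacy notice as a small data structure, so every consent ask carries the same fields. The field names here are mine, not any standard:

```typescript
// One possible shape for a short form privacy notice entry.
interface ShortFormPrivacyNotice {
  dataCollected: string;  // what we want to collect, in plain language
  purpose: string;        // why we need it
  accessibleBy: string[]; // who can access it
  changeOrDelete: string; // how to have it changed or deleted
  fullNoticeUrl: string;  // link to the full privacy notice
}

const example: ShortFormPrivacyNotice = {
  dataCollected: "Your transaction history",
  purpose: "To show you monthly spending insights",
  accessibleBy: ["You", "Our analytics service (pseudonymized)"],
  changeOrDelete: "Settings → Privacy Options, any time",
  fullNoticeUrl: "https://example.com/privacy-notice",
};
```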

I mentioned this very briefly earlier, but I wanna say it extremely explicitly right now: the choice is not allowed to be all or nothing, especially when we want to collect sensitive data. It’s actually a legal obligation, but it’s also a moral one, and frankly a commercial one. If we wanna serve people well, it’s important for us to provide them a meaningful experience and give them the value proposition they expect from us, without forcing them to opt in to all the different kinds of data we want to collect.

So please, if you’re going to use this kind of pattern, make sure there’s a way people can meaningfully not opt in and still have a good experience with your service or app. I’ll also say really quickly where we can put this thing. I mentioned a moment ago that we can include a screen or an overview inside the service or app itself where we can manage consent after the fact. So anytime I wanna retract some consent I’ve given, I can go into my settings or under my profile and find the options to do so.

The short form privacy notice is intended for the moment when you first want to ask for consent to collect certain kinds of data, so that we can give people context about what we’re gonna do with those data, why we need them, and why it behooves them to give those data to us, and allow them to opt in if they want to, and look at the full privacy notice if they want to as well.

So, depending on your specific context, it might be interesting to do this at the beginning of the journey, or as part of potentially an onboarding process or maybe an extended onboarding process.

But what I also like to do is split things up. So, let’s say transaction history and precise location are two types of data we wanna collect, next to the basic data we’re collecting anyway.

And let’s say there are two other kinds of data we wanna collect as well. In the beginning, to make sure someone starts with a rich, full experience, it would be great if we could see the transaction history and precise location. So we show this short form privacy notice up front, and then maybe a shorter, smaller one later, when it comes time to ask for a specific other type of data, the other two we haven’t asked for here.

And that way you spread out the asking of consent to the moment when it’s actually relevant, if that makes sense. The reason that’s good is that for a person using an app like this, it makes more sense to decide about giving this consent in the context of why this consent is important.

So for example, if I wanna take a photo and the app wants to ask whether it can analyze my photos, why should it ask me before I’ve actually hit the button to take a photo? It’s like the Microsoft Office example I talked about earlier: it doesn’t ask for consent for that stuff at the beginning. It only asks once I’ve indicated that I wanna use the specific feature the consent is required for, if that makes sense.

So, short recap about the short form privacy notice. We can have it at the beginning when we are onboarding people to our app, and we can also pepper them in between different parts of the journey to ask for data that are relevant to that specific context.

The next thing I wanna mention really briefly: you’ll notice that I haven’t been saying privacy policy, I’ve been saying privacy notice. And, ‘um, actually,’ the thing we always read, which people talk about as the privacy policy, is the privacy notice. The privacy policy is the behavior and governance underlying that privacy notice.

So in that sense, you can say that a privacy notice describes a privacy policy, but it’s not the policy itself. This is a very nitpicky thing, but I see it everywhere. So, I just felt like I would talk about it here.

The next right that people enjoy, we included, is the right to access our data. This one is, from a design standpoint, actually relatively easy to accommodate. As you can see here in this book value app, there’s an export button that gives a number of options to be able to export the data to different kinds of formats.

So, if we think in terms of dos and don’ts, definitely do let people export their data into useful formats, especially if they’re human readable. But what we don’t wanna do, and here’s Deceptive Devil again, don’t make people have to contact a person in order to download their data.

I’ve seen a lot of apps and services that say, if you’d like to have an export of your stuff, please email blah, blah, blah, blah. That’s not cool, okay?

What is cool is including, in any data dump or export, any algorithmic processing or logic that relates to the person downloading the data. Take Facebook: I would really love to see what kind of algorithms were being run on the profile I don’t have anymore, ’cause I’ve deleted my Facebook. That’s one of the places I’d love to see it, because algorithms nowadays determine lots of things for us.

For example, whether we should be eligible for a loan, or, if we’ve been convicted of a crime, what kind of sentence we should be getting. Those are both examples of ways people are being judged by algorithms in the world today. Sorry to bring it down.

One last thing that Deceptive Devil loves to do that’s really bad is making people pay to access their data. If anyone does this to you, if you try to export your data from any service and they make you pay for it, it’s illegal. You have the right to have access to your data without paying for it. So, I don’t know what to do in that situation, but in any case, they’re in the wrong, also legally speaking.
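Putting the dos for this right into practice is genuinely small. Here is a sketch of a self-serve, in-browser export, assuming the person’s records are already loaded; no emailing support, no paywall:

```typescript
// Self-serve data export: human-readable JSON, downloaded directly in the
// browser, free of charge.
function exportMyData(records: object[], filename: string): void {
  const blob = new Blob([JSON.stringify(records, null, 2)], {
    type: "application/json",
  });
  const url = URL.createObjectURL(blob);
  const link = document.createElement("a");
  link.href = url;
  link.download = filename;
  link.click();
  URL.revokeObjectURL(url);
}

// Usage: exportMyData(myTransactions, "my-data.json");
```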

Another one is the right to rectification. This is our right to change data that are inaccurate or incomplete, or to fix data that are broken. So, in designing our services, very much do let people make appropriate changes to the data in your service or app.

However, the tricky thing here is blockchain. Anything that’s on a blockchain is immutable and cannot be changed. So, be careful which data you actually put on chain and which data you store off chain.

This is, in my opinion, kind of a shortcoming of blockchain, or at least a shortcoming in the way a lot of people use the technology itself.
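One common workaround, offered as a general pattern rather than advice for any specific chain, is to keep the record itself off chain, where it can be corrected or erased, and anchor only a hash of it on chain:

```typescript
// Keep the personal record off chain; anchor only a SHA-256 hash on chain
// to prove integrity without exposing (or freezing) the data itself.
// Uses the standard Web Crypto API.
async function hashRecord(record: object): Promise<string> {
  const bytes = new TextEncoder().encode(JSON.stringify(record));
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Off chain: store and update the record in a normal database.
// On chain: publish only hashRecord(record), so a correction just means
// storing the corrected record and anchoring its new hash.
```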

Another thing is, Deceptive Devil, don’t make people contact someone to make changes, if you can help it. Obviously I understand if you’re working in a startup and development time is very tight, and you don’t have the resources or the dev capacity to be able to create a system to make it work independently. Okay, I get it. But don’t try to get away with this for too long, ’cause it’s not cool.

Another right that we have is the right to portability. And it sounds very funny. I don’t know why, but I just like the way it sounds: the right to portability. In any case, the best practice here is, again, to let people export their data in standardized file formats. This is a little different from the right to access, because there, there’s more freedom in which formats we can export the data in.

Here, for the right to portability, standardization is core to the do and don’t. Definitely do give people their data in whatever format is relevant for that type of data; if there’s some kind of standard, that’s really great. And in this case, it’s more important that the exported format is machine readable than human readable, if that makes sense.

So for example, if I want to export my data from one accounting service and import it into another one, a PDF is not going to help me. A PDF will allow me to look at the data on my own and maybe have an understanding of let’s say, my business’s bookkeeping, but it’s not going to allow me to upload and import those data into another service.

This is why, in the example here, I’ve shown CSV (comma-separated values), XML, and JSON. These are all formats that are very machine readable, common formats people use to import data from one service into another.
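As a sketch of why these formats matter, here is a tiny CSV serializer in TypeScript. The quoting is what keeps commas inside values from breaking the import on the other side:

```typescript
// Serialize records to CSV, a machine-readable format another service can
// import. Every value is quoted, with embedded quotes doubled per the CSV
// convention.
function toCsv(rows: Record<string, string | number>[]): string {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (v: string | number): string =>
    `"${String(v).replace(/"/g, '""')}"`;
  return [
    headers.map(escape).join(","),
    ...rows.map((row) => headers.map((h) => escape(row[h] ?? "")).join(",")),
  ].join("\n");
}

// Example: two transactions in an importable, standard format.
console.log(
  toCsv([
    { date: "2022-03-01", amount: -42.5, description: "Groceries, weekly" },
    { date: "2022-03-02", amount: 1500, description: "Salary" },
  ])
);
```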

And Deceptive Devil over here only lets people download some kind of proprietary format that works solely with their own product. That’s not cool, and I think it’s also illegal. But in any case.

This is another right we talk about a lot, and another one where we have to be a bit careful with blockchain: the right to erasure, or the right to be forgotten. It means we need to be able to delete our accounts and have our data deleted, insofar as is legal. I understand that banks, for example, have legal responsibilities to keep hold of financial information for a number of years, based on local laws. But in any case, it’s very important that people are able to delete their accounts and their data and have them removed.

And again, blockchain being immutable as it is (which is one of the wonderful things about it), we can’t erase whatever’s on chain. So this is another one of those times where, if we’re working on a project that uses blockchain technology, we need to be very careful and very intentional about which data we store on chain and which data we store elsewhere, off chain. If it’s off chain, we have control over it. If it’s on chain, it’s immutable: we cannot erase it and we cannot change it. That’s part of the point. But this is where there’s a bit of conflict and tension between the technology and the legal rights we all enjoy.

For this example, I wanna talk about StrongLifts 5×5. By no means does that mean I’m working out regularly; I’m not. It’s been a long time since I actually used this app, but I love it because it’s so well designed, and they let you delete your account right from the app itself. It’s super simple to find, it’s right there. But I do wanna talk about one more thing, another little nuance.

So, I see a lot of apps and a lot of services that make it incredibly difficult either to find, or to carry out the action to delete the account. Don’t make it too hard. That’s not cool. That’s Deceptive Devil all right.

But, having said that, it is helpful, or it can be helpful to add a little bit of additional friction so that people don’t mistakenly delete their account. It’s a very destructive action, especially for someone who has been collecting or been working with the data for a long time.

And if it’s a data set that’s been collected over a long period, it’s heart-crushing to lose it all at once by accident. So, try to prevent mistakes whenever you possibly can, while still letting people delete their data when they want to.
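Here is a sketch of that kind of helpful friction, where window.prompt stands in for a properly designed confirmation screen and the copy is illustrative:

```typescript
// Helpful friction: deletion stays easy to find, but a deliberate typed
// confirmation guards against accidental, irreversible loss.
function confirmAccountDeletion(): boolean {
  const typed = window.prompt(
    'This permanently deletes your account and all your data. Type "DELETE" to confirm.'
  );
  return typed === "DELETE";
}

function deleteAccount(): void {
  if (!confirmAccountDeletion()) {
    console.log("Deletion cancelled. Nothing was removed.");
    return;
  }
  // ...call the backend here to erase the account and its data...
  console.log("Account deletion requested.");
}
```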

Last thing I’ll say: avoid at all costs putting sensitive data on an immutable blockchain. Please don’t do that, especially where those data are not anonymized. Anonymization is also one of these myths: it’s very easy to de-anonymize data, especially when we combine them with other data sources. I understand that what I’m saying isn’t very nuanced and the situation is a bit more complex. But really, if you can avoid it, leave any sensitive information or data off chain. Leave it off the chain. Let it be erased if it needs to be erased.

Think about how Grindr was busted for sharing people’s HIV statuses with third-party analytics companies. I know that doesn’t have anything to do with blockchain. But just imagine if it were on a blockchain: an HIV status, or any kind of health data, somewhere unerasable, where anyone could look at it at any time and connect you with a specific chronic disease, something that could potentially make you unemployable to certain people.

This is where human rights and privacy really intersect. So I just wanna say, I don’t wanna make it all like hardcore, but just be careful. Be cool, that’s all I’m asking, all right? Okay, so, are you still awake? I hope so. We’re almost there, we are almost done.

That was the main course, and now on to dessert. It’s gonna be very short I promise you, and we’re almost through it. So, stay with me.

Principles! Let’s go back to the overarching principle here: privacy is about safety and trust. It’s an opportunity for us to help people, to provide them with safety and help them feel that trust. That’s essential to being the caretakers and facilitators that we are in people’s lives these days.

Furthermore, let’s make sure that we minimize the data that we collect. Don’t collect anything we don’t need, and try as much as possible to keep taking things away. If you can stop collecting data that you don’t need, or if you can put things in place that delete data as soon as you don’t need them anymore, that’s really good. Another thing, another principle, is that we need to keep people informed. And I think this one overlaps a bit with Dr. Cavoukian’s paper.

People really need to understand what it is they’re consenting to: the potential risks, what they’re actually doing, and what they’re agreeing to when they give us their consent and enter into a relationship with us that requires this exchange of data, especially sensitive data.

And finally, let’s keep people in control. People need to be able to erase or change data when they need to or want to. And it’s on us to make it as fluid and painless as possible to make that process happen. It is a legal right after all.

But basically, if you don’t remember anything out of this whole video, out of this whole talk or presentation… If nothing else sticks but this, there’s one thing that I would love for you to remember. And it’s this.

Empower people, never exploit them.

It’s that simple. Thank you very much.

Chi Miigwech! I really appreciate being here. And I’m honored that I get to speak with you today. Or this evening, I should say. Thank you!