Roisi Proven
Transcript
[00:00:15]
Thank you, everyone who's here for my talk. I appreciate you taking the time, and welcome to Fucon as well, and thanks to them for inviting me. So, my name's Roisi Proven. I'm Director of Product at Altmetric, which is a company that helps researchers, institutions, publishers and corporate bodies track the attention around scholarly outputs. I wanted to talk to you a little bit today about the bad things that can happen when we don't pay enough attention to the products that we're building, and how we can avoid some of these pitfalls. I think that's especially important in this day and age.
[00:01:02]
So, I think we can all agree that, for the most part, we are building products to make things better. As product managers, as people who deliver software, as people who enable the delivery of software, we're here because we want to make things better. Unfortunately, we keep seeming to make things just a little bit worse. This subject has jumped the shark a little bit in 2020. I last gave this talk around a year ago, and the world is not the world that I gave this talk in last time. As I was looking for examples for this slide of things getting worse, one of the things that really drove home to me how important it is to be more aware of this stuff happened when I was looking for a clipping around Facebook's part in the genocide in Myanmar. So I Googled "Facebook genocide", as you do, you know, lighthearted Googling, and realized that I had to specify the country in which the genocide took place, because just saying "Facebook genocide" wasn't specific enough, because they've been complicit in more than one. And that really said to me that,
[00:02:10]
this is very important, and we should all be constantly aware of the bad things that can happen when we don't pay enough attention. And I really do think that a lot of the time it isn't malice; it's just not having the right team and not being aware enough of your environment. It doesn't happen all at once. It's not like Mark Zuckerberg was sitting at university thinking, you know what, I'm going to build an app that invades privacy and leads to death. I don't think that thought went through his head at the time. Instead, it was a dozen little steps that might have seemed productive or altruistic, but that weren't actually taking them in a good direction. So, you know, it starts with: I'd quite like to meet people on campus, so I should make an app that's fairly innocuous, easy to understand and doesn't seem too scary. And then it's: well, I don't want my users to pay directly, so I'll advertise to them so that they can use it for free. That sounds really cool, we're making the internet free for people to use, free for people to socialize within. And then: if I collect more information about them, the adverts will be more relevant. That's another cool thing I can do, another way I can serve my users.
[00:03:27]
And then: this app is for real people, we don't want pretend people, so we'll make users use their legal name. What could be wrong with that? And it turned out that each one of these steps, whilst not a problem in isolation, was actually damaging part of the community that they were claiming to serve. It's got to the point now where they've grown to such a scale, and they have such an outsized influence on our communities, that it's really quite scary. I think that if we don't become more conscious, we will really, really struggle in this day and age to keep things going. It really is that important.
[00:04:07]
But if the bad thing isn't on purpose, how do we avoid it? If it's not something that we're consciously doing, how do we stop it? For me, the answer is to run a Black Mirror workshop on your product. I'm going to go into a little bit more detail on each step of the workshop. I also have a Miro template; my contact details will be at the end of this presentation, and if anyone is interested in running this in your own organization, I will happily give you that template. I want as many people to benefit from this as possible. The reason that I framed this as, basically, a horror episode is that part of the hard thing about thinking about the bad that can happen is that it's not nice to do. By wrapping it in a fiction, and by abstracting yourself from all the bad things that you're discussing, you allow yourself a little bit more freedom to be more aggressive with the negative things that you're thinking about and with the hard questions that you're asking yourself. There are three core rules for running a Black Mirror workshop. First, nothing is out of bounds, no matter how unpalatable. It's really easy to sit there and say, oh, that would never happen, because nobody does anything that badly. But when we're talking about genocide, about stalking, about systemic injustice, it doesn't matter how unpalatable it seems, it's still possible. Second, forget what you know about the people around you. As I said before, a lot of this doesn't come from malice; it comes from not asking the right questions and not asking them often enough. When you look at your team, you will most likely see a group of people who are your friends or your colleagues, people that you like and don't think badly of. You need to forget the goodness in those people and focus on the potential badness of the whole world at a macro level. Third, you should be looking for real-life parallels.
I'll show you some more a little later on; earlier, I showed you that whole slide of news clippings. It is alarmingly easy to find examples of hideous abuses happening as a result of a lack of attention to our products and to how they might harm people. And above all, it won't work without trust. We're talking about really harsh realities, things like systemic racism, acknowledging our privilege and figuring out how to step past it, and understanding how we could hurt people with that privilege. It won't work if you don't have high trust within your team. So I wouldn't recommend doing this with a team that is already struggling with any sort of dysfunction, because it could be really harmful. You're going to be touching on some quite sensitive topics, you're going to be asking some quite hard questions of one another, and you're going to be digging deeply into the harm that could be caused. That's not going to work if you're already feeling uncomfortable with each other. So it's really, really important that, as a group and as a team, you trust each other. So, how do you run a Black Mirror workshop? The outcome is that you're going to have a Black Mirror episode where the central antagonist is your product. You're going to start with your true north. Before doing this, it's really important to make sure that you have an understanding of what your product is, so that you can figure out what it isn't. Whilst it's really important to run this exercise and be critical and pragmatic and, honestly, a little bit negative, it's more important to start with positivity, because otherwise you won't be able to see the wood for the trees; you'll just get stuck in what-ifs. So it's really important to define what your product is and what it's going to do for the communities that you are hoping to serve.
On the left-hand side, you can see that I have done an example. I've made up a product rather than picking a real one, because it felt a bit unfair to inflict this on a company that actually exists without asking them. I did that previously; I think better of it now. So any similarity to any product, living or dead, is purely coincidental. The product in question for this Black Mirror example is Walk With Me, a social media app that allows dog owners to meet other dog owners in their area for puppy play dates. The most innocent and innocuous sentence I could think of at the time. It's for pets lacking socialization due to lockdown, difficulty making friends as an adult, needing something in common, and safety concerns around walking alone in dangerous areas. And then you think about the opportunities you can create. It's all of the usual social media things: creating a community, finding similarities between people, and also collecting data on people, because, as we all know, data is one of the most valuable commodities on the internet, and collecting it is a viable and valuable way to monetize your product. And so our target segments for Walk With Me are dog owners, pet care brands and dog breeders. It's not just the owners using the app; it's also the people using the data from the app to market to those owners. So this is our true north. This is what we think our product is going to do, what opportunities it's going to create and what problems it's going to solve. You've got your positive, altruistic, supportive view of things. And then you're going to immediately darken your landscape. You should be looking both in your domain and for bad things that have happened to your intended customer base. You can see here we have some examples of Facebook being used in dog theft, and Gumtree being used in dog theft, trading and breeding.
And also look at apps that track your location. Strava had a feature that meant if you ran past a complete stranger, it would automatically add you to a running group with that person and share your running data, which means exposing where you live. It also meant exposing things like military bases: you could see running routes that identified where a military base was. So there were some really serious privacy concerns.
[00:10:52]
In a remote workshop, this task is best done asynchronously, in part because it takes a little bit of time, and also because it's quite hard work. Again, you are leaning into the darker side of life and into the bad things that can happen, and you have to acknowledge the mental toll that can take on your team. It's not nice to think about bad things all the time. So it's best to give people a bit of time to go away, look for things, and collect them over time. Then you have your landscape: the things that have already happened.
[00:11:32]
So now you're going to look at the worst possible outcomes.
[00:11:37]
At this point, it's really important not to be afraid of hyperbole. The reason is that sometimes it can be tempting to hold back a little, because it's like, well, my product could never cause a death, or my product could never cause a theft.
[00:11:53]
And that is possibly true. It may very well be that it doesn't cause any of those things. But what you're looking for is the worst possible outcome, and then the steps that lead to it, because if you avoid the first step, you will never get anywhere near it. It's important to think really hard about the worst possible outcome so you can avoid the little paper-cut steps that take you toward it. It's also really important at this point to invite diverse perspectives. If you are sitting in a room with people who all look like you, you will miss things. That isn't because you're bad people, it isn't because you're malicious; it's just because you don't have the lived experiences necessary to contribute them. That can mean looking at social media for diverse perspectives on your area. I've spent a lot of time trying to make sure that I have a Twitter feed, for instance, of people who do not look or sound like me, people who make me think really hard about my place in the world and sometimes make me feel uncomfortable about it. This is really important. Also, hire people, hire consultants, and pay them for their time to help you uncover these issues. I'm not talking about a diversity and inclusion consultant; I'm talking about a product expert or a domain expert who can help you through this process, and you pay them for that time, because that diverse perspective is completely invaluable. A really good example is an email platform that launched and that everyone was really into. It was very full-featured, very set up for business, and one of the things it had was the ability to tell when someone had opened your email: it would tell you when they'd opened it and where they'd opened it from.
That seemed like a great idea at the time, but what it turned out to do was send someone's location to a person they hadn't given permission to share that location with. In the statement they made when they removed this feature, they said: we simply didn't understand or think about the harm it could cause. And that was because they weren't inviting enough diverse perspectives in to ask these questions. In the template, I've also included a number of seed questions to help you kick off and start thinking about some of this stuff, and that's where some of these post-its come from. People could use it to kill dogs, people could use it as a kidnapping lure, people could use it for dog breeding, people could use it to organize a dog fight. These are all really aggressive and extreme things that could happen. But the point is, you're thinking about how bad it can get so that you get nowhere near there.
[00:14:41]
And then step four. It's strange to say it's the fun step, because this is a very serious thing, but at this point it is okay to have a bit of fun with it, because again, you are being deliberately hyperbolic and deliberately extreme, and you're wrapping the problem in that fiction so that you can abstract yourself from it and accept it more readily. There are two ways you can do this. You can either do a storyboard, which is a visual way of mapping out your episode, or you can do a beat sheet. A beat sheet is something used quite commonly in scriptwriting, because genres tend to have similar story beats no matter the film. An action film, for example, has a structure that is predictable, and you can write to that structure. So I've included a horror beat sheet; it seemed apt. You can use this to seed some of the smaller steps that it would take to get to the worst outcome. And I think it's okay to have a bit of humor here, because confronting the horror of some areas of modern life with humor is valuable for our collective sanity. You can probably tell from my delivery style that I prefer to approach things with humor, a bit of sarcasm, and a big dollop of pragmatism and realism.
[00:16:12]
And so our episode for this one is Some Dogs Go to Hell, which is a really corny title, but, you know, I was writing this quite late at night. It takes us through the experience of a user called Sally. Sally has a Bichon Frise. She really wants to make friends near her house, and so she joins the app. And we discover that someone lurking in the background seems to be particularly interested in Sally's Bichon Frise, and so through the story, you start seeing this person pop up. Importantly, Sally doesn't do the wrong thing. She doesn't go into the basement without the torch; we're not having people do silly things to get into trouble. People are doing the right things, and they're still getting into trouble. So she meets up with another woman. There's a feature on the app that gives you the midpoint between two users' houses so you can find the park nearest to both of you. Massive privacy issue. And as we go through the story, things get more sinister. The person that Sally's walking with disappears. Her dog turns up dead, and then finally Sally's Bichon Frise gets stolen from her, and she gets severely injured in the process. This is not something that could never happen; this is something that has happened, just not in the context of this app. As I said, it's very extreme, but what we're thinking about here is: take the feature of the midpoint between two houses. What risks does that pose to vulnerable individuals? What could allow someone to abuse it or use it for nefarious purposes? It's about triggering those points of contemplation as you're building your product. And then finally, you've got step five. You've darkened your landscape, you've understood what you're doing, and you've storyboarded your Black Mirror episode. You could film it if you want; that might be fun.
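To make the midpoint risk concrete, here's a minimal sketch (my own illustration, not anything from the talk's template) of why sharing a midpoint between two homes effectively shares both homes: anyone who knows one endpoint can reflect through the midpoint to recover the other.

```python
# Coordinates are treated as flat (x, y) pairs for simplicity; a real app
# would use latitude/longitude, but the attack works the same way.

def midpoint(a, b):
    """The hypothetical 'meet in the middle' feature: halfway point
    between two users' home coordinates."""
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def recover_other_home(mid, known_home):
    """An attacker who knows their own home and the shared midpoint
    can reflect through the midpoint to find the other user's home."""
    return (2 * mid[0] - known_home[0], 2 * mid[1] - known_home[1])

attacker_home = (3.0, 4.0)
victim_home = (9.0, 12.0)  # the location the app never meant to disclose
mid = midpoint(attacker_home, victim_home)

print(recover_other_home(mid, attacker_home))  # -> (9.0, 12.0)
```

The point of the sketch is that "we only share a midpoint, not an address" is no mitigation at all against a user of the app itself; the feature leaks exactly the data it appears to protect.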
[00:18:14]
What that should get you is your true south. You know what your product is, but now you also know what it explicitly isn't. And you get your definition of done badly: questions that will stay with you throughout your development process. A good practice when you're developing something is, at the point where you're moving it into the done column, to ask yourselves: what does done mean for us? This expands that slightly and asks: what does done badly mean for us? It prompts us to ask ourselves certain questions, such as: what makes it easier to see what breeds are in an area? Could someone use that to aggregate a view of where breeds are, in order to target them? Could a vulnerable dog owner be put at risk by this feature? And it will trigger you to answer those questions. The answer might be: yes, it could, but we will mitigate the risk by X. So it's not necessarily a stopping point for these features; it's just a point at which to release them more consciously. And if you decide to pivot after running the workshop, if you run it and think, the risk of bad things here is so high that I cannot in all good conscience take it, then pivot. Run the workshop again and see where you get to. It's important to refresh this the same way that you would refresh your product strategy, so it keeps you on the right path. So by the end of this, you should have your storyboard or your beat sheet for your Black Mirror episode. You should have your true south, the direction that you absolutely, positively do not want to go in.
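As a rough sketch of how a "definition of done badly" review could be made routine, here is one possible shape for it in code. The talk doesn't prescribe any tooling, and the questions, field names, and pass/fail rule below are all my own invention, purely to show the idea: every live risk either gets a named mitigation or blocks the release.

```python
from dataclasses import dataclass

@dataclass
class RiskQuestion:
    """One 'done badly' question asked of a feature before release."""
    question: str
    at_risk: bool          # does this feature create the risk?
    mitigation: str = ""   # how we reduce it, if we ship anyway

def done_badly_review(questions):
    """A feature passes review only if every live risk has a mitigation.
    Returns (passed, list_of_unmitigated_questions)."""
    unmitigated = [q.question for q in questions
                   if q.at_risk and not q.mitigation]
    return (len(unmitigated) == 0, unmitigated)

# Hypothetical checklist for the Walk With Me breed-map feature.
checklist = [
    RiskQuestion("Could someone aggregate a view of where breeds are, to target them?",
                 at_risk=True,
                 mitigation="Show breed counts per city only, never per street"),
    RiskQuestion("Could a vulnerable dog owner be put at risk by this feature?",
                 at_risk=True),  # no mitigation yet, so this blocks the release
]

passed, open_risks = done_badly_review(checklist)
print(passed)       # -> False
print(open_risks)   # the one question still lacking a mitigation
```

The design choice mirrors the talk: a "yes, it could" answer isn't automatically a stopping point, but shipping without writing down the mitigation is.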
[00:19:55]
And you should have your definition of done badly, which allows you to apply some rigor to your processes and to continuously think about this. As you're iterating on your product, you're also iterating on the ethics surrounding your product.
[00:20:13]
So I would now pass this to you.
[00:20:17]
You can build whatever you want to; the world is our oyster these days. But should you be building it? What's the worst that can happen if you build it? Look around you at the US election and the pandemic, at, just, 2020.
[00:20:38]
And figure out what's the worst that could happen. Go crazy. How do you protect your vulnerable users? You also need to decide personally where your line is, and whether your company crosses it, because if you can't change where you work, you should change where you work. This is something that I haven't followed closely enough in the past, and I definitely feel like I will be following it more closely in future. Some places will be aligned with your ethics and what you believe in; some will not be, and never will be. It's important for you to define where that line is. I have a quite short list of companies that I wouldn't work for, for all the money in the universe. But there are some companies where I would happily say, you know what, I'm going to muck in, I'm going to get my hands dirty, I'm going to do good here, where other people would just draw the line. So it's different for everyone. And I really believe that this kind of product thinking is critical if we're to turn a corner in our industry and start really caring for our users rather than just being obsessed with them.
[00:22:05]
So, thank you ever so much for your time. I'm aware it's a little on the short side, but I have a habit of machine-gun Scottishing. My Twitter and LinkedIn are here; if you're interested in running the workshop yourself, you're more than welcome to get in touch and ask for the template, and I will send it over. Thank you all very much.