You Got This!

When Moving Too Fast Really Does Break Things

Summary

Whilst a demand to ship as quickly as possible can seem like a productive way to work, it can actually cause a great deal of harm. Moving too fast can hurt businesses and hurt individuals. In this talk, Vidhika advocated for working proactively rather than just reactively.

Why moving too fast can cause harm:

  • When creating products, we have a seat at the table that others don't. Don't doubt your power to inform the future of your product and the uses it may unlock.
  • Zoombombing is a good example of this - the original default settings of Zoom led to many traumatic experiences through this harmful use of its product.
  • Features intended for convenience can still produce harmful outcomes - we can be blinded by good intentions.

How to be proactive in reducing harm in the long run:

  • Ensure your data sets, sample sets, and your team are diverse enough. A wider range of issues can be accounted for through a diverse range of experiences and data.
  • Remember that design features created for good always have the possibility of causing harm. An example is Facebook Memories - designed to remind users of joyful occasions, but it has also ended up reminding users of personal crises they previously faced.
  • Carry out both usability and abusability testing.
  • Always think of the worst-case scenario. ‘Black Mirror Brainstorm’ your product: if an episode of Black Mirror was created about your product, exactly what could happen?
  • If risks are identified, ask yourself whether each is an isolated risk or whether it could cause more harm in the long run and lead to even greater issues.

When you really do have to move fast:

  • In some cases, moving slowly may cause even more harm. However, we can still plan to mitigate risk.
  • If you do need to move fast, remain clear about the risks this carries, and inform your product's users of what might be missing at this stage.
  • Plan to return to the product at a later date, so you can think through possible worst-case scenarios and address them properly when you have the time.
  • John Wooden said: “If you don't have time to do it right, when will you have time to do it over?”

Transcript

My name is Vidhika, and I'm a UX manager. I work very closely with a lot of different functions, and one of the things I see as a very common ask is this idea that we have to move fast - a mantra we are all used to. Moving fast really isn't supposed to break things; the intentions are always good: let's not obsess about perfection, let's just keep some momentum. It's really great in theory, but I've noticed that it is a little bit outdated, because sometimes when we move too fast, we really can break things - and not just break things, we can hurt people. So my argument here is that I don't think we always have to break some things.

Sometimes, we can predict what might go wrong and stop it from happening in the first place. Today, I want to talk to you about examples of how not thinking ahead has broken things, and how we can use our power and privilege to proactively prevent that in what we build. So, why does this matter? Well, I'm going to share a little comic with you first of all.

As you can tell, moving too fast is not always the right approach. It's not always the best one, and it doesn't work in all contexts. Not everything can break and still be okay, right?

So an example on this list is a surgeon, right. You wouldn't want your surgeon to be doing an MVP surgery. If your surgeon decided, I'm just going to move fast, don't worry, you would be very concerned, because you hired your surgeon for the opposite reason: you want them to take their time and make sure they're doing their due diligence, and your loved ones would feel the same way.

That's obviously an extreme example, and you might be thinking, well, tech is different, right. We're not surgeons; we don't have people's lives in our hands quite so literally. And while that is true, a lot of the work that even, let's say, a surgeon is doing often relies on technologies that some of us are going to be helping build.

At the end of the day, some things just aren't worth breaking. It is plain and simple: moving too fast can hurt individuals, can hurt businesses, and it can even hurt society as a whole. There are a lot of ripple effects involved.

We are going to talk a little bit about some of the specifics. Before I go any further, I want to make a couple of notes. One is that intention is not equal to impact. I know that we never mean to make things that are bad, right? I think many of us got into this field because we want to build things that help people.

Just because something isn't built to cause harm doesn't mean it can't still cause harm. There can be unintended consequences, so we need to be mindful of separating our intentions from the potential impact we might have. Secondly, foresight is better than hindsight.

We don't want to be just reactive but proactive in trying to prevent bad consequences. Obviously, we can't figure out everything in advance. There are some things we're going to have to learn the hard way, but there are some things we can get ahead of if we give our work a little bit of thought and deliberation, and try to account for things beyond just the happy path. Finally, inaction is essentially being complicit, right? If we're not part of the solution, we are sort of part of the problem.

We are really lucky in the sense that we get to be privy to things that are early stage. We get to build a lot of the tools that the world runs on. And we have a voice and a seat at the table that many others don't, so it is really important that we use our privilege for good. This is a nuanced and complex topic, and it could be a whole different talk, but I understand there are times when we can't speak up - times when we don't have the leeway to say something that we think deserves more time or should be done a different way, because we have a non-negotiable deadline, or a boss telling us that we don't have a choice, and we don't want to lose our jobs.

I'm not suggesting that we put ourselves in bad situations. I think it's like the saying goes, right, you want to put your mask on first before helping someone else, and I'm realising that that is an expression that used to refer to aeroplanes and nowadays people think of Covid. You get the gist: you want to make sure you're taking care of yourself first.

A lot of the time, the tiniest actions we take, even on our own little piece of the pie - the work we are directly responsible for - can have a really big impact, so don't doubt your power. If there is one thing I want you to take away from this, it is that. There are a lot of different things that can go wrong any time we build something: things get used differently, things happen that we couldn't have guessed. But a couple of categories come to mind. One is abuse, right? There are sometimes bad actors who are problematic, simply using what we build, finding vulnerabilities, and taking advantage of other people for their own benefit.

Then there are other situations that are just everyday life. A lot of times, someone is using something as intended and doesn't even realise that something could go wrong, and some small mistake happens - maybe because the usability of the product was bad, or something just wasn't accounted for, or because different people are different, right? There are lots of use cases, and we don't always account for all of them. And then finally, life happens. Not everything follows a neat path, and there are a lot of eventualities we have to account for in our products. I'm going to deep-dive a little bit on each of these.

For the first one, abuse, it's when bad actors show up. For this one, I'm going to talk about Zoom a good bit in this presentation, because I think it is something that all of us have probably spent a lot of time in. And if someone wanted to wreak havoc on Zoom earlier in the pandemic, it was not hard. The default setting was such that any meeting participant could share their screen without any host permission, anyone with a public link could join a meeting, and there were a lot of instances of Zoom-bombing.

I don't know if you experienced this, but at the very beginning of the pandemic, once stay-at-home orders were issued, I was bummed about it, but I realised I could now attend all of these conferences and talks - all this content from around the world that had gone virtual, that I previously wouldn't have had access to. So I was attending all of these things, and one of them, a meet-up talk, was my first experience with Zoom-bombing. It was very confusing, because a couple of minutes in, a couple of teenagers took control of our Zoom meeting, and the event I was super excited for not only got cancelled - they just shut it down, they couldn't figure out how to turn it off - but was incredibly uncomfortable, because these kids were saying slurs, and there was some inappropriate content shared.

It was just a mess. And in some ways that is still minor: my experience was a little bit bothersome, my event got cancelled, it was jarring. But there are situations where Zoom-bombing resulted in a lot of racist and misogynist comments and anti-Semitism. There were all sorts of things that people were subjected to against their will and unexpectedly.

So this is an example of where it was just a bad default setting. Now that they've changed the setting so that not just anyone can share their screen without permission, those kinds of instances are a lot less likely to happen.
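To make the idea of safe defaults concrete, here is a minimal sketch - my illustration, not Zoom's actual configuration model, and the setting names are hypothetical - of how risky capabilities can default to off so that enabling them is a deliberate host choice:

```typescript
// Hypothetical meeting settings illustrating "safe by default".
// The harmful original behaviour is what you get when risky flags default to on.
interface MeetingSettings {
  participantsCanShareScreen: boolean; // risky: default off, host grants it explicitly
  joinableViaPublicLink: boolean;      // risky: default off, require an invite
  waitingRoomEnabled: boolean;         // protective: default on
}

const safeDefaults: MeetingSettings = {
  participantsCanShareScreen: false,
  joinableViaPublicLink: false,
  waitingRoomEnabled: true,
};

// Convenience is still available, but only as a deliberate opt-in.
function createMeeting(overrides: Partial<MeetingSettings> = {}): MeetingSettings {
  return { ...safeDefaults, ...overrides };
}

const lecture = createMeeting({ participantsCanShareScreen: true }); // explicit choice
console.log(lecture);
```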

Even though I'm talking about slowing us down and not moving so fast, sometimes it's a matter of taking a minute or an hour to think about: what are some ways this could go a little differently? Another example: some years ago, Facebook Messenger had a feature where, if your GPS was turned on and you were chatting with someone, that person could tap a message you sent and see your exact location.

I used to use Facebook Messenger a ton back in 2015 and I didn't know that. That kind of scares me looking back. I don't know who I shared my location with. It was not something that you enabled on purpose. It was something I like to call an aggressive default. As you can see in this tweet, it is kind of scary, because it can lead to a lot of abuse.

Here's an example of actual abuse that occurred because of this oversight, which, again, I want to make clear, was a feature probably intended for convenience, not for anything bad. You can probably have an easier time finding where each other are if you're talking to your best friend, but it could pave the way for stalking and invasion of privacy.

We are often blinded, I think, by our good intentions. In this example, this person talks about how her ex showed up and started throwing beer bottles at her car, making the stalking worse. There was some damage there - property damage for sure, probably psychological damage - but it also had implications for physical harm down the road had the stalking continued. It doesn't take a lot for a bad actor to take advantage of a situation that doesn't have some failsafes in place.

Now, the next category is just everyday use. Bad actors aren't the only ones that cause harm - there are so many use cases. Lots of times, the harm occurs in everyday situations, just because people use things differently, people are different, and, if that is not accounted for, we are going to have bad consequences.

So last year, there was a report of a glitch in Zoom that caused parts of people's screens to show even when they were not sharing. I'm sharing a screen with you, and thanks to this glitch, it could have been possible for you to see my texts, if I had them open, or something like that.

You might have been able to see all of that, which might seem minor - maybe you're in a work meeting and people could accidentally see the guilty-pleasure show you're watching on Netflix - or it could have been something serious: health records open, or a really heavy conversation with a family member. Ideally, you could say that if you want to prevent security issues, just keep personal things on your personal computer and professional things on your work computer. But nowadays, especially as the last few talks have shed a light on, we are spending so much of our time remote that the lines between work and personal life are blurring.

I think instead of expecting that humans are going to behave in this very predictable way, it's really important to take into account that life happens - things are going to happen, and people are going to use things differently. In this case, it was a glitch.

But it is also common - I don't know how often you've seen it; I know in the last two years of using lots and lots of Zoom, I've seen it a ton - where people have accidentally screen-shared something they didn't intend to. Part of it is because it's not super obvious how or whether you should share one window or just a tab, or silence notifications. Those things aren't laid out clearly, and that is a design problem. But I picked this example especially because it seems super minor: someone accidentally screen-shared their personal calendar during an interview.

If you think about what consequences that could have, it suddenly becomes clear that it may not be that minor. There could have been something on the calendar that was seemingly inappropriate, that turned off the interviewer and cost this person the job. It could have been a situation where all of the other interviews they had scheduled were shown on that calendar, and that could have been problematic. Or maybe they had listed where they were going to be on Saturday night, and if there was someone in the audience listening to the interview who wanted that information, they suddenly had it. Just little things that could happen.

There was a New York Times article - I loved the title, which was "How to not ruin your life or just die of embarrassment with a screen share" - because it's such a common experience, but those failsafes haven't really been built in. We're not really warned that something else could happen.

One example from something I observed recently: someone at work accidentally ended up sharing that she was pregnant. She got a text from her husband that popped up on the screen, which, again, arguably she could have turned notifications off, but we need to make sure that our software is proactively accounting for situations like that, warning people and preventing those errors in the first place.

She accidentally announced her pregnancy to everyone, and it was not something she wanted to share. It can be really tricky. This is another example of how, if we're not accounting for differences between people and different use cases, things can go really badly.

And this example just breaks my heart. It's a sad example of algorithmic bias: this person's daughter was trying to take an exam using proctoring software she was required to use. Because of her dark skin, the software could not recognise her, and so she ended up having to literally put a flashlight above her head just to take the exam.

Not only is that inconvenient, it is outright dehumanising and humiliating to have to do that. It takes a toll on your self-esteem, and it likely caused her to get a worse grade because she wasted time on that, which could lead to fewer opportunities - a whole plethora of problems. And it really brings to light the question: are our data sets representative and diverse enough? It is not something we can shrug off, going with the most common use case and expecting everything to be okay. We need to account for when people are different, and when people use what we build in different ways.
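As one small illustration of how a team might start answering that question - this is my sketch, not anything from the talk, and the field name is made up - you can audit a sample set for under-represented groups before a model ships:

```typescript
// Sketch: flag groups that fall below a minimum share of a data set.
interface Sample {
  skinToneGroup: string; // hypothetical field, e.g. a Fitzpatrick-scale bucket
}

function underrepresentedGroups(samples: Sample[], minShare: number): string[] {
  const counts = new Map<string, number>();
  for (const s of samples) {
    counts.set(s.skinToneGroup, (counts.get(s.skinToneGroup) ?? 0) + 1);
  }
  const flagged: string[] = [];
  for (const [group, count] of counts) {
    if (count / samples.length < minShare) flagged.push(group);
  }
  return flagged; // groups that need more data before the model is trusted
}
```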

Finally, the last category is just that life happens, right? As the saying goes, life is what happens when you're busy making other plans. Things don't always follow the happy path. We all have bad days. Sometimes we have bad years, and so we really need to make sure that we are proactively building for those kinds of things.

An example is the feature - or features, I guess - in the vein of looking back at past memories. On Facebook, Google, even Apple, there are a lot of different ways we can look back at our previous photos and memories, and generally speaking, these features are wonderful, right? They can bring a lot of value and a lot of joy. Sometimes you see something and think, I haven't talked to this friend in three years, and it's an excuse to reach out to them.

Sometimes, you're reminded of a reunion or a birthday, and it puts a smile on your face. The intentions, again, are obviously good. However, the reality isn't always. You don't have to do anything wrong to sometimes end up on the wrong side of this. I will give you a second to read these.

The gist is that these Google Photos and Facebook features are, in some ways, reminding some people of really painful memories. One person talks about how they just lost their little sister, and Facebook is reminding them of when they friended her. Google Photos says, here is this memory from two years ago - when that could have been a break-up, or the loss of a loved one.

Another reminder: intention is not equal to impact, and we need to think not just about the good a feature we are really excited about could bring, but also about what pain it could cause. How could we prevent embarrassment or danger? How could we prevent heartache? The good news is that some actions have since been taken - ideally they would have been taken before - like Facebook now memorialising an account when a friend passes away, so that you don't get reminders associated with them. But a number of people had to go through that emotional pain first. If it had occurred to someone to ask, what if something really bad happened to this person - how can we make sure they can opt out, or are asked before we push this photo on them, so they're in the right head space for it? - there is a lot of harm we could have prevented.

Here is another example, very similar, where Apple automatically creates albums, and most of the time it's really helpful. I know for me it's been valuable in the past. It's a complex problem without an easy, clear-cut solution, but in this situation, Apple was classifying photos associated with a painful memory for someone as celebratory. There is sometimes an inclination to make assumptions - and we have to do that in our line of work - but sometimes we take those assumptions too far and don't account for all the different situations in life that someone might be going through.

And here's a quote that I'm going to read to you, because I really loved it, and it is from one of the co-authors of Design For Real Life. He actually started writing about this after losing his daughter, when he was bombarded with all sorts of memories that were incredibly painful. It says: "This inadvertent algorithmic cruelty is ... years showing them selfies at a party, or whale spouts from sailing boats, or the marina outside their vacation house. For those of us who lived through the death of loved ones, or spent extensive time in the hospital, or were hit with divorce, losing a job, or any one of a hundred crises, we might not want another look at this past year." Oftentimes, when we're in the mode of creating, we don't really remember that, and so it is really important to account for those differences and just how life goes.

So finally - we've heard all these terrible examples of ways that people have been hurt, but how can we use our privilege for good? What can we do to make a difference? We obviously can't prevent everything from going wrong; some of these are very nuanced. But how can we take some baby steps? A couple of principles to keep in mind. One: always think worst-case scenario.

Try to think about: what if everything doesn't go right? What if it goes terribly wrong? Is there a situation in which this exact feature, or this exact message, or whatever it is I'm building, could take someone's day completely off the rails? And then also, assess the stakes. There are some industries, or some spaces, in which the stakes are far higher than others.

For instance, any time you're talking about, let's say, finances, or someone's health, or even travelling - getting from one place to another - or the government, these can all have ripple effects that result in negative consequences that are not so minor. If you can't post a picture on Instagram for a day, maybe you'll be okay; but if you need to apply for a visa to go abroad so that you can attend a loved one's funeral or wedding, that has higher stakes.

If one thing doesn't work for someone, it can have a bunch of ripple effects. It's really important to think about that. Finally, consider other people's experiences. We are the experts on our own lived experiences, and we are the experts on how to do our jobs, but ultimately, what we are building is usually for people outside of ourselves. So it is so, so key that we don't just think about how we would use something; we think beyond our good intentions about the things that could go wrong. And then there are activities we can do that help anchor us and give us a little structure as we try to think through the different ways something could go wrong.

I don't know how many of you have watched Black Mirror, but there is an episode called Hated in the Nation, and it is a brilliant exploration of data security and robotics, as well as hate speech online. Basically, in this episode there is a government agency that made robotic bees. Right there, that is a good intention: they're trying to replace the dying bee population. However, the bees are hacked by a bad actor, who uses them to kill people who are criticised online. The fact that a bad actor like a hacker could get in probably wasn't something that was ever accounted for, but all of a sudden, the solution that was supposed to help has become this terrifying threat.

So the idea behind Black Mirror brainstorming is that you can do a cross-functional workshop with your team - have people from different functions there - and it can be a little bit fun. Try to create a Black Mirror episode; you don't have to write everything out. If the plot of a Black Mirror episode revolved around your team's product, what are the ways in which it could be misused, or could accidentally result in something really bad?

What would the villain in this episode do, right? A lighter version of that is the pre-mortem. Where a retro or a post-mortem looks back and asks what went well or wrong, here you're pre-emptively trying to figure out the places things could go wrong. What should we do differently? How should we change course? I'm sure many of you are familiar with these already.

Usability and abusability testing can also be so, so important. The screen-share example I shared is one where there is not necessarily a mistake, but people aren't warned enough. If, before someone shared their entire screen, there were a warning that said everything that pops up on the screen will be visible to your audience, that might give people pause.
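As a rough sketch of that kind of guard rail - entirely my illustration, with hypothetical function names rather than any real screen-sharing API - the idea is simply to interrupt the highest-risk action with an informed confirmation:

```typescript
type ShareScope = "entire-screen" | "window" | "tab";

// Interrupt only the riskiest scope with an explicit, informed warning.
async function confirmShare(
  scope: ShareScope,
  confirm: (message: string) => Promise<boolean>
): Promise<boolean> {
  if (scope === "entire-screen") {
    return confirm(
      "You are about to share your ENTIRE screen. Notifications, messages, " +
      "and other windows will be visible to everyone. Consider sharing a " +
      "single window or tab, or silencing notifications first."
    );
  }
  return true; // narrower scopes are lower risk, so no interruption
}
```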

A warning like that could have been revealed in usability testing, especially if the results were taken seriously and actually included in the plan going forward. But another thing you can do is abusability testing.

Basically, the idea here is that instead of just thinking about well-meaning people - personas who are maybe different from each other but are just trying to complete their tasks as intended - try to think about bad actors.

For every persona you have, or every use case you think about, try to include one where somebody is intending to do harm, and try to think it through: okay, what might this person try to do? Is there a way we could get ahead of it, or prevent it in the first place?
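Here is a hedged sketch of what an abusability test might look like next to ordinary usability tests - the meeting model is hypothetical; the point is the shape of the test, which asserts that a bad actor's path is blocked:

```typescript
import { strict as assert } from "node:assert";

// Hypothetical meeting model with host-gated screen sharing.
interface Meeting {
  hostId: string;
  screenSharePermission: "host-only" | "everyone";
  canShareScreen(userId: string): boolean;
}

function makeMeeting(hostId: string): Meeting {
  return {
    hostId,
    screenSharePermission: "host-only",
    canShareScreen(userId: string) {
      return this.screenSharePermission === "everyone" || userId === this.hostId;
    },
  };
}

// Abuse persona: an uninvited troll who joined through a leaked public link.
const meeting = makeMeeting("host-123");
assert.equal(meeting.canShareScreen("troll-999"), false); // hijack attempt blocked
assert.equal(meeting.canShareScreen("host-123"), true);   // the host still can
```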

Also, conducting research is a great way to learn from other people's experiences. When you do that kind of research or testing, it can be really valuable to ask the people you're talking to about a time they had a bad experience - not just from a technical standpoint, but a bad emotional experience. Was there some time they used something somebody built and it caused some sort of distress, or had consequences that maybe hadn't been considered? It can also be a really good way to think about what could happen next.

And then, more diverse recruiting and testing. When I say "recruiting", I'm talking first about your team - I think this topic is very, very familiar to all of you - because if you have a more diverse team, you're more likely to mitigate each other's blind spots and realise that, hey, something here could go wrong based on an experience I have had in the past. You're more likely to be mindful of accessibility. So recruiting for your team, making sure your teams are diverse, is super important.

Then, at the next step, when recruiting participants, make sure your data sets are representative and diverse, and make sure that when you're doing testing, it's with diverse participants - not with people who all look, act, and think exactly the same. That can also help you triangulate different perspectives.

So now, to wrap up and give you some really practical questions you can ask yourself as you're going through a project, even if you don't necessarily want to rethink the entire project: start by thinking about who is going to use this. Who is going to use it, and how are we choosing that? And on the flip side, who can't use this, and why is that? Are we intentionally excluding this group? It can be a good way to figure out whether you're just forgetting certain people. And then after that, ask: what if it doesn't work for them?

So it is really important not just to think, okay, this is what we are building because we know these people want to complete this task, but also: what if they're not able to? What is the worst-case scenario? What could happen if they can't complete this task? And then finally, what are the consequences of that? Is this a siloed experience, or would there be follow-up issues? Can the pain compound?

For example, if someone was trying to access something related to their health and wasn't able to get through a system. One of the examples that came up earlier in the pandemic, for me at least, was a form with a minimum length for last names: a last name had to be at least three letters, so anybody with a last name of two or fewer letters couldn't go through and submit the form. It was for a Covid test.

You can imagine the problems that creates: this person thought they had Covid, wanted to sign up, and wasn't able to, because the form wasn't allowing them through - because nobody had accounted for people with two-letter last names. There are over a million people around the world with last names that are two letters long. That could then lead to follow-on effects, right? That person might go to a party, and other people might get Covid. It could prevent them from getting care when they need it. It could prevent them from being able to go to work. You can see what I'm saying.
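A minimal sketch of the underlying fix - my reconstruction of the kind of rule involved, not the actual form's code - is to validate that a name is present rather than imposing an arbitrary minimum length:

```typescript
// Don't reject real names with an arbitrary minimum length.
// Last names like "Ng", "Wu", or "O" are valid; only reject empty input.
function isValidLastName(name: string): boolean {
  return name.trim().length >= 1; // presence check, not a length guess
}

// The original rule, name.length >= 3, would lock these people out of booking a test.
console.log(isValidLastName("Ng")); // true
console.log(isValidLastName(""));   // false
```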

There are just these compounding effects that can happen. So when we prioritise speed over care, we pay for it not just as individuals but as a greater society, and that is a really high price to pay. Try to keep these questions in mind. And then there is the inevitable question of the times when we can't slow down.

There are times when we really don't have much of a choice and we have to move fast, whether it is a mandate from a boss, or a situation where not moving fast would actually be more problematic than putting something out there. In those situations, make sure that you're owning it.

Own that it is not ideal and acknowledge the consequences; make sure that you and your team are aware of what those consequences are, what the potential fall-out is, what the stakes are. And then manage people's expectations. Communicate to them what they can and cannot expect. Don't waste their time.

An example: at this conference we're really lucky to have live captioning, but if that weren't available, it would be really important to make that known, so that someone who was planning to attend and really needed the captions wouldn't clear their schedule only to find out they weren't going to be accommodated. Communicating clearly with people in advance about what is not there is really valuable.

Finally, revisit your solution - we call it an MVP for a reason. Let's not make an MVP the final result. Iterate when you do have more time; it can help to put something on the calendar and say, all right, we are going to come back to this when we're not in fire-drill, time-crunch mode. As the Hall of Fame basketball coach John Wooden said, if you don't have time to do it right, when will you have time to do it over? Rushing through something might save you time now, but ultimately it will take you more time in the future, especially if there are problems, and everybody will be in a new fire-drill mode.

Sometimes saving time is great, but not when it comes at a cost. There is a saying my fifth-grade teacher liked to repeat all the time, because he didn't like to correct our work, and it stuck with me over the years: be lazy, do it right the first time. Let's not wait until something has gone off the rails and caused harm.

Let's try to be the voice in the room that brings these things up proactively. It will help us sleep at night, feel better, help our customers in the long run, and it will just pave the way for a better society.

Thank you so much for the gift of your attention, and your time, and please feel free to reach out to me. I'm on Twitter, way more than I should be, and you can also find me on LinkedIn if you prefer.