Toxic Tech: An Interview with Sara Wachter-Boettcher

by Kendra R. Parker

On Tuesday, April 2, Sara Wachter-Boettcher, principal of Rare Union, co-host of Strong Feelings podcast, and author of Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech (2017), will speak to the Hope College and Holland communities on “Inclusive Design, Ethical Tech, and All of Us.” In today’s interview, Sara discusses her upcoming visit and her interest in Women’s and Gender Studies. Sara also reminds us that “…filling computer science classes with women will not fix the problems in tech…if it’s toxic to the women, how does that help?”

What are you most excited to share with students, faculty, staff, and community members who will attend your lecture on Tuesday, April 2?

I’m excited to really connect the dots between a bunch of things attendees have probably seen in the news about tech—from Facebook’s data breaches and lack of privacy to image recognition systems that don’t work for black and brown people—and talk about how those problems have manifested.

So often, news stories about tech are sensationalized, either positively or negatively. It’s either, “Ooooh! Shiny!” or “Tech is evil!” What we need is a deeper conversation about the cultural norms and financial incentives that have led the tech industry to build so many products laced with sexism, racism, bias, and other types of harm—because only when we can have this conversation in a nuanced and meaningful way can we begin to figure out how to fix it.

Did you major/minor in Women’s and Gender Studies (WGS), and if so, how did your WGS major/minor/certificate shape you? If not, how did you come to WGS as an academic discipline or research interest?

I took a bunch of classes and almost added a Women’s and Gender Studies minor to my program, but I didn’t because I was already knee-deep in two majors!

The experience that most shaped my feminist perspective and values, though, was working in a countywide sexual assault support center for three years during college. I answered crisis lines and did admin sometimes, but mostly, I worked with the education program, facilitating workshops with middle schoolers. We talked to the kids about recognizing abuse, how abusers will try to convince you it’s your fault, and where to turn for help. We also taught segments on power and control, consent, and assertiveness.

That experience taught me so many things, but one of the less obvious ones is how much it matters who and what shapes our norms. By middle school, these kids had already learned to accept a lot of things as normal that they shouldn’t have to—harassment, abuse, lack of bodily autonomy. Not to mention deeply ingrained, restrictive gender roles.

And so, I think a lot about how different aspects of our culture influence norms about sex, bodies, relationships, and gender. When I started seeing the lack of diversity and compassion within the tech industry, combined with the increasing power that industry had, I realized we needed to do a better job of talking about its role in reinforcing biases and narrow norms for the world.

Your response is great for all of our readers—you don’t have to major/minor in Women’s and Gender Studies to have a passion for justice. Thank you for that necessary reminder. Now, if this does not give too much away before your lecture, what inspired you to research and write on technology and sexist apps?

It started in 2015 when I was filling out a form online for a new doctor’s office. Halfway through, out of the blue, it asked me: “Have you ever been sexually abused or assaulted?” And it stopped me in my tracks. Because there was no information about why they wanted this info, how it would be used, or where it would be stored. There were just these two checkboxes: yes, or no?

Having a new doctor, with no context, ask me this in a form I was filling out online broke me open a little bit. And so I started looking at how data we collect online can be problematic—it can be collected non-consensually, it can be collected in biased ways, it can be collected to surveil you, it can be collected to hurt you. Once I was thinking about that problem, I started noticing a million other related problems—and how much worse those problems were getting the more we were relying on artificial intelligence and algorithms in software to make decisions about who you are, what you want, or what you deserve.

Amazing. I always like to say that “research is me-search,” so your experience encapsulates this perfectly. Now, your talk will be on the dangers of toxic tech, and you’ll touch on a range of biases, but I am wondering if you will share your thoughts on the need for women in tech—as undergraduate majors, developers, and the like? What advice would you give to current women students who are not considering Computer Science or STEM courses as a major or minor (or even a course in their undergraduate studies) about their value and the necessity of women in the discipline?

I think women are absolutely needed to make technology products that work for a wider range of people, and I definitely think that computer science and STEM courses need to be much more open to women. However, I don’t think you have to study those things to be a crucial addition to a tech project (I didn’t).

One of the problems we are seeing in tech today is that the industry has prized technical skills above all for a long time: if you can code, you’re deemed a genius…even if what you code is an app that increases surveillance and incarceration of black people. Meanwhile, if you understand, say, the historical context of race in this country, or you know about the emotional and psychological effects of living under surveillance, you’re deemed unnecessary to building tech. This is a myth that needs to change if we ever want to have a more ethical tech industry—one that isn’t built on business models that exploit and harm. I think that is very slowly starting to change, precisely because the tech industry as it’s stood has started to show cracks; I know there’s now more emphasis in a lot of tech companies on hiring people with backgrounds in social sciences, humanities, and communication. But it’s a big industry-wide bias to overcome.

I say all of this because I think it’s important to note that just filling computer science classes with women will not fix the problems in tech—particularly because there’s a huge issue of women leaving the industry because it’s so unwelcoming. And I don’t think it’s useful to tell women they “should” study computer science, just because it would be good for tech—if it’s toxic to the women, how does that help?

However, I would say, if you are interested in technology, even a little bit, I would absolutely give it a try. There are so many myths about programming—that it’s really hard, that you have to be great at math, that you should have started when you were 12 if you want to be good at it. None of those things are true, and you absolutely belong in those classes if you’re interested at all. So many people gain technical skills at so many points in their lives, from every background you can imagine.

That’s very insightful. Thank you for sharing. I have one last question. What’s on your bookshelf these days—the one book you recommend that we read—and why? 

Recommending one book is an impossible task! But one that sticks out at this current moment is definitely Thick, a new book of essays by Tressie McMillan Cottom. She does an amazing job bringing together a rigorous background in sociology with incredibly accessible, moving, personal writing, which is rare! But she also gets at the heart of so much happening in this current moment around race, gender, privilege, and political power. Once you read her work, you’ll want more of her in your life, I promise.


We really appreciate Sara for taking the time to share with us! Intrigued? Want to learn more? Join us next week at 4PM in Winants Auditorium (located in Graves Hall). Sara’s event is free and open to the public.
