Why algorithms aren’t working for women

Image: UN Women

Algorithms are everywhere. They’re the unnoticed mechanisms that determine everything from the temperature in your office, to the content of your Facebook feed, to the information Google shows you in search results. But there’s one big problem: these algorithms are not actually passive or value-neutral. I spoke with feminist and software engineer Liz Rush about how algorithms, although invisible, function to reinforce and promote sexism and racism in a society increasingly influenced and informed by the Internet.

Susan Cox: So what exactly is an algorithm?

Liz Rush: Simply put, an algorithm is a set of rules or steps that are taken in order to calculate or problem solve with computers. It’s like a recipe. You have different steps that you take in a certain order and when you’re done, you have the dish as the end result.

SC: What effect are algorithms having on modern society?

LR: Algorithms are everywhere, but they’re almost invisible to us. They’re in your Facebook feed, what you see on Twitter, all sorts of things on the Internet — even search engines. This content you interact with on a daily basis is all informed by powerful machine-learning algorithms. What you’re seeing is filtered through algorithms that have been customized and tweaked based on what a company or program thinks you would like to see.

This is a concept we call “the filter bubble,” which refers to the way algorithms set the bounds of what we see on the Internet. When you enter a search term into Google, it doesn’t just give you the results that most closely match that term. It takes into account your search history, your demographics, your online shopping habits, and even what people in the same city as you are searching for. So if you and I search for the same thing, we are likely to have similar results because we’re both interested in the same topics, we’re both white women, we’re both millennials, et cetera.
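
Here is a toy sketch of the filter bubble, with invented pages, profile tags, and weights (it illustrates the idea, not Google’s actual ranking):

```python
# Toy personalized ranking: the final score blends query relevance with
# how well a page matches the user's inferred profile.
pages = {
    "budget-travel-guide": {"relevance": 0.8, "tags": {"millennial", "travel"}},
    "luxury-cruise-deals": {"relevance": 0.8, "tags": {"retiree", "travel"}},
}

def rank(pages, user_tags):
    def score(name):
        page = pages[name]
        overlap = len(page["tags"] & user_tags)   # profile match
        return page["relevance"] + 0.3 * overlap  # personalization boost
    return sorted(pages, key=score, reverse=True)

# Two users, same query, different first result: the bubble in miniature.
print(rank(pages, {"millennial", "urban"}))   # budget guide ranks first
print(rank(pages, {"retiree", "suburban"}))   # cruise deals rank first
```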

SC: So the results aren’t just based on finding the most relevant and accurate content for the search term?

LR: They have an algorithm that determines the websites that pop up in search results, based on the content of those websites and a variety of other factors that determine whether or not they think it matches your input. But that data is becoming more and more entwined with your personal data and the data of other people.

For example, Google’s auto-complete feature is a machine-learning algorithm at work: it’s trying to guess what you’re searching for, based on your history as well as on users similar to you. And once you complete your search, it learns whether or not its guess was correct.

But of course you can see how an algorithm like that could learn bias, and you can see it when you search for: “Why are women so…” Oftentimes the picture that pops up of women, or of people of colour, or of any minority, is not a very flattering one.
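
A minimal sketch of such an auto-complete loop: it suggests whatever completions users accepted most often, so it inherits whatever biases are in that behaviour. The data here is invented.

```python
from collections import Counter, defaultdict

# Toy auto-complete: suggest the most common past completions for a
# prefix, and "learn" by counting whichever completion a user accepts.
completions = defaultdict(Counter)

def record(prefix, completion):
    completions[prefix][completion] += 1  # feedback: this guess was accepted

def suggest(prefix, n=3):
    return [c for c, _ in completions[prefix].most_common(n)]

# The suggestions are only as good, or as biased, as the behaviour
# of the users the algorithm has already seen:
record("why are", "flamingos pink")
record("why are", "flamingos pink")
record("why are", "cats afraid of cucumbers")
print(suggest("why are"))  # ['flamingos pink', 'cats afraid of cucumbers']
```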

SC: So sexism and racism can become embedded in the algorithm itself?

LR: The topic of whether an algorithm is sexist or racist brings up a lot of inflamed opinions, because the truth of the matter is that writing an algorithm is like writing a recipe. In and of itself, it isn’t necessarily racist or sexist or classist. But when the algorithm learns from biased feedback, or uses starting data that’s biased, that’s when racism or sexism become the result of the design.

Let’s say, for example, you’re searching for the name of a historically black college in the United States. It’s been shown that searching that on Google is more likely to bring up arrest records. That’s not something that’s been programmed; it’s racism that the algorithm has learned over time.

SC: Wow.

LR: Or if you search for jobs online as a woman, you are less likely to see high paying jobs than if you are a man. That pattern has been proven over and over again.
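
That pattern is easy to reproduce in miniature. In the sketch below, the code itself is neutral (show each group whichever ad that group clicked most in the past), but a biased historical log makes the output biased. The log is invented.

```python
from collections import Counter

# Bias in, bias out: a neutral "recipe" applied to discriminatory data.
historical_clicks = [
    ("man", "executive-role"), ("man", "executive-role"),
    ("man", "executive-role"), ("woman", "admin-role"),
    ("woman", "admin-role"),   ("woman", "admin-role"),
]

def ad_for(group, log):
    counts = Counter(ad for g, ad in log if g == group)
    return counts.most_common(1)[0][0]  # most-clicked ad for this group

print(ad_for("man", historical_clicks))    # executive-role
print(ad_for("woman", historical_clicks))  # admin-role
```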

So that’s the central issue of algorithms: Regardless of the intent of the person who wrote the algorithm, if the algorithm’s design allows for self-reinforcing bias to continue — to propagate — then the results will become the intent, if they’re not corrected.
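
The self-reinforcing part can be simulated in a few lines: items are ranked by past clicks, and the top-ranked item attracts most of the next clicks, so a tiny early lead snowballs. The numbers are arbitrary.

```python
import random

# Rich-get-richer feedback loop: whatever ranks first gets most clicks,
# and clicks are what determine the ranking.
random.seed(1)
clicks = {"item_a": 11, "item_b": 10}  # item_a starts barely ahead

for _ in range(1000):
    top = max(clicks, key=clicks.get)                 # current #1 result
    other = "item_b" if top == "item_a" else "item_a"
    # 70% of simulated users click whatever is shown first:
    clicks[top if random.random() < 0.7 else other] += 1

print(clicks)  # item_a's one-click head start has snowballed
```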

SC: We often view technology as something that is value-neutral — just cold, hard calculation. We don’t think of Google as having content because it’s supposedly just a reflection of the online world. But you’re saying that Google actually takes an active and significant role in shaping what we see online?

LR: Absolutely. People do say that this technology is value-neutral or passive, but that would be like saying that a virus is value-neutral or passive. It might not have the personal judgment to affect an individual on an individual level, but it’s not value-neutral, and it functions as a collective system that can be directed one way or another.

You hear the word “algorithm” and you think of math, you think of science, you think of computers. And we believe computers are neutral because they’re just zeros and ones, when it comes down to it. But the reality is that people design these systems and they are making ethical choices about how these systems should function.

SC: Are there other examples of these biases?

LR: One example I like to use for illustrating how people’s beliefs impact the design of algorithms is from before computers took over our lives.

Back in the day, somebody had to design a system — an algorithm — for determining the optimal office temperature for air conditioning and heating systems. So they took into consideration a variety of variables: the temperature people feel comfortable at, the difference between what the temperature is and how it feels on your skin, and, of course, the body’s resting metabolic rate. And that all sounds like valid science and valid variables, right? But the reality is that the body’s resting metabolic rate varies from person to person, and the designers used the resting metabolic rate of a middle-aged white man.

It’s kind of funny, because you think of your air-conditioning unit as completely value-neutral. But in actuality, office temperature optimization was designed around a bias that men’s comfort in the office should be used as the baseline standard.
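
A deliberately simplified stand-in for such a comfort model (the real standards are based on Fanger’s far more involved thermal-comfort equations): the assumed resting metabolic rate directly shifts the “optimal” setpoint, so calibrating to one body calibrates the whole office. The coefficients are illustrative only.

```python
# Toy comfort model: more body heat (a higher resting metabolic rate,
# in "met" units) means a lower comfortable room temperature.
# These coefficients are invented for illustration.
def comfort_setpoint_celsius(resting_metabolic_rate_met):
    return 21.0 + 4.0 * (1.2 - resting_metabolic_rate_met) / 0.2

print(comfort_setpoint_celsius(1.2))  # 21.0 C, the assumed "average man"
print(comfort_setpoint_celsius(1.0))  # 25.0 C for a lower metabolic rate
```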

SC: So that’s why I’m always so cold in offices! I had no idea.

LR: Yes. That’s how deep and hidden some of these biases in algorithms can be.

Another example is the algorithms in our medical devices. There’s an algorithm in pacemakers to help determine your heart rate. But pacemakers were originally designed for men only. They were too large to fit in women’s chests in a lot of cases. So if the algorithm that determines when your heart should beat is not developed for your body, you’re going to be affected by that algorithm.
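
A heavily hedged toy of that calibration problem — not real device behaviour: if a pacing threshold is derived from one population’s trial data, it can misjudge a patient whose healthy baseline differs. Every number here is invented.

```python
# Hypothetical pacing rule calibrated on a male-only trial population.
CALIBRATED_RESTING_RATE = 70   # bpm, invented calibration value

def should_pace(sensed_rate_bpm, tolerance=0.9):
    # Pace only when the sensed rate falls well below the calibrated baseline.
    return sensed_rate_bpm < CALIBRATED_RESTING_RATE * tolerance

# A patient whose healthy resting rate is naturally higher (say 78 bpm)
# may already be abnormally slow at 64 bpm, yet the rule does nothing:
print(should_pace(64))  # False: the calibrated threshold is 63 bpm
```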

SC: The air-conditioning example seems like poor office planning, and the potential bias in medical devices seems downright dangerous. But are there companies where the sexism and racism in their algorithms isn’t just a mistake, and they’re actually profiting from it?

LR: I think that’s one of the primary issues here: that we have an inherent conflict of interest where companies want to use algorithms in order to manipulate their users (for example to get people to stay on their website longer, purchase more products, et cetera). There’s an inherent conflict between that and the transparency needed to have ethical oversight.

SC: When I do a Google image search for anything related to women, why are the results so pornified, especially considering that Google is taking my search history and interests into account? I never search for porn videos or anything of that nature, but when I put the word “woman” into Google image search, the top terms Google suggests I add sound like porn genres.

LR: Yep. It’s pretty distressing. When you search for “woman,” it’s not only based on your data. They know that the vast majority of users who are searching for the term “woman” will likely click on pornography.

SC: So many of the image search results for anything related to females are basically soft-core porn. And the “safe search” option seems to have no effect on this. This is even the case for female children when you search for the term “girl.” 

LR: It’s horrendous. For adults, we can rationalize the situation and say, “Okay I know that porn is a huge driver of the Internet and all of these technologies.” But if you’re a 12-year-old girl getting on the Internet for the first time, trying to find a Nancy Drew video game, the chances of you looking at porn accidentally are extremely high because the algorithms have this built-in process to try to optimize users clicking on results, as well as ads. And pornography is a huge driver of that business.

SC: What about when companies directly profit from algorithms that promote sexism and racism? For example, recently a group of males gang-raped a 15-year-old girl and streamed it on Facebook Live. As is the case with all Facebook content, there were ads on the site, next to the video, meaning that Facebook was profiting from the rape and from the promotion of the video, which happened through their algorithms.

LR: It’s horrifying. The company’s defence would be that they don’t control the content, and that’s true. Nobody went through and said “yeah, this video should be shared.”

But, in fact, it was a decision made by an algorithm to have the video show up in other users’ feeds; and the more people click on it, the more it will show up in other people’s feeds, so it can have that viral effect. And everyone looks at it and sees the ads on the side, so that benefits the company.

There’s an obsession with optimization in algorithms, meaning that they’re intended to function perfectly for each user. A lot of algorithms are so highly optimized that there is no way to back out of that feedback loop. So with the example of the live-streamed rape, if one woman sees it and reports it to Facebook while 30 other men are watching it at the same time and sharing links to it, the algorithm is going to value the activity of the 30 men. When the company’s goal is to keep you engaged with the content, there is an inherent conflict of interest between that goal and making sure the content is appropriate.
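
A toy version of that conflict of interest: an engagement-weighted feed score in which thirty viewers easily outweigh one report. The weights are invented.

```python
# Engagement-optimized scoring: weights tuned for time-on-site let raw
# engagement swamp user reports. All weights are invented.
def feed_score(views, shares, reports):
    return 1.0 * views + 5.0 * shares - 10.0 * reports

print(feed_score(views=30, shares=10, reports=1))  # 70.0, still promoted
print(feed_score(views=2, shares=0, reports=1))    # -8.0, buried
```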

SC: If someone saved the video of that rape broadcast on Facebook Live and uploaded it somewhere else, would it then be in Google’s search results, meaning that Google would be profiting from it, too?

LR: Yes, it would be on the Internet basically forever. And a lot of what happens is that images and videos are automatically mirrored on servers around the world. It’s not like it’s on one website and you can just say, “please take this down,” and it’s gone.

SC: It’s interesting that, while Google is not a porn company, pornography is really embedded in what they do. Pornography makes up 30 per cent of the entire Internet, and because of this optimization process, algorithms reinforce the sexism and racism that exist in society by amplifying the trends of what people click on, which is often pornography or pornified images.

LR: This conversation gets further complicated because many organizations that fight for privacy rights on the Internet — for fair access to the Internet, or against regulations that benefit providers over users — are funded by porn companies.

Porn Hub and YouPorn, for example, are part of an ad campaign that claims they are fighting against the selling of personal data online — the implication being that no one should know what pornography you search for or watch. It’s a strategy to make them appear to be on the ethical side of technology and to align themselves with the progressive left and free speech. But they also do this to ensure we are less likely to discuss important issues regarding the ethics of pornography on the Internet.

SC: But then what about the girls like the 15-year-old in the Facebook Live rape? What about her privacy? She seems to get lost when algorithms are optimized to promote porn, and the conversation centres on the privacy and freedom of porn users. 

LR: Her privacy gets lost, but, in addition, when a story like that comes out, searches for the content are driven way up. When there is so much publicity around it, it immediately leads to a huge spike in people searching for the content itself, which retraumatizes victims of pornography. Similarly, every time there’s a news story about a celebrity’s nude photos being leaked, the next day that’s the top search.

SC: And more searches for certain content, like that girl’s rape, means the algorithm will learn to promote it more in its search results, correct?

LR: Correct.

SC: What can we do to stop failing these victims?

LR: We start by not victimizing them further in the technologies we’re building. For example, facial recognition algorithms and facial recognition databases are becoming an increasing concern. The current discussions around facial recognition mostly focus on criminality and the right to protest. So this topic is becoming a mainstream issue, for good reason, because we are worried about privacy and the right to free association.

But the glaring omission from this conversation is the impact of this technology on pornography and culture. We have to worry on two fronts:

1) That the software can be used against the victims of pornography in order to identify them. For example, in the past, a woman might have had an intimate photo of her published without her consent, but published anonymously. With facial recognition algorithms, it’s becoming more and more possible to identify someone in an image.

2) I worked for a company where I was the only female employee for a time. And our CEO was so excited to talk about how we could create an app that would let people search for porn stars based on an image you would upload of someone you know, such as a friend, a co-worker, a sister. The idea being that there’s someone that you “like,” so you upload a photo of them and an algorithm tries to match that photo to a database of porn stars.

SC: Are you kidding me?! That would be so violating!

LR: Absolutely! And despite the fact that I said, “We should not build that app under any circumstances,” my voice wasn’t taken seriously. It took another male at the company to say “absolutely not” for my concerns to be heard.

This is another issue when we talk about ethics in technology. We talk about needing to get more women in tech, when we’re actually here. We’re here now. But it’s not just a matter of numbers when the culture itself is so toxic and dismissive of women’s concerns that it invisibilizes women’s work and women’s critique of the work that exists.

SC: Is there hope to address sexism in software technology?

LR: The truth is, we’re not going to stop developing facial recognition software, because it’s a really powerful tool. But imagine if we built software that prioritized ethical choices. Maybe we could build software that helps identify images so that the victims of pornography could more easily get unwanted or illegal uploads of their private images removed.

It’s a choice of how we use the technology.
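
One hedged sketch of what such victim-centred software could look like: compare a compact “fingerprint” of each new upload against fingerprints of images reported for removal. The bit strings below stand in for real perceptual hashes or face embeddings; nothing here describes an existing system.

```python
# Match near-duplicate re-uploads against a registry of reported images.
def similarity(a, b):
    # Fraction of matching bits between two equal-length fingerprints.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def should_flag(upload_fp, registry, threshold=0.9):
    return any(similarity(upload_fp, fp) >= threshold for fp in registry)

registry = ["1011001110100101"]       # fingerprints of reported images
upload = "1011001110100111"           # slightly altered re-upload
print(should_flag(upload, registry))  # True: route it for removal review
```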

SC: And aren’t the stakes quite high? It sounds like we’re talking about serious issues endangering women’s privacy and safety.

LR: We are. And algorithms often create data, or glean it from your behaviour online, and that may not be data you’ve chosen to disclose to other people. For example, on Instagram you may follow someone and you’ll be recommended other people to follow. That data is generated by algorithms trying to get you to stay on the app and interact with more people on it.

This is an issue for women’s privacy in particular. Even if you are being safe online, the services you use create data that can be used to identify you: data you did not choose to disclose, like your networks, who you’re connected to online, and how you interact with them.

SC: This can be especially dangerous if a woman is trying to escape someone who’s harassing her, or a stalker, or an abusive ex-partner, right?

LR: Exactly. In the security world, they call this the “ex-boyfriend threat model.” When you think about online security, a lot of people think about hackers and big government agencies. But the truth of the matter is that the biggest security threat online is not a hacker, but someone you know, like an ex-boyfriend or an abuser. This threat has been taken seriously in security design, but not yet in the algorithm design community.

This was actually brought to light recently because it was how a reporter, Ashley Feinberg, was able to find the Instagram and Twitter accounts of the FBI director James Comey. She was able to do this based on who he followed and interacted with on the apps.

This means that if you have a stalker or abuser blocked on an app, they can still eventually connect the dots to identify you through the data of recommendations and connections disclosed without your consent by these algorithms.

When you think about it, you realize that someone in the FBI has all the resources at their disposal to have the most secure Internet experience possible and the best privacy. But even someone with those means and all those resources was still found. So think about what that means for your average woman who’s just trying to get away from an abusive ex.
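
The leak Rush describes can be shown with a toy “people you may know” recommender: it counts shared connections and never consults block lists, so a blocked abuser can still surface a victim’s new account. All accounts are invented.

```python
from collections import Counter

# Recommend accounts that the accounts you follow also follow.
follows = {
    "abuser": {"mutual_friend_1", "mutual_friend_2"},
    "mutual_friend_1": {"victim_new_account"},
    "mutual_friend_2": {"victim_new_account"},
    "victim_new_account": set(),
}

def recommend(user, follows, n=3):
    counts = Counter()
    for friend in follows[user]:                # accounts you follow
        for theirs in follows.get(friend, ()):  # accounts they follow
            if theirs != user and theirs not in follows[user]:
                counts[theirs] += 1             # shared-connection signal
    return [name for name, _ in counts.most_common(n)]

print(recommend("abuser", follows))  # ['victim_new_account']
```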

SC: Thank you for sharing your expertise on this important issue. Is there anything women can do to protect themselves?

LR: This all sounds very scary, and no one wants to fearmonger about the Internet. However, I do recommend that everyone do the basics of personal Internet security.

But the real way to start changing this technology is to make sure that we all get involved. That means having more conversations about it, learning more about it, and really taking seriously the fact that the technology you use impacts you and the world around you.

*This conversation has been edited for clarity and readability.

Susan Cox

Susan Cox is a feminist writer and academic living in the United States. She teaches philosophy.

