I Searched for “Women” on Getty Images. Time to Burn It All Down.

I’ve been writing on the internet for so long, you’d think little would surprise me anymore. And yet . . .

An outlet I write for has an editorial subscription to Getty Images, one of the largest photo resources on the interwebs. I’ve used Getty for years to find images for articles and blog posts. It’s a useful tool. I’ve run into some photo search issues before, mainly when writing parenting articles about spanking. (Doing an image search for “spanking” brings up a lot of . . . um . . . interesting results.) But a recent search that should have been benign basically made me want to burn everything down and start over.

Here’s what happened:

I was writing an article about violence against women and needed a generic photo of women to use as the feature image. Maybe one woman, maybe a few, perhaps without faces showing—something non-specific that could represent women in general.

So I did a simple search for “women” on Getty. That’s it. Just the word “women.” And holy crap. I’m not going to fully show you what came up because I like to keep my blog as family friendly as possible, but here’s an edited screenshot of the first few results, along with a description of the first ten photos.

1. Rihanna holding her hands in a triangle over her vulva with her tongue out

2. Anne Hathaway in a sheer black dress and no bra, where you can see every bit of her breasts

3. Paris Hilton from behind, with her red thong underwear showing above the waist of her jeans

4. A woman on a beach on someone’s shoulders, her bikini top removed and hanging above her bare breasts, surrounded by men reaching up to grope her

5. Rear view of a very large woman in a bikini, sitting on a beach

6. Britney Spears in a sheer, glittery two-piece outfit on stage, on her knees with her legs spread

7. Another photo of Rihanna grabbing her crotch on stage

8. Singer Melody Thornton on a red carpet in a sheer dress, braless, showing her breasts entirely

9. A Victoria’s Secret model walking down a runway in skimpy red lingerie, with what looks like a moisture spot at the base of her skivvies

10. Graziela Shazad in a sheer-topped dress, braless, where once again we can see her entire breasts

[Deep breath and sigh.] I’m a woman, folks. I’m not squeamish about breasts or women’s bodies. But why on earth would a search for “women” on a site with millions of photos bring up nothing but blatantly sexualized/objectified images?

I tried a different tack, and the results were even worse.

It’s worth pointing out that the 12th photo in my first search for “women” was a fully clothed women’s cricket team celebrating winning the World Cup. But that means 11 of the first 12 photos were highly sexualized. (#11 was five women on a stage in black string bikinis, BTW.) Scrolling through the rest of the first page, that ratio held fairly steady: about 90% of the photos were of mostly naked women, women in blatantly sexual poses, or women in some way being portrayed as objectified bodies.

I thought maybe the issue was that I had “Editorial Subscription” turned on. Maybe that filter was the problem. So I unchecked it and searched “women” again, hoping that a broader search would give me some basic, normal photos of women.

This is when I decided that maybe I should just poke my eyes out.

Again, not going to share an unedited screenshot because some of the photos would actually get me into trouble. But let me describe the first 10 that came up.

1. An old, black-and-white photo of women dancing the can-can. (So far, not so horrible.)

2. A former Playboy Playmate of the Year in a sheer shirt showing her entire breasts. (Oh boy, here we go.)

3. That Rihanna photo of her triangling her crotch that showed up in the first search

4. Nicki Minaj sprawled out on a rug in a G-string

5. Coco (actress and model, apparently) doing the splits in stilettos and a string bikini that barely covered her breasts and vulva

6-10. The next five (and beyond) were basically more totally see-through tops and bare breasts.

Where I finally hit the breaking point was in the 20th photo, which was full-on, in-your-face pornography. I won’t describe it here because I don’t want people searching for certain terms and finding my page, but we’re talking waaaay beyond anything I’ve described above. Genuine porn, no question. How that photo is even on Getty, I have no idea. But it was right there on the first page of a simple search for “women.”

(I’ve wrestled with whether or not to share the link to these results, but I think it’s valuable for people to see this for themselves. Please be warned that these are not safe for work, not safe for children, and not safe for people who really don’t want to see pornography. Here’s the link to the search.)

I was annoyed with Getty, but soon realized that the problem isn’t them. It’s us.

I was flabbergasted by these results. Flabbergasted and super annoyed. These are supposed to be editorial photos. According to Getty’s own description, “Editorial images include news, sports and entertainment images that show real-world people, places, events and things intended to be used only in connection with events that are newsworthy or of general interest (for example, in a blog, textbook, newspaper or magazine article).”

What on earth was going on? These are the photos we’re given for “events that are newsworthy or of general interest” when we search for “women”?

As I rage-scanned the screen, I noticed that there are four ways to sort photos: Best Match, Newest, Oldest, and Most Popular. I hadn’t realized that my search was automatically sorted by Most Popular.

When I clicked Best Match instead, here are the first photos that showed up:

[Screenshot of the “women” search results sorted by Best Match]

Huh. Whaddya know.

When I clicked “Newest,” this is what showed up:

[Screenshot of the results sorted by Newest]

And “Oldest”:

[Screenshot of the results sorted by Oldest]

How about that.

So “Most Popular” was apparently the problem. A search for “women” brought up page after page of sexualized images because those are the images people are pulling from Getty the most often. The issue isn’t really Getty; it’s the people using it. (Although, I will say that having pornographic photos on a public stock image site is HUGELY problematic. I should not have been able to pull that photo up without some kind of gateway or warning or something.)

The irony is that I made this discovery while writing about violence against women.

The entire reason I was looking for an image of women is that I’d written about a UN report finding that women’s families and intimate partners pose the biggest murder threat to women. Incidentally, that report points out that men who kill their female partners are usually driven by possessiveness: they treat women as objects they own and control at will.

I know there are those who don’t view sexualized images as problematic and who see flaunting female sexuality as empowering. But I’d bet dollars to donuts those are not the people responsible for these images being most popular. I’d bet good money that it’s mostly men who view women as sex objects, ogling and drooling like troglodytes over all the BOOOOOOBS.

I don’t think it’s a stretch to say that sexual objectification fuels violence against women. I think that’s why this struck me as hard as it did. That, and the fact that I’m raising two daughters in this world. I can’t help but imagine what they’d think if they did a search like this and came up with these results. What would that tell them about their identity as women and how society values them?

FYI, the results when I searched “men” weren’t much better.

In the name of social science, I figured I should test what would happen if I searched for “men” in Most Popular photos. Interestingly, the first image that showed up was that fourth photo from my initial search for “women” (the one of the woman on the beach with her top off). Lovely. The results for that search are a bit more varied, but many are also highly sexualized: lots of guys in Speedos and couples in sexual poses. I stopped scrolling, though, when I hit that same porn image again. Seriously, I don’t want or need these photos in my life, Getty.

Gender inequality is so wrapped up in objectification, I can’t help but find these search results maddening. I suppose they shouldn’t be surprising, but UGH. I know that there’s a lot that we could dive into here philosophically, societally, historically, etc. But regardless of how we want to intellectualize the whys of it all, it’s disturbing.

Do better, humanity.

P.S. There are two categories of images on Getty: Creative and Editorial. This post is talking about results in the Editorial category, not Creative. Most outlets I’ve written for want Editorial photos because they look more candid and less stock-like. If you try to replicate these results without clicking Editorial, you won’t get them (thankfully).

(Screenshots via Getty Images) 

 If you enjoyed this post, please pass it along. You can follow Motherhood and More on Facebook, Twitter, Pinterest, and Instagram.

Annie writes about life, motherhood, world issues, beautiful places, and anything else that tickles her brain. On good days, she enjoys juggling life with her husband and homeschooling her children. On bad days, she binges on chocolate chips and dreams of traveling the world alone.
