Facebook has been the topic of discussion for some time now. From fake Russian bots masquerading as Americans, to privacy issues, to misleading video metrics, social media companies have had their hands full as they try to maintain their reputations.
Given all the controversies, it wouldn’t be shocking to report that Facebook has yet again found itself in difficult circumstances. The British online newspaper The Independent says this about the most recent scandal:
Facebook has once again provoked controversy after the discovery it is possible to search for photos of female friends on its social network, but not male ones.
The feature was spotted by Belgian security researcher Inti De Ceukelaire, whose findings led to further revelations that Facebook prompts its users to search for photos of female friends in bikinis.
“Facebook has modified their creepy hidden search feature this weekend,” he tweeted earlier this week. “You can no longer retrieve hidden photos from your male friends. Women can/may still be stalked.”
He continued: “Even more: When you request photos from your male friends, Facebook assumes that you wanted to see pictures of women.”
Screen shots of the issue accompanied his tweets, prompting responses that Facebook is “sexist” due to the way its internal search feature functions.
Inti De Ceukelaire has gotten a lot of exposure over the past few years, mostly for the pranks he's pulled, all of which aim to expose security and privacy issues within the popular services we use or the conglomerates we depend on. A warning from him spread like wildfire, and people were immediately concerned and offended.
Well, if you’re like me and have to see it to believe it, the first thing you’ll do is go to Facebook’s search function and try it out. Here are some screenshots from our ‘experiment’:
TNW chimes in with their own results as well:
Switching out “female” with “male” returns something completely different. Instead of pictures of friends from within your social network, you’re instead shown a selection of pictures from across the social network. In our experience, these came from accounts and groups we did not follow. Facebook will also ask if you meant to type “female,” assuming you mistyped your query.
The lack of results for males is a little strange, and it is creepy that Facebook autosuggests ‘in bikinis‘. If this is an intentional algorithm on Facebook’s part, it is obviously very problematic. We all know there are stalkers, creeps, and overly hormonal teenagers on social media. But they shouldn’t be getting a helping hand from Facebook.
The Independent continues:
“These are abusive search suggestions and should be addressed,” said Jennifer Grygiel, an assistant professor of communications at Syracuse University.
Facebook originally stated in a comment to media outlets that it was the result of “a bug” with its search function, however the social network issued a further clarification to explain it is not a glitch but simply how the search feature works.
Facebook overhauled its search function in late 2018, promising that as you type, it will highlight things that are happening so you can follow popular stories as they unfold. We don’t know exactly how it auto-populates results, but the best guess is that it works much like Google’s Autosuggest algorithm.
The basic idea for Autosuggest is something like this:
Say you have a giant collection of queries. Take that collection of queries and prepare all of them in a way that allows you to keep the frequency of occurrence of each word.
Imagine a frequency distribution module measuring the number of searches for every word ever queried on Google. That’s how Google organizes and calls upon this data when you type it into the search bar. So if you start typing “Yo”, Google can filter out all the words not starting with “Yo” on their frequency distribution, and order the remaining words by highest to lowest frequency.
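This single-word step can be sketched in a few lines of Python. Everything here is invented for illustration — the toy query log, the `suggest` helper — since the real frequency data is the search engine’s private history:

```python
from collections import Counter

# Toy query log standing in for the real (private) search history.
query_log = ["youtube", "yoga poses", "young thug", "yoga mats", "yoga poses"]

# Count how often each word appears across all queries.
word_freq = Counter(word for query in query_log for word in query.split())

def suggest(prefix, k=3):
    """Return up to k words starting with `prefix`, most frequent first."""
    matches = [(w, c) for w, c in word_freq.items() if w.startswith(prefix)]
    return [w for w, _ in sorted(matches, key=lambda wc: -wc[1])[:k]]

print(suggest("yo"))  # ['yoga', 'youtube', 'young']
```

Typing “yo” filters the distribution down to words with that prefix and ranks them by how often people have searched them — exactly the filter-then-sort described above, just at toy scale.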
That explains how Google can predict a single word, but what about multiple words, or even sentences? Ask yourself, what’s the probability that I’m going to type the word “videos” given that I wrote “cat” first?
Probably pretty high, but that’s beside the point.
This step is commonly referred to as ‘training’. What that word obscures is that we are simply building a frequency distribution; if we normalize that frequency distribution, it becomes a probability distribution.
So all Google has to do is split your query into pairs where every word is followed by the next, and use the first word of each pair (the given word) to produce a list of suggested next words. And the more you type, the more your query is narrowed down, and the easier it is for Google to give you suggestions.
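The pair-splitting step above amounts to a bigram model. Here is a minimal sketch, again with a made-up query log and hypothetical helper names — not Google’s or Facebook’s actual implementation:

```python
from collections import Counter, defaultdict

# Toy query log; the real data would be billions of past searches.
queries = ["cat videos", "cat videos funny", "cat memes", "dog videos"]

# For each query, pair every word with the word that follows it (bigrams).
bigrams = defaultdict(Counter)
for query in queries:
    words = query.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

def next_words(word, k=2):
    """Suggest the k words most frequently seen after `word`."""
    return [w for w, _ in bigrams[word].most_common(k)]

print(next_words("cat"))  # ['videos', 'memes']

# Normalizing the counts for "cat" turns frequencies into probabilities,
# i.e. P(next word | "cat"):
total = sum(bigrams["cat"].values())
print({w: c / total for w, c in bigrams["cat"].items()})  # {'videos': 0.667, 'memes': 0.333}
```

Given “cat”, the model suggests “videos” first because that pair occurred most often — which is exactly why heavily searched phrases surface as autosuggestions.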
Therefore, you can reasonably deduce that the auto-population of ‘female friends in bikinis’ is a reflection of the most popular queries on Facebook.
WordStream puts this into perspective for us:
Each day, Facebook handles 1.5 billion searches against the 2 trillion posts in its index. Facebook is still a ways off from Google’s 3.5 billion daily searches, but it’s an impressive figure nonetheless and puts Facebook right up there as far as search giants go.
Of course, Facebook is only indexing content within its own ecosystem, while Google, Bing, Yahoo and the like make much of the web’s content available to searchers.
Of course, it’s unclear how many of these searches are strictly from people looking for bikini pictures. Leaked documents from earlier last year reveal a lawsuit against Facebook, dating back to 2015, brought by a developer who had, shockingly, created an app that let users filter specifically for bikini pics. With information like this, it seems reasonable to suggest that a lot of people are searching for these pictures on a daily basis.
But it is worth asking whether Facebook should scale back this kind of functionality, since Facebook differs from Google in that the queries you type relate largely to people, their profiles, and their data. TNW spoke with Inti on this issue after he stumbled across the search function results:
De Ceukelaire runs a site called StalkScan.com, which allows anyone to see what kinds of information their profiles are leaking, thanks to Facebook’s advanced Graph Search tools. Graph Search has been around in various forms since 2013, and allows users to parse through social data using natural language queries — queries like “photos of my female friends.”
Over the past few years, Facebook has quietly scaled back its Graph Search, removing it from public view and making it harder to access. That being said, it’s still publicly available, much to the dismay of De Ceukelaire. “I can’t believe this feature is still working,” he told me, somewhat aghast. “Nobody needs this.”
He continues by saying he believes Facebook has taken steps to stop his site, StalkScan.com, from working. He has run into multiple disruptions to his service after tweeting about Facebook’s ‘sexist’ search function.
It’s easy to mistrust Facebook after all its scandals, but is this just a mirror that’s being held up to us? Is Facebook simply reflecting back how we, as a sexist society, behave?
Should Facebook get rid of Graph Search entirely to avoid these types of results, or do they have no responsibility in this?
This autosuggest function is showing us some private perspectives on our ‘friendships’. Whereas you might never hint at sexual interest to a female friend in person, you may instead search for her bikini pictures in private.
If anything, these results open a discussion: are these pictures okay to look at because they were deliberately posted by the user, or is it still an uncomfortable, sexist truth that women are sex objects even to their friends?
Either way, Facebook did not go out of its way to offer you your friend’s bikini pictures and is scaling back on this function, albeit slowly. You can’t deny that this one is partly on us, the end user.
thank you facebook, extremely cool and normal algorithm pic.twitter.com/2ArET4TxuG
— please dm me chonky cats (@junkiechurch) February 16, 2019
When my wife gave birth to our son, I got multiple recommendations from Facebook’s algorithm to join groups that pushed anti-Vax agendas. I cannot stress this enough: get off of Facebook. https://t.co/SMC0YewawN pic.twitter.com/HmjiiS7hgA
— Alex Sanders (@ImAlexSanders) February 19, 2019
Pretty cool that the Facebook algorithm thinks I’m Jewish and 90% of my targeted ads are for different “birthright” trips to Israel
— Grace (@gracequinoa) February 13, 2019