January 17, 2019
Not for nothing do most commercial websites—from social media platforms to news outlets to online retailers—collect oodles of data about their users’ personal attributes, from age and current employer to preferences, activities, and behaviors.
In fact, our personal characteristics bring in a lot of money: Targeted advertising represents the core of Facebook’s business, which brings in more than $40 billion in revenue each year, The New York Times reports.
Through users’ clicking, posting, article-sharing, and activity elsewhere online, Facebook builds up an ad profile for each of them. That profile includes information as basic as age and location, as well as hobbies, political leanings, family type, and more. Advertisers use that information to direct tailored messages to users.
But how well do Americans understand these algorithm-driven classification systems, and how much do they think their lives line up with what gets reported about them?
Interestingly enough, about 74% of Facebook users were unaware that the company compiles a list of their personal traits and interests for advertisers on its site. Half of the users who viewed the Facebook page with that data—known as their “Ad Preferences”—said they were not comfortable that such information about them was so readily available.
“Privacy matters to Americans—it’s a classic American value—yet when they’re online and doing other things, they act as if their personal information is O.K. to harvest and analyze,” Lee Rainie, Pew’s director of Internet and Technology Research, said in a January 16 interview with The New York Times. “One of the theories on this inconsistency is that Americans don’t really know what’s going on.”
About 88% of the users had listings on their Ad Preferences page, Pew said. The page says that it allows users to “learn what influences the ads you see and take control over your ad experience.”
“Pew’s findings underscore the importance of transparency and control across the entire ad industry, and the need for more consumer education around the controls we place at people’s fingertips,” Joe Osborne, a Facebook spokesman, said in a statement. “This year we’re doing more to make our settings easier to use and hosting more in-person events on ads and privacy.”
But questions around how that data can be misused to manipulate people—and how much they know about its collection in the first place—have put tech companies like Facebook on the defensive. Tech companies have responded by promoting tools that they say offer transparency around their business practices, including “Ad Preferences” and a similar product from Google called “Ad Settings.”
Pew’s survey also took a closer look at two of Facebook’s more controversial user labels, which are determined by algorithms: political leanings and “multicultural affinities.” Facebook decides whether a user has an “affinity” for a minority group, such as African-Americans or Asian-Americans, which can then be used to target ads.
When it comes to politics, about half of Facebook users (51%) are assigned a political “affinity” by the site. Among those who are assigned a political category by the site, 73% say the platform’s categorization of their politics is very or somewhat accurate, while 27% say it describes them not very or not at all accurately. Put differently, 37% of Facebook users are both assigned a political affinity and say that affinity describes them well, while 14% are both assigned a category and say it does not represent them accurately.
Only about 20% of Facebook users say they are listed as having a “multicultural affinity.” Overall, 60% of users who are assigned a multicultural affinity category say they do in fact have a very or somewhat strong affinity for the group to which they are assigned, while 37% say their affinity for that group is not particularly strong. Some 57% of those who are assigned to this category say they consider themselves to be a member of the racial or ethnic group to which Facebook assigned them.
Research contact: @paulhitlin