How the online business model encourages prejudice


In September, a group of people searching for work in the US filed charges against Facebook and 10 other companies for discriminating against women by targeting certain job advertisements only at men. The employers, from sectors such as labouring and lorry driving, had used Facebook’s ad-targeting tools to direct the opportunities at those they thought most suitable – and this did not include women. Although the outcome of the case has not yet been decided, it could – along with similar cases – have a seismic impact on the future of digital advertising and, consequently, on the future of the net.

Allegations of discrimination have been made against Facebook’s advertising before. An investigation by ProPublica and the New York Times at the end of last year found that dozens of employers had used the platform to target job ads at particular age groups, meaning that those outside those ages did not see them. Around the same time, the Washington State attorney general’s office launched a sting operation to show how straightforward it was to use Facebook’s targeting tools to prevent certain ethnic groups from seeing ads in the US: it placed 20 phoney ads for jobs, apartments, insurance and other services and deliberately excluded one or more ethnic minority groups from receiving the notification.

When the attorney general’s office published its findings earlier this year, Facebook said it would alter its systems to prevent this kind of discrimination recurring.

But it is not just Facebook that has been accused of discrimination. Targeted ads have also become central to Google’s business model and a Carnegie Mellon study in 2015 found that women were much less likely than men to be shown ads for higher-paid jobs.

Facebook has dismissed the latest charges, saying: “There is no place for discrimination on Facebook; it’s strictly prohibited in our policies.” It has also adapted its tools when abuses have been reported in the past.

But even if Facebook escapes these charges, they are unlikely to end here. Indeed, there are likely to be more – lots more. This is because discrimination, in its literal sense, lies at the heart of targeted advertising. To discriminate means to select or distinguish based on identifiable characteristics. Or, as the Oxford English Dictionary puts it, “the power of discriminating or observing difference”. Targeted advertising, which has come to be the dominant means of advertising online, gives advertisers the power to discriminate based on any number of identifiable characteristics, such as age, gender, location, behaviour or interests.

Targeted advertising is to Facebook what faeces is to the dung beetle – its livelihood and its nourishment. Author and journalist David “Doc” Searls, who has written extensively on the problems of digital advertising, calls Facebook “a machine built for targeting”, adding, “Facebook doesn’t so much allow advertisers to discriminate against groups, it is designed to do exactly that.”

Should targeted advertising be found to be inherently discriminatory, the risk to Facebook, Google and the panoply of digital advertising companies is huge. Digital advertising, or “ad tech”, has become the dominant way in which communications services, news and information on the web are funded. Facebook, and its progeny Instagram and WhatsApp, rely on digital advertising for more than 95% of their income. Google search, Chrome, Google Maps and Gmail are all financed by digital advertising (not all of it targeted). And the influence of digital advertising goes much further than the Google-Facebook duopoly. Google advertising is integrated into more than 14m sites across the web (including the one you are on right now). More than six million advertisers use Facebook. Digital advertising is the goose that laid Google and Facebook’s golden egg. Were it to be found to be intrinsically discriminatory, that could undermine the entire superstructure of today’s digital economy.

While this might be a frightening prospect for the tech giants, and for those who have become dependent on digital advertising, it could be very healthy for politics and society. It was ad tech that allowed Russia’s Internet Research Agency to target inflammatory and divisive messages at more than 126 million Americans during the 2016 presidential election. Ad tech incubated a cottage industry of people inventing news purely for the sake of clicks and views. It also inspired a system of machine-driven personalised propaganda that makes all previous propaganda efforts seem technologically rudimentary. On top of all this, ad tech relies on behavioural tracking, works only at phenomenal scale – and is chronically and inherently opaque.

Constant, intrusive personal tracking is essential to ad tech. This is how Facebook, Google and others convince advertisers that they can reach whom they want, when they want. When you are next reading a news article on the web, check to see if you can spot a little Facebook “like” symbol on the page. That is not there merely so that you can tell your friends you like the article, but to allow Facebook to flesh out your profile for advertisers.

Still, at least you can see the little Facebook symbol. Any website can also add the Facebook pixel – a tracking pixel invisible to the human eye. More than 2m sites have done so, allowing them to track the users who visit and to target them with ads after they leave.
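The mechanics of a tracking pixel are simple enough to sketch in a few lines of Python. This is a hypothetical illustration, not Facebook’s actual code: the host name, cookie and helper function are all invented. The point is that when a page containing the pixel loads, the reader’s browser requests a tiny image from the tracker’s server, and the request itself carries the data.

```python
# Hypothetical sketch of how a third-party tracking pixel "phones home".
# Names and values here are illustrative, not any real tracker's API.
from urllib.parse import urlencode

def pixel_request_url(tracker_host, page_url, user_cookie):
    """Build the URL a browser would fetch for an invisible 1x1 image.
    The query string tells the tracker which user visited which page."""
    params = {"page": page_url, "uid": user_cookie}
    return f"https://{tracker_host}/pixel.gif?{urlencode(params)}"

url = pixel_request_url("tracker.example.com",
                        "https://news.example.com/article-about-cars",
                        "user-12345")
print(url)
# The tracker's server logs this request: it now knows that user-12345
# read an article about cars, and can add that to the user's profile.
```

The image is a throwaway; the server log is the product. Repeat this across 2m sites and a detailed portrait of each user’s browsing emerges.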

Google is similarly voyeuristic. Google Analytics enables organisations to measure the traffic to their websites, while feeding Google the information it needs to target ads. A 2016 study of ad-tracking technology found Google Analytics on almost 70% of the top 1m websites. When you visit any of these sites, Google knows you are there and can use this knowledge to tailor ads to you.

It was not always this way. When Larry Page and Sergey Brin started Google, they wanted to distance themselves from advertising. Advertising had, they said, corrupted other search services. Google would be different: it would keep advertising at arm’s length. Over time, to keep impatient investors and venture capital wolves from the door, it started to use relevant keyword advertising to support Google search.





Google’s co-founders, Larry Page, left, and Sergey Brin, 2004. Photograph: Ben Margot/AP

Then it realised it could extend relevant advertising to sites across the web. A few years later it went further still, starting to serve the banner ads you see across the top of websites. Each time it spread its advertising empire, it moved further along the road of scale, automation and tracking (of watching us, in other words), all in order that it could help advertisers grab your attention.

You know when you first go to a web page and some of the boxes around the page do not immediately fill up with ads? That is not because you have a slow internet connection but because as soon as you went to that site, your personal details were thrown on to an ad exchange where advertisers started frantically bidding to win your attention. The more you are worth to them – based on who you are, where you live and countless other fragments of personal information – the more they bid. The advertiser that bids the most for your attention wins the auction and gets to put their ad in the box on your webpage. All this in the split second that it takes the page to load.
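The auction described above can be modelled in a few lines of Python. This is a deliberately simplified, hypothetical sketch – real exchanges run far more elaborate auctions in a fraction of a second – but it captures the logic: each advertiser values you according to the profile fragments it holds, the highest bidder wins the slot, and (in the common second-price design) pays the runner-up’s bid.

```python
# Simplified, hypothetical model of a real-time bidding auction.
# All bidder names, attributes and amounts are invented for illustration.

def run_auction(user_profile, bidders):
    """Each bidder scores the user from its targeting data; the highest
    bid wins the ad slot and pays the second-highest bid."""
    bids = []
    for bidder in bidders:
        value = bidder["base_bid"]
        # A bidder pays more the more of your attributes it is targeting.
        for attribute in user_profile:
            value += bidder["targeting"].get(attribute, 0)
        bids.append((value, bidder["name"]))
    bids.sort(reverse=True)
    winner_name = bids[0][1]
    # Second-price rule: the winner pays the runner-up's bid.
    price_paid = bids[1][0] if len(bids) > 1 else bids[0][0]
    return winner_name, price_paid

profile = {"age_25_34", "london", "luxury_travel"}
bidders = [
    {"name": "AirlineCo", "base_bid": 0.10,
     "targeting": {"luxury_travel": 0.50, "london": 0.20}},
    {"name": "GenericAds", "base_bid": 0.15, "targeting": {}},
]
winner, price = run_auction(profile, bidders)
print(winner, price)
```

Here AirlineCo, which is targeting two of the user’s attributes, outbids the untargeted advertiser – a toy demonstration of why the system rewards knowing as much about you as possible.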

From an advertiser’s perspective, this seems great. They are offered the chance to show ads to exactly the people they want, at a price they can afford. Not only that, but they can see – if it gets clicked – whether the ad has worked. Yet as well as relying on mountains of personal information about you, this system deliberately and systematically diverts money away from authoritative sites, towards fringe, false and extreme sites – if it is cheaper for an advertiser to reach the same person at an unreliable, low-end or hyper-partisan site rather than a high-end, reliable one, then why not use the cheaper site?

Next time you are reading news online, jump from the site you are on to one that is much less well known, perhaps one with whose politics you disagree. See the ads around the page? These are not there to support that site (although that is what they end up doing), they are there to reach you.

As I was writing this article, I popped on to Breitbart News and found an ad for “Peter Jones’ Plan for Growth”, right above a piece attacking George Soros. The Washington Post did a more thorough search than me and found countless ads from large corporations beside antagonistic, hyper-partisan and discriminatory articles. Beside one headlined “Are Liberal Pervs Sexually Obsessed With Refugees?”, for example, the paper found an advertisement for Hertz cars. These ads do not just lend credibility to the articles and the sites, they fund them too.

Facebook, Google and other ad tech players will do all they can to show that they do not discriminate. When they are caught red-handed, they will adapt their services, mute certain categories, enlarge the size of the groups that advertisers are able to target and make it easier to direct ads using alternative criteria (such as interests).

Yet, eventually, it will become clear that at its core, targeted advertising enables discrimination. Once this becomes widely acknowledged, then the system will have to change, will have to become less discriminatory and less opaque and consequently – from ad tech’s perspective – will become less effective. Then, who knows, many sites may need to search for alternative methods of funding and perhaps – perhaps – we will find a better and healthier way to fund the digital economy. One can but hope.

Democracy Hacked: Political Turmoil and Information Warfare in the Digital Age by Martin Moore is published by Oneworld (£16.99). To order a copy for £16.41 go to guardianbookshop.com or call 0330 333 6846. Free UK p&p over £10, online orders only. Phone orders min p&p of £1.99
