The big ad-tech companies know how to sell ads without damaging privacy, but they choose not to.
It’s easy to see tech companies as a monolithic villain in the battle over consumer privacy. But in fact, there are countless tech companies, like mine, that believe that people have a fundamental right to avoid being put under surveillance and that it should be easy for them to exercise that right. By contrast, it is the big ad-tech companies — especially Facebook and Google — that do not want to make it easy for consumers to avoid profiling, because their business models rely on it. They are resisting a shift to other proven models of advertising, even though other companies are showing that those models work.
This distinction between Big Ad Tech and everyone else in tech is important to keep in mind as policymakers consider new regulations intended to protect consumers’ privacy. Executives of these big companies may individually make public statements welcoming federal regulation, but in practice they are doing everything they can to weaken existing laws and shape new ones in their own interests. This strategy is very obvious to the rest of us in the tech industry. And it’s essential to get these privacy laws right today, so that people have the opportunity to opt out of online tracking now.
The most significant example of Big Ad Tech’s influence on privacy regulation is in California, which passed the California Consumer Privacy Act in 2018. Industry groups representing Big Ad Tech are leading the charge to weaken the law through amendments that would exempt the sharing of personal data for ads, as Fast Company recently reported. This type of “exemption” is not a minor change; it would weaken the law so much as to make it almost meaningless. As Jacob Snow of the A.C.L.U. of Northern California put it, “A privacy law shouldn’t have a targeted advertisement exception for the same reason that an environmental law shouldn’t have a coal mining exception.”
Groups like the Information Technology and Innovation Foundation, which has board members from several big tech companies, are lobbying in Washington for similar ad-tech exemptions federally, as The Verge has reported. They argue that strong privacy laws would hurt the digital ad market, create high costs for businesses and curb innovation.
These are all weak arguments. There is no reason to fear that sites cannot still make money with advertising. That’s because there are already two kinds of highly profitable online ads: contextual ads, based on the content being shown on screen, and behavioral ads, based on personal data collected about the person viewing the ad. Behavioral ads work by tracking your online behavior and compiling a profile about you using your internet activities (and even your offline activities in some cases) to send you targeted ads.
Contextual advertising doesn’t need to know anything about you: Search for “car” and you get a car ad. Over the past decade, contextual ads have been displaced by behavioral ads, aided by the rise of real-time bidding technology that auctions off each ad on a site based on user profiling. These behavioral ads are the ones that leave a bad taste in your mouth. They follow you around from website to mobile app based on your private information and, intentionally or not, enable online discrimination, manipulation and the creation of filter bubbles.
Strong privacy laws will force the digital advertising industry to return to its roots in contextual advertising. That’s a good thing, since contextual advertising does not affect privacy in the same way. (My company uses only contextual advertising, and we compete with Google.)
In fact, Google search advertising began and still largely operates this way. When you enter a search request in Google, it displays ads that are relevant to that particular search, without needing to collect information about your search, location, purchase or browsing history. However, Google still collects all this information because it also powers non-search behavioral ads across the whole internet, on more than two million websites and apps that use Google’s ad services and on Google’s non-search properties, such as YouTube. Once people are allowed to opt out of behavioral advertising, then online ads for those who do can go back to being more like non-creepy contextual search ads.
This shift back to contextual advertising need not reduce profitability. A recent poll by Digiday of publishing executives found that 45 percent of them saw no significant benefit from behavioral ads, and 23 percent said they actually caused a decline in revenue.
What about compliance costs? Companies are quickly realizing that good privacy practices are a boon for business. People increasingly want to reduce their digital footprint and so choose companies that help them do so. Companies with good privacy practices in their DNA do not face significant compliance costs.
And if there is anything stifling innovation in Silicon Valley, it is the dominance of Big Ad Tech, wrought by the surveillance capitalism business model. When contextual advertising regains prominence, more companies will be able to compete against Facebook’s and Google’s ad networks because they won’t need huge troves of personal data to do so. Additionally, strong privacy laws will spur innovations to help companies use data in a privacy-respecting manner. In some fields, such as services that help businesses analyze how people use their sites, this innovation and resurgence of competition has already begun.
I am reminded of the arguments made in the 1960s and ’70s about laws to reduce toxic emissions from cars. Companies profiting from less regulation lobbied against those laws, and yet, once they were enacted, Americans’ health improved, innovations such as the modern catalytic converter entered the market, and big companies met the new emissions targets without catastrophic expense. If we enact strong privacy regulation, I believe we can be similarly hopeful about the future of privacy.
"What if We All Just Sold Non-Creepy Advertising?" by Gabriel Weinberg originally appeared in The New York Times on June 19, 2019.