

What Do State Privacy Laws Mean for the Ad Tech Industry?


Introduction

In the absence of comprehensive federal privacy legislation, states have passed their own privacy laws to protect their residents. In 2018, California became the first state to enact a comprehensive privacy law when the California Consumer Privacy Act (CCPA) was passed. In 2020, the state amended the CCPA by passing the California Privacy Rights Act (CPRA). In 2021, Virginia followed suit, passing the Consumer Data Protection Act (CDPA). Most recently, in July 2021, Colorado passed the Colorado Privacy Act (CPA). While there are differences, these laws prescribe similar responsibilities for covered entities in an attempt to protect the privacy of consumers.

Several other states have introduced privacy bills, including Florida, Washington state, Texas, Massachusetts, and Nevada. As the number of unique state privacy laws rises, businesses engaged in the buying and selling of consumer data will need to alter their practices in order to remain compliant. Companies that use advertising technology ("ad tech") to behaviorally target consumers are chief among these, as the digital ad industry's business model relies heavily on the collection, aggregation, interpretation, and disclosure of personal information. Advertisers and internet platforms argue that behavioral ads benefit consumers by fostering a more efficient and personalized web browsing experience, but these practices pose a serious risk to consumer privacy and have encouraged state lawmakers to address them directly. However, state privacy laws still rely largely on a "notice and choice" regulation model, and are not as broad or effective as the public might expect.

This brief will explain the ad tech ecosystem, the various intermediaries involved in the collection, aggregation, and use of consumer data, and the harms associated with these practices. Next, it will explain provisions and themes common to the California, Virginia, and Colorado privacy laws, examine how these laws live up to expectations, and identify where improvements are needed to properly protect privacy and curb harmful data practices in the ad tech industry.

Editorial disclosure: This brief discusses policies by Amazon, Apple, Facebook, and Google, all of which are funders of work at New America but did not contribute funds directly to the research or writing of this piece. New America is guided by the principles of full transparency, independence, and accessibility in all its activities and partnerships. New America does not engage in research or educational activities directed or influenced in any way by financial supporters. View our full list of donors at www.newamerica.org/our-funding.

Technical Overview

Much of the free internet is supported by the sale of consumer data for advertising. While users are generally aware of this, the mechanisms employed by the ad tech ecosystem remain opaque and complex.

Online advertising campaigns take different forms and generally focus on maximizing interactions with the consumers most likely to buy a product, change beliefs, or relate to the content. Publishers benefit from these advertisements as well: relevant content is more likely to keep users engaged for longer periods of time, increasing revenue and control. Contextual advertising is the practice of placing ads based on the content of a page. For example, a cosmetics company would likely seek to place ads on websites geared toward women, since the majority of readers fall into its target audience. In contrast, behavioral advertising, also known as surveillance advertising, targets ads to individual users based on data collected about their activity across the web. While contextual advertising was once the norm, the evolution of the internet and of consumer data collection capabilities has led to the rise of behaviorally targeted advertising. Combined with advanced ad tech and machine learning capabilities, serving precise behavioral advertisements is faster and more scalable than ever before.

Behavioral advertising and contextual advertising transactions begin in similar ways, with a publisher and an advertiser. However, the similarities largely end there. Through a web of interconnected technologies, software, servers, and programs, the ad tech ecosystem coordinates the automated purchase and sale of data-driven behavioral advertising on the internet in real time. This process funds many popular social media platforms, search engines, and email accounts, and allows these companies to offer free services.

In order to scale the traditional buying and selling of ads to suit the speed of the internet and process large sums of user data, publishers and advertisers rely on multiple intermediaries to facilitate behavioral targeting. These intermediaries, which include demand-side platforms, supply-side platforms, data management platforms, ad servers, ad exchanges, data brokers, and single-site ad platforms, work together to seamlessly exchange and analyze consumer data, determine the most appropriate ad placement, and facilitate ad sales, all in the fraction of a second before the consumer's web page loads. The result is an advertisement that reflects a user's supposed preferences and beliefs based on their web browsing activity.

In the ad tech ecosystem, publishers and advertisers exist on opposite ends of the spectrum. On the buy side of the equation, advertisers align with demand-side platforms (DSPs) such as LiveRamp, MediaMath, and Rocket Fuel, which manage advertisements and facilitate automatic bidding for ad space with multiple publishers at a time. Advertisers specify targeting criteria and bid prices to their DSPs, which are then used to help make instantaneous decisions about the value of a potential ad space. Often, demand-side platforms rely on data management platforms (DMPs) like Adobe Audience Manager and Salesforce Audience Studio to collect and analyze user data from multiple sources across the web, including data brokers, to curate precise audience segments and determine the most relevant and cost-effective ad placement. On the sell side, publishers align with supply-side platforms (SSPs), such as AdColony and Google Ad Manager, which assist publishers in connecting their available inventory to buyers and setting prices, payment terms, and criteria for acceptable advertisers. In comparison to pre-internet advertising, DSPs and SSPs make the process automated, more scalable, and more lucrative on both sides of the transaction. In addition, both the buy and sell sides use ad servers: web servers that work alongside DSPs and SSPs to store and serve advertisements, monitor campaign impressions, and manage inventory.

The bridge between the actors on the buy and sell sides is formed by ad networks and ad exchanges. Ad networks connect a finite number of publishers, through their SSPs, to potential buyers. Ad networks are analogous to brokers, connecting groups of buyers and sellers based on need. At the center of these intermediaries, acting as digital trading floors, sit ad exchanges. While technically any entity can buy or sell on ad exchanges, transactions usually flow through ad networks, DSPs, and SSPs. By relying on targeting specifications and user information provided by these entities, ad exchanges facilitate the buying and selling of advertisements at scale.

Data

Data is the lifeblood of the ad tech ecosystem, serving as the fuel powering the capability to narrowly target users with personalized ads. While each intermediary serves an instrumental role in the ad tech ecosystem, they would be useless without personal data. User data is derived from a variety of sources, and includes personal information like email addresses and phone numbers, geographic information, engagement data such as page views, clicks, and time spent on a particular page, and attitudinal data, which includes a user's opinions or feelings about a topic. When aggregated, this information has the potential to create highly specific user profiles, which are then loaded onto DSPs, SSPs, or their respective DMPs. This allows players on both the buy and sell sides to create specific targeting requirements and make informed, instantaneous decisions about which consumers would be best suited to be served a particular advertisement. User information is so illustrative that in some instances advertisers engage in one-to-one personalization, delivering a unique ad placement designed for the exact user being targeted.
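The aggregation step described above can be sketched in a few lines. This is an illustrative toy, not any vendor's API: it merges data points from the categories named in this section (personal, geographic, engagement, attitudinal) into a single profile, roughly as a DMP might before building audience segments.

```python
def aggregate_profile(records):
    """Merge (category, field, value) data points collected from many
    sources into one user profile, keyed by data category."""
    profile = {}
    for category, field, value in records:
        profile.setdefault(category, {})[field] = value
    return profile

# Hypothetical data points for a single user.
records = [
    ("personal", "email", "user@example.com"),   # shared directly
    ("geographic", "region", "CO"),              # location data
    ("engagement", "page_views", 42),            # behavioral signal
    ("attitudinal", "pets", "positive"),         # survey response
]
profile = aggregate_profile(records)
# profile["engagement"]["page_views"] -> 42
```

Even this trivial merge shows why aggregation is powerful: each record is innocuous alone, but the combined profile supports very narrow targeting.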

User data can be categorized in four ways based on who is collecting it and how it is obtained. Zero-party data refers to information that a user actively shares with a website, such as data collected through polls or surveys. Similarly, first-party data is information collected by an advertiser or publisher based on their direct interactions with customers, including subscription information, transaction history, and certain website analytics. Second-party data, relatedly, refers to zero-party or first-party data exchanged between affiliated entities such as business partners. The most contentious category of user information is third-party data: data collected by an entity that does not have a direct relationship with the consumer.

While considered less accurate than zero-party and first-party data, third-party data provides advertisers and publishers convenient access to a large amount of user information. Through tracking cookies, device IDs, location data, IP addresses, and browser fingerprinting, third-party data brokers compile detailed user profiles, which they in turn sell to advertisers, publishers, and their supply- and demand-side platforms to facilitate the automatic buying and selling of targeted ad space. Despite third-party data's popularity and instrumentality to the ad tech cycle, its support in the industry is waning. Mozilla Firefox and Apple's Safari browser both block third-party cookies by default. In 2020, Google announced its intention to remove all third-party cookies from its Chrome browser by early 2022. However, in June 2021, the company delayed that timeline until at least late 2023. While not yet completely extinct, the imminent demise of the third-party cookie on the internet's most popular platforms will likely prompt the industry to devote more resources to building out robust first-party data sets, relying more heavily on programs discussed below, such as Facebook's Pixel and Google's FLoC.

While generally considered to be a "privacy-friendly" alternative to third-party data, first-party and zero-party data still provide detailed information about users who interact directly with a website or brand on the internet, and that information can be incorporated into targeted advertising campaigns. Traditionally, zero-party and first-party data have been considered more accurate and tend to provide consumers with more control over their information. However, as tech giants like Facebook, Google, and Amazon expand their exclusive walled gardens (closed ecosystems that collect, store, and create their own first-party data sets and tools for advertisers), previous assumptions about this data become less accurate.

Facebook and Google, often referred to as the "duopoly," operate two of the most frequented sites on the internet and control substantial shares of the digital advertising market. With billions of users between them, the duopoly controls two of the most robust first-party data sets. To capitalize on this, Facebook and Google have each developed programs aimed at increasing profits by attracting advertisers and leveraging control over the ad tech market.

Facebook's Pixel is a first-party cookie advertisers place on their own websites to track a Facebook user's activity. The cookie relays this browsing information back to Facebook, where it is added to the site's trove of first-party data and used to inform targeted behavioral ad placements. Since Facebook keeps this data within its own closed ecosystem, prospective advertisers have little option but to rely on these data sets. Similarly, Google currently functions as its own opaque ad tech ecosystem, providing advertisers with tools and access to its first-party data sets. Currently, Google provides advertisers with flexibility to use some third-party data, with restrictions such as a prohibition on using third-party data to create audiences for targeting. This current configuration, however, is poised to change through Google's newest venture, Federated Learning of Cohorts (FLoC). FLoC is a subsection of Google's Privacy Sandbox, which will halt traditional cross-site tracking by collecting information directly through users' browsers without cookies and grouping users into cohorts based broadly on browsing habits, which advertisers can then use to inform ad placements. Because Chrome is the world's most popular web browser, the imminent elimination of third-party cookies promises to reshape the ad tech market drastically. Google maintains this is a positive step that will preserve privacy on the web. Privacy advocates disagree, arguing Google's cohorts make browser fingerprinting easier and subject users to potential cross-context exposure. While FLoC has not yet been implemented, it is clear that Google's first-party data will become significantly more important in the future, ultimately leading to even more revenue and control.
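Google's FLoC proposal described deriving cohort IDs from a user's recent browsing history with a SimHash-style locality-sensitive hash, so that users with similar histories land in the same cohort without their raw history leaving the browser. The toy below illustrates only that general idea; the function name and parameters are invented for illustration and do not reflect Google's actual implementation.

```python
import hashlib

def cohort_id(domains, bits=8):
    """Toy SimHash: derive a small cohort ID from a set of visited
    domains, so similar browsing histories tend to collide into the
    same cohort while the raw history stays on the device."""
    weights = [0] * bits
    for domain in domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            # Each domain votes +1/-1 on each bit of the fingerprint.
            weights[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i, w in enumerate(weights) if w > 0)

# Two users with identical histories always share a cohort.
a = cohort_id(["news.example", "shoes.example", "travel.example"])
b = cohort_id(["news.example", "shoes.example", "travel.example"])
# With 8 bits there are only 256 possible cohorts, so each cohort
# aggregates many users -- the claimed privacy property.
```

The critics' concern maps directly onto this sketch: a cohort ID is one more quasi-stable signal that, combined with other browser attributes, can make fingerprinting easier rather than harder.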

When the many pieces of the ad tech ecosystem come together, they deliver personalized content to users' screens in mere milliseconds. Collaboration begins when a user requests a publisher's web page from their browser. The browser notifies the publisher, and the publisher sends its page to the browser, which usually contains space for advertising content. The publisher immediately gets to work filling its available ad space. It first contacts its ad server, which has ads stored and queued for immediate placement. Based on the information the publisher or its DMP has relating to the user requesting the page, the ad server scans its reserves for a relevant placement. If no relevant ad exists in the server, the publisher contacts the SSP, which sends the user's information and the publisher's inventory out to the greater ad tech ecosystem; usually to an ad exchange, but also to other ad servers, ad networks, or DSPs. Once the request reaches the ad exchange, the profile of the user loading the ad, along with the available ad inventory and price requirements, is sent out to a seemingly limitless number of advertisers through ad networks, DSPs, and ad servers that bid on the advertisers' behalf. When the highest-bidding advertiser wins the space, its DSP communicates instructions to the publisher's ad server via the ad exchange and SSP. The publisher's ad server then forwards the instructions to the browser, which retrieves the content from the winning advertiser's server. Before the user even has a chance to notice, a highly targeted ad appears on their screen. This is only one example of how advertising technologies work together to deliver personalized content. In light of the varying size, capacity, and needs of different ad tech companies and platforms, in conjunction with continued innovation in the space, the precise way a targeted ad makes it to a user's screen, and the data used to get it there, may vary based on the intermediary technologies used.
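The bidding step in the sequence above can be sketched as a toy auction. The DSP bidding functions, prices, and field names below are invented for illustration; real exchanges layer on bid-request standards such as OpenRTB, strict timeouts, exchange fees, and first- or second-price rules.

```python
def run_auction(bid_request, dsps, floor_price):
    """Forward a bid request (user profile plus inventory details) to
    each DSP, collect bids, and award the impression to the highest
    bid at or above the publisher's floor price."""
    bids = {name: dsp(bid_request) for name, dsp in dsps.items()}
    eligible = {name: bid for name, bid in bids.items() if bid >= floor_price}
    if not eligible:
        return None  # unsold impression; the publisher shows a fallback
    winner = max(eligible, key=eligible.get)
    return winner, eligible[winner]

# Hypothetical DSPs bidding on advertisers' behalf using profile data.
dsps = {
    "dsp_sports": lambda req: 2.50 if "sports" in req["interests"] else 0.10,
    "dsp_travel": lambda req: 1.75 if "travel" in req["interests"] else 0.10,
}
request = {"interests": ["sports", "cooking"], "geo": "CO"}
result = run_auction(request, dsps, floor_price=0.50)
# result -> ("dsp_sports", 2.5)
```

Note that every DSP receives the full bid request, including the user profile, whether or not it wins; this broadcast of personal data to all bidders is a large part of the privacy concern with real-time bidding.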

Is Targeting as Lucrative as the Industry Claims?

Claimed Benefits

To justify its data collection practices, the ad tech industry tends to argue that behavioral advertising yields higher click-through rates (the ratio illustrating the frequency with which those served an advertisement actually click on or engage with it) and return on investment, because companies avoid wasting time and ad dollars serving content to uninterested consumers. These high returns, proponents argue, allow many sites to function without installing paywalls. The industry also claims consumers benefit from seeing ads for items they are likely interested in, reducing search time and providing a more enjoyable experience. Despite these arguments, research suggests that behaviorally targeted ads may deliver only a modest revenue advantage compared to other types of ad placements. Additionally, other forms of advertising can be more profitable for publishers. For example, in the wake of the EU's General Data Protection Regulation (GDPR), the New York Times completely switched from behavioral to contextual and geographical advertising in Europe, in addition to blocking open-exchange buying. Following the change, the Times reported, its advertising revenue continued to grow.
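Click-through rate, as used above, is simple arithmetic; the campaign numbers here are made up purely to show the scale involved:

```python
def click_through_rate(clicks, impressions):
    """CTR: the share of served impressions that were clicked."""
    return clicks / impressions

# Hypothetical campaign: 30 clicks on 10,000 served impressions.
ctr = click_through_rate(30, 10_000)  # 0.003, i.e., a 0.3% CTR
```

Because baseline CTRs are this small, even a tiny absolute improvement from targeting can be marketed as a large relative lift, which is worth keeping in mind when evaluating the industry's claims.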

Evidence also suggests that behavioral advertising may not be as accurate as promoted. In fact, a 2019 study found that targeted advertising based on gender was often inaccurate. Additionally, considering the opinion of the consumer, another study found that users report a large share of the ads served to them are irrelevant.

Potential Harms

In addition to the questionable level of benefit, this collection and use of personal information poses a number of risks. One prominent risk is that of inappropriate discrimination. The core function of behavioral advertising is to discriminate: to differentiate between users with certain preferences and serve them content based on these differences. While it is sometimes innocuous to serve advertisements based on gender, race, or socioeconomic status, this discrimination can also be harmful. Thanks to highly detailed profiles rich with user information, algorithmic ad delivery tools have the potential to exploit these differences and harm vulnerable communities. In 2019, Facebook faced legal action accusing the social media giant and its advertising platform of allowing housing and employment advertisers to target users inappropriately by race and gender. While the company asserts it has since changed its practices and that advertisers can no longer use Facebook's advertising platforms for discriminatory housing, employment, or credit ads, loopholes still exist. Harmful audience categorization that informs targeting also poses a risk, with some platforms suggesting advertisers target users based on sensitive interest categories or inferred personal characteristics.

State Legislation: Is It Making a Difference?

The pervasiveness of data-intensive practices such as behavioral advertising has led to a flurry of legislative action at the state level. As of August 2021, three states (California, Virginia, and Colorado) have passed comprehensive privacy bills, and several others have introduced similar legislation. Broadly, the laws impose certain responsibilities on covered entities engaged in the sale of user data, in addition to providing consumers with rights regarding their personal information. The laws also provide methods of redress against companies in violation, either through the enforcement power of state attorneys general or a private right of action for citizens. While some provisions in the trio of laws positively impact consumers and require ad tech companies to alter or reexamine their current practices, others are less effective and do little to reduce the risk associated with the industry's data collection practices.

Ambiguous Definition of "Sale"

Central to the California Consumer Privacy Act (CCPA), its successor the California Privacy Rights Act (CPRA), Virginia's Consumer Data Protection Act (CDPA), and the Colorado Privacy Act (CPA) is the broad definition of the "sale" of consumer data. While each uses slightly different language, all define the term more broadly than exchanging money for data. For example, the CCPA defines a sale as "selling, renting, releasing, disclosing, disseminating, making available or transferring a consumer's personal information by the business to a third party for monetary or other valuable consideration." The CPRA built on the CCPA's definition and explicitly covers the sharing of information for purposes of cross-context behavioral advertising: "the targeting of advertising to a consumer based on the consumer's personal information obtained from the consumer's activity across businesses." Colorado and Virginia adopted similar language, but provide key exceptions to the definitions, including transfers to affiliates (second-party data). Additionally, the CDPA leaves out the vague language "or other valuable consideration" present in the CCPA, CPRA, and CPA.

This definitional ambiguity causes general consumer confusion and allows for inconsistent data handling practices. Since much of the behavioral advertising process involves exchanging user data fluidly, and doesn't necessarily require a direct exchange of monetary consideration among parties, companies may interpret the requirements of the laws differently. The use of terms such as "other valuable consideration" in the CCPA, CPRA, and CPA also creates ambiguity, leaving the safety of consumer data in the hands of companies that must determine whether their practices constitute a sale absent substantial regulatory direction. Additionally, which relationships constitute an affiliation for purposes of the CDPA and CPA is also unclear. The lack of clarity has the potential to lead to compliance difficulties, and may end up doing more harm than good for consumers. As courts begin litigating complaints pursuant to these laws and attorneys general begin exercising authority to clarify definitions, a better understanding of what exactly constitutes a sale will inevitably emerge. Until then, however, the full scope of these laws remains vague.

Data Protection Assessment Requirements

The CPRA, CDPA, and CPA all require data controllers to comply with certain privacy audit requirements. In amending the CCPA, the CPRA created the California Privacy Protection Agency (CPPA), to which covered entities whose practices impose "significant risk" to user privacy must submit risk assessments. In determining when risk is significant, the law points to factors such as the size and complexity of the business, along with the nature of the processing activities. These risk assessments, which must be filed with the CPPA on a regular basis, require the covered business to identify whether its practices involve processing sensitive personal information and to weigh the benefits against the potential risks of the processing. While the CPA and CDPA do not establish designated privacy agencies, they also impose data protection assessments on covered entities and specifically require businesses engaged in targeted advertising to conduct audits explaining the risks and benefits associated with the profiling. These assessments are reviewed by each state's attorney general.

Data protection assessment requirements impose a few notable obligations on the ad tech industry. First, since the CPRA's definition of "significant risk" is broad, covered businesses engaged in behavioral advertising may be unsure what exactly their responsibilities are under the law and how these obligations vary from those imposed under the CCPA. As with the definition of sale, the true meaning of this provision will become clearer as the issue is litigated in California courts, but this will likely take time. For ad tech companies subject to the CDPA and CPA, the applicability of these provisions is clearer, but they will still need to ensure they comply with the detailed requirements involved in completing the required assessments, including keeping copious records of tracking activity and clearly identifying potential risks. While the required data protection assessments will be privileged and not automatically available to the public, they will still provide consumers with tangible benefits. Notably, risk assessments will help promote industry transparency with state governments, and aid in the investigatory and enforcement processes when a covered entity violates the law.

Opt-Out Requirements

California, Virginia, and Colorado all require covered entities to provide a clear and conspicuous method for consumers to opt out of the sale of their personal data, with Virginia and Colorado specifically requiring targeted advertisers to do so. Additionally, Colorado goes even further, requiring that companies engaged in targeted advertising honor a universal opt-out mechanism.

Critics argue that while these opt-out requirements impose some additional burdens on the ad tech industry and create the illusion of consumer privacy, they do not go far enough to properly protect citizens. Primarily, critics contend that an opt-out regime such as the ones established in the CCPA, CPRA, CDPA, and CPA shifts the onus to the consumer and relieves data controllers of too much responsibility, making privacy an option rather than the default approach. Additionally, while a universal opt-out mechanism is a step in the right direction and provides users with an accessible way to prevent online tracking, the CPA's provision does not take effect until July 2024. This means that even users who select a universal opt-out option will not have that choice honored in the interim.

Controller Duties

In addition to data protection assessments, the CPA also mandates that data controllers, including those engaged in targeted advertising, adhere to certain duties. These include the duty of care, duty of transparency, duty of data minimization, duty to avoid secondary use, duty to avoid unlawful discrimination, and duty to obtain consumer consent before processing sensitive data. These duties generally track the fair information practice principles promoted by privacy advocates globally, emphasizing consumer control of information and professional responsibility. On paper, these duties certainly seem to be a step in the right direction, providing more protection for consumers and placing greater responsibility on industry. However, only time will tell whether Colorado's attorney general will be able to enforce them in a meaningful way.

The Right to Cure

A "right to cure" is an opportunity for an at-fault party to remedy its violation of a statute or contract before enforcement action is taken. Under the CDPA and CPA, businesses are given a 30-day and a 60-day cure period, respectively. The CPA's cure window, however, will be phased out by 2025. Previously, the CCPA also provided a 30-day cure period; however, the CPRA will eliminate this when it officially takes effect in 2023, instead providing the CPPA discretionary authority to allow violators to cure on a case-by-case basis.

Right-to-cure provisions do not provide consumers with more protections for their personal privacy, and without meaningful oversight, it may be difficult to ensure offenders actually remain in compliance after taking the opportunity to correct their practices. Critics argue that cure provisions contradict the point of regulation in the first place and are too lenient on offenders, allowing companies to actively evade compliance unless they are caught. However, in California and Colorado, where the right to cure will eventually be phased out, covered entities may be incentivized to adjust their practices promptly in order to avoid incurring civil penalties. What's more, providing a cure period that will be phased out over time allows covered entities to experiment and adjust their practices through trial and error, creating a window to develop compliance strategies that best fit their business models without fear of penalty for mistakes made while adjusting.

Dark Patterns Prohibition

Dark patterns are "digital tricks," defined as "a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation." For example, a malicious actor may display a mobile advertisement with what appears to be a speck of dirt or a scratch on the user's screen in an attempt to manipulate them into tapping the banner, releasing a virus or malicious code. Dark patterns are also used by legitimate businesses through careful wording and deceptive web design. Recently, companies engaged in targeted advertising have reportedly used dark patterns to extract consent from users. To combat this practice, the CPRA and CPA both explicitly provide that consent acquired through the use of dark patterns is a violation of the law.

For ad tech companies covered under the CPRA and CPA, the dark patterns prohibition will likely not make much of an impact, aside from prompting companies to pay closer attention to how they present users with options and ask for consent. On the other hand, prohibiting dark patterns doesn't do much to advance consumer privacy, either. Opt-out consent regimes are disfavored by privacy advocates due to the inherent expectation that consumers alone are responsible for protecting their personal information. Prohibiting dark patterns doesn't fix this problem; it simply acts as a Band-Aid. If lawmakers advanced a "privacy by default" standard, rather than passing the onus to consumers, there would be little need for a prohibition on dark patterns at all.

What鈥檚 Missing?

Though the flurry of action at the state level has led to some positive legislative outcomes, the current laws in California, Virginia, and Colorado omit important protections, such as biometric privacy protections and data minimization principles, and lack creativity in their approaches to protecting privacy. Broadly, these three laws focus their efforts on providing users with means to control their data, without placing the onus on industry. One study found that with so much responsibility on consumers, the seemingly positive provisions in the CCPA were failing to adequately regulate data collection. The report also noted that consumers struggle to locate the "Do Not Sell" link required by the CCPA, and often are unsure whether their attempt to opt out was successful. The trend among state lawmakers to give citizens "control" of their data seemingly does little to help consumers, who continue to struggle to exercise their rights and remain at risk. Moving forward, lawmakers should shift more of the burden onto industry, incorporating meaningful provisions to advance the goal of privacy by design.

Conclusion

Despite the hurdles created by state privacy legislation, the biggest nuisance for players in the ad tech ecosystem may be establishing a compliance program that can address all of them. The scale and interconnectivity of the ad tech industry makes it likely that a company subject to compliance in one state is also subject to compliance in another, meaning these organizations must be privy to the nuances of each. While not detrimental to the business model, the differing jurisdictional requirements may result in varying protection for users based on their residence. The solution is federal privacy legislation, but Congress has had difficulty advancing a bill. Additionally, it remains unclear how effective state privacy legislation will be in regulating the rapidly changing ad tech industry moving forward. As walled gardens and first-party data collection practices become more prominent, and programs such as Google鈥檚 FLoC take hold, the usefulness of current laws may be lost and legislation may need to adapt to regulate more effectively.

With so much uncertainty in the industry and a partisan stalemate at the federal level, it is imperative state lawmakers focus on crafting privacy laws that shift the burden from consumers to industry. Companies engaged in behavioral advertising should be held accountable for their data handling practices through provisions that promote and encourage transparency, stricter corporate obligations, user rights and controls, and privacy by default.

More About the Authors

Meaghan Donahue

Legal Intern, Open Technology Institute
