Matt Savare

Partner
Lowenstein Sandler LLP
New York, U.S.A.

tel: 646.414.6911

Profile

A veteran of high-profile representations in the digital advertising, media, and entertainment sectors, Matt brings a proven track record to his work for a broad range of clients. 

Matt has represented clients in copyright, trademark, trade secret, and right-of-publicity matters—with a particular emphasis on how new and emerging technologies are disrupting traditional businesses—in the following sectors:

  • Blockchain and Cryptocurrencies
  • Software (SaaS, PaaS, development, services)
  • Big data 
  • Social media
  • Advertising, financial, and education technology
  • Media and entertainment
  • Life sciences 
  • Retail (online and brick and mortar) 
  • Beauty and fashion
  • Food and beverage
  • Government contracting
  • Investment management

His work also includes counseling clients on information privacy and data security issues (including the California Consumer Privacy Act [CCPA]), cybersquatting, domain name disputes, and technology licensing. He represents The Estée Lauder Companies Inc. in connection with various investments and acquisitions, with a particular emphasis on intellectual property and right-of-publicity issues. He also represents News Corp. in connection with its digital advertising initiatives, and regularly drafts and negotiates endorsement, sponsorship, and personal appearance deals for athletes, celebrities, and major brands. 

During his time as a litigator, Matt handled various entertainment, intellectual property, false advertising, right-of-publicity, and privacy disputes, including defending a copyright infringement suit filed by the Estate of Frank Zappa and assisting in the successful defense of David Chase in connection with The Sopranos.

Prior to joining Lowenstein, Matt worked for the Department of the Army, negotiating and drafting multimillion-dollar procurements for communications and electronics equipment and related services, developing expertise in the FAR, DFARS, and AFARS.

Bar Admissions

    New York
    New Jersey

Education

Seton Hall University School of Law (J.D. 2004), summa cum laude, Articles Editor, Seton Hall Law Review, Valedictorian
Monmouth University (M.A. 2001), summa cum laude, Co-valedictorian
Drew University (B.A. 1995), summa cum laude
Professional Career

Significant Accomplishments

Represents companies involved in every facet of the ad tech ecosystem, including the world's leading publishers, ad networks, ad exchanges, DSPs, SSPs, and data aggregators.


Represents numerous hedge, private equity, and other funds and their portfolio companies regarding data privacy, data security, and information technology and services agreements.

Regularly drafts and negotiates endorsement and personal appearance deals for athletes, celebrities, and major brands.

Speaking Engagements

Matt Savare will moderate the panel discussion "Wearable Tech – What Is It? What Privacy Professionals Need to Know!" Sponsored by the New Jersey KnowledgeNet chapter, which is co-chaired by Cassandra Porter, the program will include a discussion of wearable technologies and the associated privacy implications. The event will conclude with a wine and cheese reception.

Lowenstein Sandler and Osborne Clarke will host "The ABC's of Blockchain: Altcoins, Bitcoins, and Coin Offerings." The explosive growth of blockchain and cryptocurrencies over the past year has led to an unprecedented number of ICOs and new crypto tokens globally. The panelists will provide an overview of blockchain and the recent business, legal, and regulatory issues surrounding cryptocurrencies and initial coin offerings from U.S., European, and Asian perspectives.

For more information, email [email protected]

Matt Savare kicked off the event with an overview of blockchain, setting the stage for the Young Pros audience to first gain an understanding of the technology before identifying its role in advertising and media buying/selling. He covered various related terminology and often provided analogies to break it down when further explanation was needed.

From cookies to pixel tags and mobile devices to connected televisions, privacy and data issues abound in the world of digital advertising. Lowenstein's Matt Savare moderates a lively panel with privacy professionals from the various constituencies in the complex ad ecosystem to discuss the legal and regulatory landscape, commercial contracting issues, and what’s next.

Moderator: Matt Savare, Partner, Lowenstein Sandler LLP

Panelists:

  • Michael Hahn, Senior Vice President & General Counsel, IAB & IAB Tech Lab
  • Daniel Shore, Privacy Counsel, Conversant LLC
  • Julia Shullman, Vice President and Chief Privacy Counsel, AppNexus
  • Travis Davis, Counsel, Hearst

This session takes place 4-5 p.m. on October 4, 2018. The conference is being held at the Cloyd Heck Marvin Center at George Washington University, 800 21st Street, N.W., Washington, D.C. 20052.

Join us for our 4th Annual Cyber Day. This half-day program features sessions led by Lowenstein lawyers and other industry leaders who will discuss how companies can navigate cybersecurity, blockchain, and data privacy issues as well as the cyber insurance market in order to operate in a post-GDPR business landscape.

Topics include:

  • Cyber Risks: Where to Find Coverage and How to Maximize Recovery for Cyber Claims 
  • A Global Perspective: Status Report on the Impact of GDPR and What You Need to Know About the Evolving U.S., Federal, and State Data Privacy Laws
  • Government Investigations: How to Prepare and What to Do
  • Beyond Bitcoin: An Introduction to Blockchain


The program runs 7:30 a.m.-2 p.m. Program location: Lowenstein Sandler LLP, One Lowenstein Drive, Roseland, New Jersey 07068; 973.597.2500. CLE credit available.

In-house lawyers in industries far beyond the tech world–such as financial services, pharmaceuticals, insurance, and consumer electronics, to name only a few–need practical guidance on the many ways that cybersecurity and privacy issues can affect all stages of business, from the valuation of data as an asset to the allocation of risk.

In response to this need, Lowenstein Sandler has expanded our annual program to include an even deeper dive into cybersecurity issues of special interest to GCs, CPOs, and CIOs. Our interdisciplinary group of privacy and data security specialists has teamed with in-house counsel to develop programming aimed at helping corporations and executives navigate the potential risks, regulations, and benefits at stake, as well as best practices for addressing these issues.

Topics include:

  • Data Protection Law Developments: A Year in Review and What to Expect in 2020
  • Artificial Intelligence: Preparing for the Future of Business
  • Cyber Insurance: What It Covers, Why You Need It, and How To Get It
  • Blockchain Promises Solutions Across Industries, But Will it Deliver?
  • Telehealth and Telemedicine: The Future of Health care?
  • Biometric Data: From Finger Scans to Facial Recognition, a Deeper Dive into Artificial Intelligence
  • State Privacy Laws: A Deeper Dive into New and Amended U.S. State Privacy and Cybersecurity Laws

Program time: 7:30 a.m.-2:15 p.m. 

Program location: Lowenstein Sandler LLP, One Lowenstein Drive, Roseland, New Jersey 07068; 973.597.2500. 

CLE credit available.

Wi-Fi access and conference space will be available to take phone calls and stay connected to your workday.



Professional Associations

Trustee, Black Maria Film Festival


Seminars

"Legal Aspects of Independent Film-Making," Temple University, November 2002, 2003, and 2004

Professional Activities and Experience

Accolades
  • Named advisor on blockchain issues - Section 809 Panel on Streamlining & Codifying Acquisition - 2017
  • Member - IAB Tech Lab Blockchain Working Group
  • Member - IAB Tech Lab Education and Taxonomy Working Group
  • Top 40+ Digital Strategists - Online Marketing Institute - 2014
  • Rising Star - Super Lawyers New Jersey - 2013-2014
  • Variety - 2011
  • Outstanding Volunteer of the Year - New Jersey Volunteer Lawyers for the Arts - 2011
  • Distinguished Alumnus - Monmouth University - 2011

Articles

Fear of Brave? An Analysis of GDPR Challenges to Behavioral Advertising
Lowenstein Sandler LLP, November 2018

On September 12, 2018, a complaint was submitted to the Irish Data Protection Commission on behalf of Johnny Ryan, Chief Policy and Industry Relations Officer at Brave Software, Inc., seeking to trigger, for the first time, an EU-wide investigation into certain data practices within the digital advertising industry...

Inflection Point for VR?
Lowenstein Sandler LLP, November 2017

 Despite predictions over the last several years that virtual reality (VR) and augmented reality (AR) were going to dominate consumer technology, adoption and sales have been slower than many had forecasted...

Beyond HTML5 and Java: What Developers and Publishers Need to Know When Creating Mobile Health Apps
Lowenstein Sandler LLP, November 2016

Mobile app usage has penetrated nearly every industry and facet of our lives, from banking and dating to transportation and dining. This is especially true in the health and wellness sector. In 2014, the number of U.S. consumers using mobile health apps was only 16 percent. Two years later, this percentage doubled to 33 percent and continues to grow rapidly...

Additional Articles

Rampant, worldwide cybersecurity incidents such as data breaches, phishing attacks, and malware across various industries have caused tremendous damage (physical, monetary, and reputational) to companies and consumers. One only has to open a newspaper (likely a digital one) to witness a new attack. In recent years alone, billions of records have been compromised in massive, high-profile breaches.

Although not a new phenomenon, ransomware has emerged as another nefarious cyber threat. Insurance company Beazley reports that the number of ransomware attacks reported by its insureds quadrupled from 2015 to 2016. In 2016 and 2017, various strains of ransomware such as WannaCry and Petya have entered the global lexicon and wreaked havoc on hundreds of thousands of computers.

Is it time for virtual and augmented reality to enter the mainstream? And what are the IP pitfalls to avoid when embracing this evolving technology?

Read more about the myriad intellectual property issues that may arise for companies involved in virtual, augmented, and mixed reality in "Inflection Point for VR?," published in Intellectual Property Magazine and authored by partner Matt Savare and associate Leah Satlin.

 

Interested in learning more about open-sourcing issues and related risks developers face when using ethereum as a foundation for their own applications?

Read more about the challenges and potential pitfalls involved in using ethereum's open-source code in "Coders Beware: Licensing Issues Abound for Ether Apps," published in CoinDesk and authored by partner Matt Savare and associate John Wintermute.

*This was originally published on CoinDesk.

By now, most people have heard about bitcoin.  Fewer people, however, understand the difference between bitcoin and blockchain and even fewer people have heard about – or dealt with – Ethereum and smart contracts.  Read more about how smart contracts can help businesses in various industries and how in-house counsel and their outside law firms can create and implement smart contracts in "Get Smart: Automating Business Agreements with Blockchains and Smart Contracts," published in Global Banking & Finance Review®, by Matt Savare and Kurt M. Watkins.

This article was originally published in Global Banking & Finance Review® 

 

The State of California recently enacted into law the new, sweeping California Consumer Privacy Act of 2018, which will go into effect on Jan. 1, 2020. Experts estimate that the Act will apply to more than 500,000 U.S. companies, reaching businesses of various sizes in virtually every sector. To be sure, the Act will have profound implications for the digital advertising industry, given its breadth and seeming extraterritorial reach. In this article, we explore the scope of the law and how its more significant provisions will impact digital advertising.

Artificial intelligence (AI) has moved from the silver screen and into our homes and businesses. Without even knowing it, most people encounter some manifestation of AI in their daily lives. From intelligent personal assistants such as Siri and Alexa to integrated AI systems such as IBM’s Watson, AI has already had — and will continue to have — a profound impact on the way we work, play, and live.

As attorneys immersed in the technology space, we have witnessed, assessed, and counseled how technology, including AI, will affect the legal profession and the role of general counsel. Although technological changes are often incremental, the sum of those changes with respect to the increased adoption of AI will be enormous. For example, the ability of artificial intelligence to handle certain repetitive and mundane tasks of a legal department in a consistent and efficient way will precipitate a shift in focus of the legal team and likely reduce the legal resources necessary to perform many tasks.

In this article, we discuss three key areas that we believe AI will disrupt and how they will affect general counsel: (1) the automation of routine legal and compliance services and the corresponding repurposing of existing resources; (2) document assembly and analysis; and (3) the need to shift from tactical to strategic thinking.

 

It was the scenario that many people in the digital advertising world feared: yet another privacy law. In the past several months, the industry has had to digest not only the GDPR, but also the sweeping California Consumer Privacy Act of 2018. Most recently, on July 23, 2018, two Democratic state senators from New Jersey introduced a privacy bill, S2834, which, if passed and signed into law, would saddle the industry with additional, burdensome regulations. Equally important, the New Jersey bill is modeled on the poorly drafted California law, increasing the belief that other states will adopt the same template for their own regulations.

The bill requires operators of commercial internet websites or online services to notify customers of the collection and disclosure of their personally identifiable information (PII). As with the California law, the bill’s definitions are so broadly drafted that they are virtually boundless. For example, “customer” means “an individual within [New Jersey] who provides, either knowingly or unknowingly, personally identifiable information ….” Unlike most other state privacy laws, which generally protect the states’ residents, the bill purports to cover any individual within the state, regardless of residency.

Similarly, the definition of “personally identifiable information” is so expansive that it is difficult to conceive of any online service – especially one in the digital advertising ecosystem – that does not collect personal data. Under the proposed law, personally identifiable information means “any information that personally identifies, describes, or is able to be associated with a customer of a commercial Internet website or online service….” The definition lists 20 non-exhaustive categories of PII, including not only customary ones such as name and address, but also less-sensitive information such as height. Again, unlike many privacy laws – including California’s – there is no exception for anonymized, de-identified, or aggregated data.

Like the California law, the bill includes certain notice and disclosure obligations and imposes a “Do Not Sell My Personal Information” restriction. Specifically, the bill mandates that:

  • Any operator that collects PII of a customer:
    • Provide a notice in its privacy policy that includes, at a minimum, (1) a complete description of the PII collected, (2) all third parties to which it may disclose such PII, and (3) an email address or toll-free telephone number that the customer may use for specific privacy inquiries
    • Clearly and conspicuously post on its website or online service homepage a link titled “Do Not Sell My Personal Information,” which enables the customer to opt out of the disclosure of the customer’s PII
  • If an operator discloses a customer’s PII and receives a request from the customer, the operator must, within 30 days of the request, provide the following information at no cost: (1) the customer’s PII that it disclosed in the past 12 months and (2) the names and contact information for the third parties that received the customer’s PII

Similar to the California law, the bill prohibits operators from discriminating against or penalizing any customer who elects to opt out of the disclosure of his or her PII. However, unlike the California law, the bill does not expressly permit the operator to charge such a consumer a different price or rate or provide a different level or quality of service to account for the fact that the operator will no longer be allowed to commercialize the data. If the California law is suspect on constitutional grounds, it is difficult to argue how this provision in the New Jersey bill does not constitute an unauthorized taking.

The panoply of disparate, overlapping, and often contradictory state privacy laws is increasingly making operating in the online space, including the digital advertising industry, complicated at best and untenable at worst. If states continue to use the flawed California law as a template for their own privacy statutes, confusion, uncertainty, and compliance costs will continue to rise.

In spite of the various benefits of federalism, there are instances where federal legislation is required. Data privacy is one such area. There are signs from the White House and the Commerce Department that the federal government may step in and create a national privacy standard. Although similar prior efforts have died on the vine, the confluence of GDPR, the Cambridge Analytica scandal, and the numerous competing state laws may compel the federal government to step in and address not only the legitimate privacy concerns of consumers, but also the business and operational realities of living in a digital, connected world.

On September 12, 2018, Johnny Ryan, Chief Policy and Industry Relations Officer at Brave Software, submitted a complaint to the Irish Data Protection Commission seeking to trigger, for the first time, an EU-wide investigation into online behavioral advertising (OBA). The complaint primarily focuses on real-time bidding (RTB), the process often used within the digital advertising industry to carry out OBA. The complaint alleges that (i) OpenRTB, the widely-used technical protocol for RTB promulgated by IAB Technology Laboratory, constitutes a ‘‘mass data broadcast mechanism’’ that violates the General Data Protection Regulation (GDPR), (ii) there are no technical measures or adequate controls to support data protection during the RTB process; and (iii) legitimate interest can never be a valid legal basis in the context of widely-broadcast RTB requests. (A companion complaint filed with the UK Information Commissioner’s Office contains virtually identical allegations.)
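The "mass data broadcast" characterization turns on what a single bid request contains and how widely it is shared. As a rough illustration only: the field names below follow the public OpenRTB 2.x specification, but the values and the selection of fields are invented for this sketch.

```python
# Illustrative OpenRTB 2.x bid request (all values fabricated).
# A request like this may be broadcast to dozens of bidders at once,
# which is the basis of the "mass data broadcast" allegation.
bid_request = {
    "id": "auction-123",
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
    "device": {
        "ip": "203.0.113.7",                       # IP address
        "ua": "Mozilla/5.0 (example)",             # user agent string
        "geo": {"lat": 40.7128, "lon": -74.0060},  # device location
    },
    "user": {
        "id": "exchange-user-4711",    # exchange's pseudonymous user ID
        "buyeruid": "dsp-user-9042",   # buyer's matched cookie ID
    },
}

# Fields a regulator may treat as personal data under the GDPR:
personal_fields = [
    bid_request["device"]["ip"],
    bid_request["device"]["geo"],
    bid_request["user"]["id"],
    bid_request["user"]["buyeruid"],
]
print(len(personal_fields))
```

Because each of these fields travels to every recipient of the request, the complaint argues the protocol itself, not any one participant's misuse of it, is the problem.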

In its October 30 decision, French supervisory authority, the CNIL, declared that French ad tech startup, Vectaury, violated the EU General Data Protection Regulation by not obtaining valid consent for its collection and use of geolocation data from partner apps and bid requests for targeted advertising purposes. Most notably, the CNIL concluded that Vectaury’s consent management platform, based on IAB Europe’s Transparency & Consent Framework to a certain degree, did not provide sufficiently transparent information about the purposes of data processing.

Since the TCF is used throughout the digital advertising industry to obtain consent, prognosticators are predicting the decision will end the advertising ecosystem and real-time bidding as we know it. However, we argue the CNIL’s key findings show that most of Vectaury’s consent deficiencies either violated, or were not caused by, the TCF and the TCF-specific issues are remediable.

It has been a particularly rough time for the digital advertising industry recently. In September 2018, complaints were submitted to the Irish Data Protection Commission and the UK Information Commissioner's Office seeking a declaration that the two most widely used real-time bidding protocols are "mass data broadcast mechanisms" that violate the GDPR. Then, in late October, the French supervisory authority, CNIL, declared that French ad tech startup Vectaury violated the GDPR by not obtaining valid consent for its collection and use of geolocation data from its partners' apps and real-time bid requests for targeted advertising and profiling purposes. Most recently, on December 3, the Office of the New York Attorney General (NYAG) announced a record settlement with Oath, formerly known as AOL, for violating the Children's Online Privacy Protection Act (COPPA).

What makes the Oath settlement so newsworthy isn’t simply that Oath has agreed to pay a record-setting amount—almost $5 million—to settle allegations that, as AOL, it violated the federal privacy statute. This settlement is significant because it has established a new standard for notification under COPPA, with wide-reaching ramifications for the broader digital advertising ecosystem.

For background, COPPA mandates, among other things, that no personal information may be collected, used, or disclosed from children who are under 13 years of age without verifiable parental consent. Typically, COPPA applies to those websites and mobile applications designed primarily for such child audiences, such as the very popular Roblox website mentioned by the NYAG in its announcement. As of 2013, COPPA expanded the traditional definition of personal information to include persistent identifiers, such as device and location information, which have historically not been considered to require protection in the United States. COPPA is a strict liability statute that applies to any "operator" of a website or "online service" "directed to children," or any operator that has actual knowledge that it is collecting or maintaining personal information from a child. Both the FTC and state attorneys general have the authority to enforce COPPA.

Traditionally, COPPA enforcement at the federal and state level has focused on website operators and app developers whose users fall directly in the under-13 demographic. In June 2016, the FTC announced a then-record $4 million settlement (which was suspended to $950,000 based upon the company's financial condition) with InMobi for deceptively tracking users without their permission, contrary to its representations, and, importantly, for deceptively tracking users under 13 years of age whose age had been explicitly flagged to the company. And, in September 2016, the NYAG announced the results of "Operation Child Tracker," which focused on violations of COPPA by some of the most popular children's websites. In both instances, however, the focus of law enforcement was on the website or application that was directly servicing the customer. In those cases, notice that content was COPPA protected was simply a matter of evaluating the companies' own content.

The most recent NYAG COPPA enforcement against Oath changes what notice means for a company operating in a COPPA-protected environment. According to the NYAG, AOL’s offending conduct was rooted in its operation of ad exchanges to conduct business and serve online behavioral advertising (otherwise known as targeted advertising) on websites that AOL knew were subject to COPPA.

The most significant aspect of this settlement involves what the NYAG asserted to be actual knowledge in this instance. First, as described by the NYAG, AOL received information directly from its customers that its websites were subject to COPPA and nevertheless served targeted ads to those users. Second, AOL conducted independent reviews of the content and privacy policies of websites, made a determination that those websites were subject to COPPA, and nevertheless served targeted advertising. Finally, AOL disregarded notifications it received from other ad exchanges during the bid process that particular ad inventory was subject to COPPA. In some instances, according to the NYAG, this disregard for COPPA flags was done purposefully to increase revenue. In short, notice was imputed based upon COPPA flags or identifiers passed along from one part of the ad tech stack to another. And, AOL disregarded those flags at its peril.
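The flag-passing the NYAG described has a concrete technical form: the public OpenRTB 2.x specification defines a regs.coppa attribute (1 = inventory subject to COPPA) that travels with each bid request. A minimal sketch of the kind of guard a bidder might apply follows; the function name and surrounding logic are invented for illustration.

```python
# Sketch of a bidder-side guard against the conduct described in the
# settlement: honoring a COPPA flag passed in an OpenRTB bid request.
# regs.coppa (1 = subject to COPPA) is defined in the public OpenRTB
# 2.x spec; everything else here is illustrative.

def may_serve_behavioral_ad(bid_request: dict) -> bool:
    """Return False whenever the inventory is flagged as COPPA-covered,
    regardless of how profitable the targeted impression would be."""
    coppa = bid_request.get("regs", {}).get("coppa", 0)
    return coppa != 1

flagged = {"id": "bid-1", "regs": {"coppa": 1}}
unflagged = {"id": "bid-2"}

print(may_serve_behavioral_ad(flagged))    # False: suppress targeting
print(may_serve_behavioral_ad(unflagged))  # True
```

Because COPPA is a strict liability statute, the settlement suggests that a check of this kind must sit in front of any targeting decision, not merely be available as an option.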

Companies across the digital advertising ecosystem, from publishers and SSPs to advertisers, DSPs, and exchanges, should pay special attention to this recent COPPA settlement and evaluate their own systems for COPPA compliance. Given that COPPA is a strict liability statute, it is possible that even unintentionally passing along targeting information when a COPPA flag is in place could result in liability. More broadly, with the recent decisions and guidance relating to GDPR and the impending implementation of the California Consumer Privacy Act, such companies should evaluate their data collection, use, and disclosures policies and procedures to ensure that they are complying with these myriad and complex regulatory requirements.

Companies should consider reviewing contractual terms and the implementation of individual contracts for compliance and consistency. And, companies should consider training line agents in how to identify potentially improper transactions before those transactions are made. Given the increased attention by regulators in this space, industry members should contemplate adopting a comprehensive compliance system to manage their risk.

Reprinted with permission from the December 27, 2018 edition of Legaltech News.

© 2018 ALM Media Properties, LLC. All rights reserved.

Further duplication without permission is prohibited. ALMReprints.com – 877.257.3382 - [email protected].

On January 28, 2019, the Panoptykon Foundation filed a complaint with the Polish Data Protection Authority against IAB Europe on behalf of an individual, alleging that OpenRTB, the widely used real-time bidding ("RTB") protocol promulgated by IAB Tech Lab, violates numerous provisions of the General Data Protection Regulation (the "GDPR"). The complaint recycles many of the same arguments made to the Irish Data Protection Commission and the UK Information Commissioner's Office in 2018; we analyzed those arguments in a previous article.

The complaint argues that IAB should be considered a data “controller” under the GDPR for all processing activities undertaken through OpenRTB.  As discussed below, such a contention would dramatically (and improperly) expand the definition of “controller” and should be rejected by regulatory authorities.

CONTROLLER FRAMEWORK

The GDPR regulates “processing activities” (i.e., discrete operations performed on personal data).  An entity is either a data “controller” or a data “processor” with respect to such processing activities. In other words, the determination of an entity as controller or processor must be analyzed in relation to whatever processing activities are in question.

All processing activities have at least one controller. A controller determines the “purposes” and “means” of such processing activities, either alone or jointly with other controllers.  The “purpose” is why a processing activity is being carried out and the “means” are how that processing activity will be carried out.

Although the definition of "controller" has been interpreted broadly, guidance and case law require that an entity, alone or with others, have a certain level of factual "influence" on the purpose and means of processing to be considered to have "control." Indeed, the Article 29 Working Party (the "WP") provides that "[b]eing a controller is primarily the consequence of the factual circumstance that an entity has chosen to process personal data for its own purposes." (Emphasis added).

The WP further explains that determination of the means “would imply control when the determination concerns the essential elements of the means.” (Emphasis added).

Determination of the "means" therefore includes both technical and organizational questions where the decision can be well delegated to processors (as e.g. "which hardware or software shall be used?") and essential elements which are traditionally and inherently reserved to the determination of the controller, such as "which data shall be processed?", "for how long shall they be processed?", "who shall have access to them?", and so on. (Emphasis added).

Thus, although determining the purpose of a processing activity automatically renders an entity a “controller,” an entity determining the means of processing is considered a controller only when such determination concerns the essential means.

THE JEHOVAH’S WITNESSES CASE

The complaint relies primarily on one case from the European Court of Justice ("ECJ") to support its claim that IAB is a controller with respect to all processing activities undertaken through OpenRTB: Case C-25/17 (the "Jehovah's Witnesses case"). The complaint also cites Case C-210/16, which we discuss and distinguish in our previous article.

In the Jehovah’s Witnesses case, Jehovah’s Witnesses members (i.e., preachers) went door-to-door to convert others to their faith.  The members wrote notes on their visits, such as the names of the people they visited, their addresses, and summaries of their conversations.  The Data Protection Supervisor claimed that the Jehovah’s Witnesses religious community (the “JWC”) was a controller in relation to the notes taken by their members during this door-to-door preaching.

The JWC contended that it was not a data controller because it did not determine the purposes and means of processing, alleging (1) the JWC did not formally require the collection of notes by its members and (2) the JWC did not have access to the members’ notes. 

The ECJ, along with the Advocate General, analyzed the JWC’s potential role as a data controller in relation to the specific processing activity in question: members’ note taking when door-to-door preaching. The ECJ held that the JWC “organized and coordinated” the preaching to such a level that it defined the purposes and means of processing in the context of that preaching jointly with its members.  The Advocate General emphasized that the JWC: (1) “gave very specific instructions for taking notes;” (2) allocated areas among the members to better organize the preaching and increase the chances of converting individuals; (3) kept records on how many publications the members disseminated and the amount of time they spent preaching; and (4) kept a register of individuals who did not want to be visited.

ANALYSIS OF THE COMPLAINT

First, the complaint alleges that "IAB is one of the two leading actors...in the market of behavioural advertising which organises, coordinates and develops the market by creating specifications of an API...that is utilised by companies that participate in [OpenRTB] auctions in ad markets...Those specifications are accompanied by the rules of their application [in the IAB Transparency and Consent Framework]."

The complaint does not claim that IAB is the controller of any specific processing activities. Instead, it claims that IAB is the controller of all processing activities undertaken through OpenRTB that relate to “behavioural advertising” because it is a “leading actor” that “organizes, develops and coordinates the market.” In other words, the complaint asserts that IAB’s control over the market is similar to the JWC’s control over its preachers: IAB co-determines what personal data shall be processed, and why, for all participants within OpenRTB and, by extension, the multi-billion-dollar behavioral advertising industry.

The complaint's reliance on the “organized and coordinated” language within the Jehovah's Witnesses case ignores all context in which it was used. "Organization and coordination" was only relevant because it resulted in determination of the purpose and means of processing within that particular fact pattern. In other words, the JWC's "organization and coordination" of the members' note-taking activity amounted to such tight control that it was viewed as instructing them to process personal data on its behalf for a discrete purpose and, thus, determining the "purpose and means of processing" as a joint controller.

However, the level of control in the Jehovah’s Witnesses case is readily distinguishable. The JWC instructed its members (i.e., its preachers) to go door-to-door for the discrete purpose of expanding the JWC's membership (i.e., converting others to its faith). To ensure the effectiveness of the activity, the JWC "organized and coordinated" the activity by assigning members to different areas, keeping track of how long they preached and how many publications they distributed, and providing detailed instructions on what notes to take (i.e., what personal data to process).

Conversely, IAB’s promulgation of the OpenRTB standard does not result in IAB instructing or directing another business as to what personal data it shall in fact process or transfer through the protocol, or for which purposes. Unlike preachers answering to a centralized authority, entities do not engage in processing activities over OpenRTB at the behest of, or for a shared purpose with, IAB.

A standard’s primary objective is to define technical requirements for interoperability among various systems. As such, it allows entities to carry out activities more efficiently through a common “language.” However, this development of a communication medium by which processing activities may be carried out must be differentiated from the processing activities themselves. The purpose for initiating processing and each subsequent operation relating thereto (e.g., collection, transmission to downstream partners) is determined at the business level. IAB’s definition of the protocol’s structure provides the technological capability for entities to carry out such activities, but it is not, in itself, a processing activity or a “purpose.” The alternative would create a virtually unlimited definition of “controller.”

Likewise, IAB’s theoretical ability to lessen the processing capability of the protocol – such as eliminating the “device identifier” field – also cannot be a criterion by which control is decided in this context. If that ability were exercised, it would only limit the range of processing activities possible within that technology (activities that organizations need not carry out in the first place). This attenuated “control by exclusion” leads to bizarre results and is not what the GDPR intended. For example, any provider that releases technology capable of a range of processing activities would become a controller of all such activities if it ever put limits on that capability.

Second, the complaint also alleges that “IAB has full control of how the behavioural advertising market within...[OpenRTB] is designed and operates, so it decides at its own discretion how the processing of personal data is to be carried out, e.g., by determining the elements that must be included in the so-called bid request, i.e., a request for submission of bids in an ad market.”  In other words, the complaint argues that IAB, through OpenRTB, decides the essential means of all processing activities carried out under the technical protocol.  

When analyzed against the aforementioned elements highlighted by the WP, IAB does not determine the “essential means” of the processing activities carried out over OpenRTB:

  • Which data shall be processed?
    • IAB does not decide what data will be processed when entities carry out processing activities via OpenRTB. The complaint conflates the capability to allow entities to process personal data with deciding what data shall be processed for any given processing activity.
    • The Panoptykon Foundation and related parties argue that because almost all bid requests sent via OpenRTB contain at least a device identifier, and OpenRTB documentation recommends device identifiers be included in bid requests, IAB decides that entities shall process device identifiers. Such an argument is unavailing. Technology providers routinely recommend that personal data be processed for added business value (e.g., SaaS platforms). No one has seriously argued that these providers automatically become joint controllers by virtue of doing so. The decision to process such data is left to the autonomy of the user.
  • For how long shall the data be processed/when should the data be deleted?
    • IAB does not mandate retention periods. Such a decision is solely within the business’s own legal judgment for what satisfies the storage limitation principle for its particular processing activities.
  • Who shall have access to the data?
    • This decision is, again, controlled entirely by the entities using OpenRTB and differs dramatically depending on the context in which advertising may be transacted (e.g., open auction, private marketplace, programmatic, or direct).

Third, according to the complaint, IAB is a controller because it has general knowledge that processing is happening through OpenRTB:

[T]he JWC, which collects information on its members (as opposed to information on persons visited by its members), becomes, by creating guidelines and maps, a joint controller of personal data of persons visited by its members despite the fact that it does not establish any direct interaction with those persons and has no access to such data. This instance can be directly compared with IAB, as its role is also to provide guidelines and specifications to companies that participate in auctions in ad markets. Like the community analysed by the CJEU, IAB has a general knowledge of the fact that processing is carried out and knows its purposes (matching ads in RTB model), as well as it organizes and coordinates activities of its members by way of management of... [OpenRTB].

Although it is correct that an entity does not need direct access to data to be a controller, it still needs a level of control sufficient to determine the purposes and means of processing.  Much of the above quote is a restatement of the complaint’s previous arguments examined herein.

However, there is one slightly different argument presented: “IAB has a general knowledge of the fact that processing is carried out and knows its purposes...” and thus is a controller. This “general knowledge” standard that the complaint proposes further ignores all context and nuance from the Jehovah’s Witnesses case. Clearly, a company’s general knowledge that processing is being carried out through its technology, and its awareness of the purpose for which it is carried out, cannot alone make it a joint controller; otherwise virtually every technology company (e.g., standards bodies, SaaS platforms, and software companies) would be a joint controller.

The CJEU’s mention that the JWC was aware of its members’ processing of personal data (i.e., note taking) was to demonstrate that excessive formalism (i.e., an express written statement telling individuals to process data) should not be required when an entity has such a level of control that it, practically speaking, instructs others to carry out processing of personal data for a defined purpose. As mentioned above, providing guidelines and technical specifications to companies, or simply knowing that personal data is being processed through OpenRTB, does not amount to the level of control contemplated by the GDPR or prior case law to become a joint controller.  

Finally, the complaint contends: “The argument that [OpenRTB] created by IAB is only a technical protocol which does not obligate particular companies to process data is ill-advised and not true. [OpenRTB] enables and facilitates the processing and dissemination of data as the protocols that are connected with [Open RTB] include certain fields that are so designed that they trigger transfers of data, including sensitive data....”

Notwithstanding such allegations, OpenRTB does not obligate companies to process personal data. None of the required fields to send a bid request per OpenRTB documentation contain personal data. Fields at issue in the complaint, such as the description of a URL’s content, geolocation, device identifier, and user agent string, are optional. This optionality makes sense because OpenRTB is used for activities outside of “behavioral advertising,” such as digital out-of-home and contextual advertising, and in such cases personal data is often irrelevant or not processed.
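To make the distinction between required and optional fields concrete, the following is a minimal sketch in Python. The field names (`id`, `imp`, `device.ifa`, `device.ua`, `device.geo`, `user.id`) follow the public OpenRTB 2.x documentation; all values are hypothetical, and this is an illustration of the point above, not a complete or authoritative bid request.

```python
import json

# Minimal bid request using only the required top-level attributes per the
# OpenRTB documentation: an auction "id" and at least one impression ("imp").
# Nothing here identifies a person or device.
minimal_request = {
    "id": "auction-123",                        # unique auction ID
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
}

# The same request enriched with the OPTIONAL objects at issue in the
# complaint: device identifier, user agent, geolocation, and user ID.
enriched_request = {
    **minimal_request,
    "device": {
        "ifa": "a1b2c3d4-example",              # device identifier (optional)
        "ua": "Mozilla/5.0 (example)",          # user agent string (optional)
        "geo": {"lat": 52.23, "lon": 21.01},    # geolocation (optional)
    },
    "user": {"id": "u-789"},                    # user ID (optional)
}

# The sender, not the protocol, decides which version to transmit.
personal_data_objects = {"device", "user"}
assert personal_data_objects.isdisjoint(minimal_request)
print(json.dumps(minimal_request))
```

The point of the sketch is simply that a structurally valid bid request can be sent without any of the personal-data-bearing objects; including them is a choice made by the transmitting business.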

As mentioned above, the complaint conflates providing entities the capability to process personal data, or recommendations to process personal data (e.g., bid request examples and recommended fields within OpenRTB documentation) for added business value, with having the requisite control to decide what data shall be processed by a particular entity. 

CONCLUSION 

It seems evident that behavioral advertising critics want IAB to amend OpenRTB so that no personal data is capable of being transmitted at all. In their singular focus on bringing about this result, these parties advocate expanding the definition of controller so broadly as to render almost all companies in every industry controllers.

If they succeed, although they may accomplish their own goals, they may also stop (or diminish) the willingness of technology providers and standards bodies to engage in the European market.  Here, we should heed the Advocate General’s warning in the FashionID opinion that excessively expanding the scope of what constitutes a controller will create such a lack of clarity that it “...crosses into the realm of actual impossibility for a potential joint controller to comply with valid legislation.”


[1] The Complaint is filed against IAB Europe; however, IAB Tech Lab, rather than IAB Europe, promulgates the OpenRTB standard. For convenience, we use the term “IAB” throughout this article.

[2] All quotes taken from the complaint have been translated from their original Polish.

The digital advertising industry is undergoing a rapid regulatory transformation. The EU General Data Protection Regulation went into effect more than a year ago, and the California Consumer Privacy Act is right around the corner with a Jan. 1, 2020, effective date. Other jurisdictions are likely to follow. Industry lawyers created legal frameworks to comply with the GDPR but now need to determine what changes are needed to comply with the CCPA and, potentially, future privacy laws in other states.

One important part of that assessment is the data processing addendum.

Chapter excerpt: Talia Joy Castellanos, a thirteen-year-old YouTube and Instagram star, passed away on July 16, 2013, after a six-year battle with neuroblastoma and leukemia. Talia’s social media postings on makeup, fashion, and her battle with cancer connected with fans worldwide. In addition to accumulating millions of YouTube and Instagram followers, Talia appeared on The Ellen DeGeneres Show and was named an honorary face of CoverGirl cosmetics. At the time of her death, the videos on Talia’s YouTube channel had accumulated tens of millions of views and were valuable assets.

Talia’s social media following created extensive and lucrative digital assets that raise a special set of questions about how these assets should have been managed before and upon her death. Who owns Talia’s videos and other postings? Who has the right to control her accounts?

______________________________________________

The American Bar Association's Tax, Estate, and Lifetime Planning for Minors focuses exclusively on the pertinent issues facing adults when planning for younger family members. This updated edition looks at the many and often complicated issues involved in these situations, including taxation, education funding, insurance, disability, and even planning for an adult’s incapacity or death. Combining core legal concepts with practical wisdom, Tax, Estate, and Lifetime Planning for Minors is a unique resource for attorneys advising clients in their estate and financial planning when minor children are involved. This is a ready reference not only for seasoned practitioners but for the general or novice practitioner handling their first estate plan.


In August 2018, Congress expanded the authority of the Committee on Foreign Investment in the United States (CFIUS) to review, block, or unwind certain transactions involving foreign investment without “control” of key US assets, businesses, or technologies. Following the passage of the Foreign Investment Risk Review Modernization Act (FIRRMA), CFIUS, acting on authorities granted in FIRRMA, issued new regulations establishing a pilot program under which certain transactions involving “critical technologies” in 27 targeted industries are subject to review.

Unlike pre-FIRRMA CFIUS submissions, which were technically voluntary, the new pilot program submissions are mandatory. (See 31 CFR Part 801.) Failure to submit a now-required CFIUS Pilot Program submission can carry a maximum civil penalty equal to the value of the underlying transaction. The pilot program went into effect on November 10, 2018.

For transactions covered by the pilot program, which currently covers 27 critical industries, parties must submit a mandatory declaration if a transaction would give a non-US person control over the business or when a foreign person does not even gain “control” but merely makes a particular investment.

Pain Points and Requirements

The core new “pain point” is the possibility of a mandatory CFIUS Pilot Program (PP) filing, unlike the earlier filings, which were “voluntary.” Broadly, and with certain exceptions, the PP sets forth a two-part test to assess whether a mandatory filing applies to a transaction.

  1. The transaction must involve a Pilot Program US business and either (a) could result in foreign “control” of the PP US business, or (b) even absent “control,” would afford the foreign person (i) access to “any material nonpublic tech info” possessed by the PP US business, (ii) membership on a board or other governing body of the PP US business, or (iii) any involvement (other than through voting of shares) in substantive decision making of the PP US business about the use, development, acquisition, or release of “critical technology.”
  2. The US business that is the recipient of the foreign investment must be one that produces, designs, tests, manufactures, fabricates or develops a “critical technology” that is (i) utilized in connection with the US business activity in a PP industry or (ii) designed by the US business specifically for use in one or more PP industries.

Shortly after the pilot program went into effect, the Department of Commerce’s (DOC’s) Bureau of Industry and Security (BIS) issued a proposed rule to add 14 technology categories to the list of emerging technologies. These categories could also become subject to the mandatory CFIUS declarations and could impose new export license requirements through amended or additional Export Control Classification Numbers (ECCNs). An ECCN is an alphanumeric designation (e.g., 1A984 or 4A001) used in the Commerce Control List (CCL) to identify items for export control purposes. Companies or individuals that wish to export items, technology, or software on the CCL may be required to obtain an export license, depending on the item being exported (as informed by its properly determined ECCN) and the country to which the item is being exported. Among these 14 additional technology categories were robotics, quantum computing, and artificial intelligence (AI).
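The ECCN structure described above (e.g., 1A984 or 4A001: one category digit, one product-group letter, three further digits) can be sketched as a rough format check. This is purely illustrative; the function name is ours, and passing the check says nothing about whether an item is actually controlled, which requires a proper classification analysis.

```python
import re

# Structural pattern for a standard ECCN: category digit (0-9),
# product-group letter (A-E), then three digits -- e.g., 1A984, 4A001.
ECCN_PATTERN = re.compile(r"[0-9][A-E][0-9]{3}")

def looks_like_eccn(code: str) -> bool:
    """Rough format check only; not a classification determination."""
    return ECCN_PATTERN.fullmatch(code) is not None

# Examples from the text above:
print(looks_like_eccn("1A984"))  # True
print(looks_like_eccn("4A001"))  # True
print(looks_like_eccn("EAR99"))  # False -- EAR99 is a catch-all designation, not an ECCN
```

A check like this can help flag candidate classifications in an internal inventory, but the actual ECCN determination depends on the item’s technical characteristics against the CCL entries.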

It’s important to note that even if your organization does not appear to be related to one of the 27 specified industries, under other recent US Commerce Department action, your company may still be subject to mandatory export license requirements. The DOC recently added discrete microwave transistors, continuity-of-operation software, post-quantum cryptography, underwater transducers, and air-launch platforms to the list of emerging technologies and designated these items with ECCNs that can trigger DOC export license requirements, outside the CFIUS purview.

The Issues with AI

The Trump Administration is keenly interested in controlling foreign access to AI. Artificial intelligence is increasingly viewed as critical to protecting US security interests because of its possible implications for military and national defense or other security policy. The AI field, which is focused on the capability of a machine to imitate intelligent human behavior, is rising rapidly with the advent of technologies such as driverless cars and autonomous weapons.

Machine learning, a collection of algorithms that can learn from and make predictions based on recorded data, makes up a large part of what drives AI. The accuracy of a machine learning model depends on the quality of that data. Within the IP sector, the number of machine learning related patents is growing because software-based methods and systems are generally patentable.

The data used to train machine learning models may be classified as “technical data” or “technology” under export regulations. The International Traffic in Arms Regulations (ITAR) and the Export Administration Regulations (EAR) generally define export-controlled technical data or technology as information for the design, development, production, manufacture, assembly, operation, repair, testing, maintenance, or modification of certain items, as specified in the applicable control. There are various exclusions from the export regulations for technical data or technology, which may include:

  • Public domain or publicly available information
  • Educational information, including that commonly taught in schools and universities
  • Fundamental research

Technical data or technology that falls into one of these three categories generally does not require a license to export, reexport, transfer, release, or disclose to a foreign person. It is critical to know that any such export, release, or disclosure in the United States to a foreign person, such as your employee, contractor, or agent, will be deemed by the US government an export to that foreign person’s country of nationality or, in certain cases, country of birth. These are commonly called “deemed exports.” Such export control license requirements are completely outside the scope of CFIUS’ purview.

Takeaways for AI Companies

The export controls DOC has proposed could end up hindering American AI technology development, because the open availability and free exchange of AI training data among employees and contractors is essential to researchers making strides in the field. As more AI becomes subject to new and stricter export controls, the need to consider obtaining an export license to carry on “routine” AI work will be critical. However, DOC has also stated that it will not expand its jurisdiction over what it considers “fundamental research.”

The current policies assume that differentiating between commercial and military AI applications is easy, when in reality there is plenty of overlap between the two. For example, iPhone users can unlock their phones with facial recognition technology; that same technology could be used to target weapons. As regulations continue to roll out assigning new ECCNs to AI and other emerging technologies essential to US national security, it will be important for lawmakers to consider how the AI export controls are implemented so as not to hinder innovation.

Companies must be aware that while an export is generally considered to be materials, information, and technology that leave the US, a deemed export is something that may be occurring frequently under the company’s nose. If deemed exports of newly controlled AI are not properly licensed for release or disclosure to foreign persons, export control violations will likely occur and may carry severe penalties. Items and technology that are controlled under either the EAR or ITAR will also be considered critical technology by CFIUS for both voluntary and mandatory CFIUS purposes.

Companies looking to get ahead of the potential deemed export control implications, or seeking investment from foreign investors, should determine the ECCN of their AI, software, and other technology items. While there is still uncertainty in what will be implemented for CFIUS review, knowing these classifications will make it easier to understand what export licenses may be required in the future.

Reprinted with permission from the August 8, 2019, edition of Legaltech News.

© 2019 ALM Media Properties, LLC. All rights reserved.

Further duplication without permission is prohibited. ALMReprints.com – 877.257.3382 - [email protected].

This Legal Guide for Direct Brands provides a concise overview of legal issues often confronted by companies, particularly direct brands. Direct brands are companies characterized by their direct connections to consumers and are disrupting the business model of market-leading companies. This guide covers:

  • Why founder and equity agreements are vital for direct brands and the documents necessary for such investment transactions.
  • How commercial and intellectual property issues and privacy policies and terms can ensure companies retain all rights in the intellectual property of their businesses.
  • The essentials of privacy, advertising, and marketing law for direct brands.
  • The various types of product liability claims, legal defenses for product liability, and potential product liability claims for software that direct brands face.

This guide is intended to help companies spot issues that should be discussed with counsel and is not a substitute for legal advice.

