Week of December 6, 2021
December 13, 2021
Want to receive these weekly privacy recaps in your inbox? Sign up for our privacy newsletter, A Little Privacy, Please.
The Ohio House Government Oversight Committee held a fourth hearing on the Ohio Personal Privacy Act (OPPA). The committee heard testimony from a class action privacy litigator, who opposed the bill, criticizing its carve-outs (including an exemption for personal health information and an exclusion for information gathered to improve services a user has requested) and its language limiting enforcement to the attorney general.
WHY IT MATTERS
Introduced in July 2021 with strong support from Ohio Governor Mike DeWine, OPPA is one of a handful of U.S. state privacy bills still active in 2021. The bill was scheduled for a possible committee vote on December 9, which did not happen.
The Ohio House isn’t scheduled to meet until mid-January, but legislation carries over through the 2021-2022 session. Borrowing several aspects from California’s CCPA, OPPA would provide consumers with rights to know, access, delete, and opt out of the sale of their personal data, and would impose certain transparency obligations on businesses.
The bill would not include a private right of action, instead establishing the Attorney General as the sole entity authorized to enforce its requirements.
One aspect of the bill not seen in existing U.S. privacy laws is the establishment of an affirmative defense for companies that comply with a written privacy program conforming to the NIST Privacy Framework.
The Deceptive Experiences to Online Users Reduction (DETOUR) Act, sponsored by three Democratic and three Republican U.S. Senators, was reintroduced after dying in committee in the 2019-2020 session.
The DETOUR Act would prohibit online services with more than 100 million authenticated users from designing user interfaces that obscure user choice in order to obtain consent or user data, or that cultivate compulsive usage by children under 13.
It also includes transparency and consent requirements for segmenting consumers for behavioral research. Violations of the Act would be enforceable by the Federal Trade Commission as unfair or deceptive acts or practices under the FTC Act.
WHY IT MATTERS
The FTC has already been focusing on “dark patterns,” holding a workshop on the topic in April 2021 and identifying manipulation of user interfaces, including dark patterns, as part of its resolution to prioritize deceptive and manipulative conduct on the internet as a key enforcement area for 2021.
The DETOUR Act would solidify dark patterns as enforceable under the FTC Act, but it appears that the FTC may have already adopted that interpretation without the legislation.
The U.S. Senate Finance Subcommittee on Fiscal Responsibility and Economic Growth held a hearing on Promoting Competition, Growth, and Privacy Protection in the Technology Sector. Much of the hearing centered on the data broker industry, in particular the potential intentional and unintentional harms it creates and what limitations should be placed on the collection and sale of data to address those harms. The subcommittee heard testimony from fellows at Duke University and Yale Law School, who study the data broker industry and cross-border data flows with China, respectively, as well as Senior Counsel with the Future of Privacy Forum (FPF).
Testimony highlighted that persistent, precise location information collected over time, even if anonymized, pseudonymized, or de-identified, can readily be linked back to individuals; that large data sets collected for benign or even altruistic purposes can be amassed and purchased by foreign governments to run disinformation campaigns or to infer sensitive information like mental health conditions; and that it is becoming increasingly untenable to separate commercial uses of data from law enforcement and national security uses.
Recommendations included: restrictions on the sale of data to foreign companies, citizens, and governments; strict controls or outright bans on the sale of data in sensitive categories (including inferences); stronger cybersecurity standards; accountability measures such as transparency, risk assessments, and auditing; restrictions on the sale of information to law enforcement agencies; sectoral legislation for uniquely high-risk technologies such as biometrics and facial recognition; and updates to existing federal laws such as the Fair Credit Reporting Act to more effectively cover emerging uses of data.
WHY IT MATTERS
One common theme across the testimony was that the issues are incredibly nuanced and still evolving, so simply copying and pasting existing privacy legislation would not suffice. As Senator Cassidy pointed out in the hearing, big data can be very helpful, but there are nuances between using data appropriately and using it nefariously, and getting those nuances right is important to crafting effective legislation.
U.S. Senators Coons, Portman and Klobuchar introduced the Platform Accountability and Transparency Act (PATA), which would require social media platforms to make data available to independent researchers, under certain conditions, and would authorize the FTC to establish regular reporting and disclosure requirements regarding platform advertising (including advertisers, ad content, metrics, and targeting criteria), algorithms, content moderation, and violating content.
WHY IT MATTERS
The bipartisan legislation is positioned by the Senators as a way to hold social media platforms accountable and to understand what information they have on users and what they do with it. Based on the type of information that would be reported, however, it would also reveal (to independent researchers, the FTC and potentially the public) information about advertisers and ad campaigns that some advertisers may currently consider to be confidential and proprietary.
The Wiesbaden Administrative Court granted an interim injunction prohibiting the RheinMain University of Applied Sciences from using the Cookiebot consent management platform on its website. The Court held that Cookiebot’s collection and transfer of users’ full IP addresses to servers in the United States, without meeting any of the GDPR’s conditions for transfers to a third country, infringed user rights under the GDPR, and that the University, as the controller of the data collected from its website, was responsible.
WHY IT MATTERS
This decision highlights the importance of conducting due diligence on not only a company’s direct processors but also any subprocessors, to understand where data is transferred, in what form, and whether appropriate safeguards are implemented. In this case, the University did not conclude standard contractual clauses with its direct processor, and the standard contractual clauses between the processor and the subprocessor lacked supplementary safeguards.
Canada’s Office of the Privacy Commissioner submitted an Annual Report to Parliament, summarizing the office’s activity under the Privacy Act and the Personal Information Protection and Electronic Documents Act and highlighting gaps and concerns in Canadian privacy legislation. The Commissioner’s Office notes that its ultimate objective of restoring Canadians’ trust in government and the digital economy will remain out of reach until the government enacts new federal laws that appropriately protect privacy rights in Canada.
The report suggested that permissible uses of data should be better defined, that a rights-based framework should be adopted (rather than placing rights and commercial interests on the same footing), that organizational accountability should be objective and demonstrable, that privacy principles should be common to public and private sectors, that the laws should be interoperable, both internationally and domestically, and that enforcement mechanisms should be quick and effective.
WHY IT MATTERS
Quebec passed Bill 64 in September 2021, which will come into force in September 2023, but Canada (despite efforts) failed to pass national privacy legislation this year. The annual report applauds the rights-based approach and accountability mechanisms adopted in Quebec and notes that a number of elements of Bill 64 are consistent with the law reform proposals the Commissioner’s office has put forward. Bill 64 may therefore set a precedent for legislation in other parts of Canada or at the national level.
Bill 64 requires the development of data governance processes, data management policies, technical solutions for the transfer and de-indexing of data, and internal guidelines. It also requires specific, express consent in certain circumstances, such as for processing sensitive personal information like medical or biometric information.
Want more of the privacy highlights that matter to adtech and martech? Sign up for our privacy newsletter, A Little Privacy, Please.
A Little Privacy, Please weekly recaps are provided for general, informational purposes only, do not constitute legal advice, and should not be relied upon for legal decision-making. Please consult an attorney to determine how legal updates may impact you or your business.