13 key takeaways from recent FTC health data cases
July 31, 2023
Want to receive these weekly privacy recaps in your inbox? Sign up for our privacy newsletter, A Little Privacy, Please.
FTC Blog Post Highlights “Takeaways” From Health Data Cases
First, health data is broader than medications, procedures and diagnoses and includes “anything that conveys information — or enables an inference — about a consumer’s health”, which may be inferred from the fact that a consumer is using a particular health-related app or how they interact with that app.
Second, the need for privacy-by-design for health information should be a given.
Third, use of sensitive data for marketing or advertising purposes (including sending the data to third parties via tracking pixels or SDKs) may run afoul of the FTC Act if it conflicts with privacy promises or if the company fails to get affirmative express consent for such disclosure.
Fifth, a key to compliance is to understand all of your data flows to ensure promises to consumers are consistent with, and keep up over time with, actual practices (and it’s not an excuse that technical staff uses pixels or SDKs without letting compliance folks know).
Sixth, be careful making loose claims like “HIPAA Compliant” or “HIPAA Secure” (even if given a seal by purported certifiers); only the Department of Health & Human Services’ Office for Civil Rights can make that determination.
Seventh, companies offering those certifications can get in trouble too.
Eighth, reserving the right to make changes to your health data practices doesn’t constitute lawful consent to material retroactive changes.
Ninth, disclosures leading to consent should be clear, conspicuous and prominent (e.g., front-and-center on the homepage, saying “we share your health information with third-party advertising companies so that we can target you with ads”) rather than ambiguous language buried in dense privacy policies.
Tenth, deception can be about what you don’t say, not just about what you do say, so it’s crucial to disclose all material information to consumers about how you’re using and disclosing sensitive health information.
Eleventh, sensitive data protected by the FTC Act includes biometric data, such as voice data, video data and DNA information.
Twelfth, reproductive information is of crucial importance to the FTC, and half measures to protect and secure it won’t cut it.
Thirteenth, violating these principles can be expensive and lead to penalties beyond fines, such as bans on disclosing health data for advertising purposes or orders to delete all previously collected information.
Although this is just a blog post, it may be useful insight into areas from recent cases that the FTC wants companies to pay particular attention to and potentially a precursor to future enforcement.
Federal Kids Privacy Legislation Progresses
The U.S. Senate Commerce Committee unanimously passed amended versions of two children’s privacy bills: the Kids Online Safety Act (aka KOSA) (S. 1409) and the Children and Teens Online Privacy Protection Act (aka COPPA 2.0) (S. 1418). Both bills will be reported to the full Senate.
If passed in its current form, COPPA 2.0 would expand the existing Children’s Online Privacy Protection Act (COPPA).
For example, COPPA 2.0 extends protections to children under 17 and broadens the scope of covered platforms to include those directed to children or with “actual knowledge or knowledge fairly implied on the basis of objective circumstances”.
KOSA would require covered platforms to filter certain content made available to children under 17.
Norway DPA Issues New Guidance on Use of Tracking Tools
The Norwegian Data Protection Authority (Datatilsynet) issued new “Advice for website analytics and tracking” to help companies assess whether the tools they use are in compliance with Norwegian privacy rules.
The guidance includes the following noteworthy reminders:
- An IP address will in itself be considered personal data, and a cookie ID, location data or detailed information about user devices can also constitute personal data that should be treated in accordance with the GDPR;
- It is not permitted to use an analysis tool that collects personal information you don’t use or that stores personal information longer than necessary;
- A cookie banner that only meets the requirements of the Norwegian cookie regulations (and not also the GDPR) is not sufficient, and, if consent is relied upon, the requirements for valid consent must be met (e.g., active action, as easy to say no as to say yes, no negative consequences if consent isn’t provided, clear disclosures such that users understand what they’re consenting to, easy withdrawal of consent, and consent on a purpose-by-purpose basis);
- Ensure that either the tool only processes data on your behalf or, if not, you are carefully assessing any further use to ensure it is compliant;
- Either avoid using analytics tools on websites that may reveal special categories of information (which may be revealed from mere visitor behavior on certain websites) or take extra precautions to ensure the additional steps required for special categories of data are taken;
- Check whether data may be transferred to (including remote access from) a country outside the EU/EEA and whether that country has a sufficient level of protection;
- Provide honest and easily understandable information about how you process personal data;
- Respect user rights (e.g., to access or delete); and
- Understand and comply with all other duties under the GDPR.
This guidance was released in conjunction with a post revealing that, although the problem with use of Google Analytics stemming from the transfer of personal data to American businesses “seems to have been solved”, “there may be other privacy challenges with the tool”, and companies are still responsible for ensuring the use of the tools they select comply with privacy rules.
Meta Subs Fined Under Australian Law For Misleading Data Practices
The Australian Federal Court ordered two Meta subsidiaries, Facebook Israel and Onavo (a free virtual private network service), to pay AU$10 million each.
The order is based on a declaration that the companies failed to adequately disclose that user data, including internet and app activity, collected in anonymised and aggregated form from the Onavo app, would be used for purposes other than providing the app—including sharing with parent company Meta for market research activity and other commercial benefits.
The court held that this conduct was liable to mislead the public and therefore a violation of Australian Consumer Law.
Part of the issue appears to be based on how Facebook Israel and Onavo promoted the app, making statements such as “Use a free, fast and secure VPN to protect personal information” and “Helps Keep You and Your Data Safe”, without disclosing that the app would share data with Meta.
Want more of the privacy highlights that matter to adtech and martech? Sign up for our privacy newsletter, A Little Privacy, Please.
A Little Privacy, Please weekly recaps are provided for general, informational purposes only, do not constitute legal advice, and should not be relied upon for legal decision-making. Please consult an attorney to determine how legal updates may impact you or your business.