Want to receive these weekly privacy recaps in your inbox? Sign up for our privacy newsletter, A Little Privacy, Please.
UK CHILDREN’S CODE GOES INTO EFFECT
The UK Age Appropriate Design Code (commonly referred to as the “Children’s Code”) came into force this week after a 12-month transition period. The code applies to providers of online products or services that process personal data and are likely to be accessed by children under age 18 in the UK, even if the service is not aimed at children. It requires companies to meet certain standards of age-appropriate design, including establishing (with a level of certainty appropriate to the risk) the age range of individual users and tailoring protections and safeguards accordingly.
WHY THIS MATTERS
The UK code sets a much broader standard than the Children’s Online Privacy Protection Act in the United States, which applies only to online services “directed to children” under the age of 13. Given the increased focus on children’s safeguards in the United States and around the world, the UK code has already influenced, and will likely continue to influence, legislation beyond the UK.
NEW INTERACTIVE REFERENCE TOOL FROM THE CNIL
France’s data protection authority, the CNIL, released an interactive map displaying the privacy laws, data protection authority and adequacy status of each country around the world.
WHY THIS MATTERS
This will be a useful reference for privacy professionals keeping track of the ever-expanding body of international privacy laws.
BRAZIL CONSTITUTIONAL AMENDMENT
Brazil’s Chamber of Deputies approved a Senate-proposed constitutional amendment that would make the protection of personal data a fundamental right. The approved version removed the creation of a data protection regulatory agency, as originally proposed by the Senate. The final version now returns to the Senate for approval.
WHY THIS MATTERS
If approved, the government will be responsible under the Constitution for organizing and supervising the protection of personal data.
CHINA ALGORITHMIC RECOMMENDATION MANAGEMENT PROVISIONS
China’s Cyberspace Administration released for public comment draft “Internet Information Service Algorithmic Recommendation Management Provisions” to standardize algorithmic recommendation activities. Among other requirements and restrictions, the provisions would prohibit setting up discriminatory, biased or otherwise harmful user tags for content recommendation and using algorithms to apply unreasonably differentiated pricing or trade conditions to consumers. They would also require measures to optimize transparency, avoid harmful influence on users, and allow users to choose, revise or delete user tags.
WHY THIS MATTERS
This adds to the increased focus we’ve seen recently around the world regarding bias, discrimination and other potentially harmful impacts from the use of AI and algorithmic decision-making in content and advertising delivery (see, for example, Hong Kong’s Guidance on Ethical Development and Use of AI and FTC Commissioner Rebecca Kelly Slaughter’s whitepaper on algorithmic decision-making, both published last month).
SOUTH KOREA ANNUAL REPORT ON PERSONAL INFORMATION CASES
South Korea’s Personal Information Protection Commission (PIPC) released an annual report revealing that personal information cases and dispute mediations rose 19.8% and 22.4%, respectively, in 2020 over 2019, “proving that the use of personal information and its protection are becoming increasingly important in the digital age.”
WHY THIS MATTERS
South Korea has a strict, comprehensive privacy regime that is heavily enforced. Last August, the PIPC’s status was elevated to the sole supervisory authority responsible for the country’s Personal Information Protection Act, which was previously enforced by several agencies.
BRAZIL’S LGPD APPLICATION TO SMALL BUSINESSES
Brazil’s National Data Protection Authority issued a draft resolution and opened a public comment period regarding application of the General Law of Personal Data Protection (LGPD) to micro and small businesses and startups, which would have “simplified and differentiated” procedures under the Law. Certain “high risk” processing activities, such as the processing of sensitive or children’s data, would be excluded from the differentiated treatment.
WHY THIS MATTERS
This proposal, along with the DPA’s pledge to take a responsive approach to organizations failing to comply (as reported by ZDNet), signals a potentially balanced application of the law, one that takes into account the feasibility and willingness of a company to comply.
Want more of the privacy highlights that matter to adtech and martech? Sign up for our privacy newsletter, A Little Privacy, Please.
A Little Privacy, Please weekly recaps are provided for general, informational purposes only, do not constitute legal advice, and should not be relied upon for legal decision-making. Please consult an attorney to determine how legal updates may impact you or your business.