
1. FTC Bans Rite Aid from Using AI Facial Recognition Without Reasonable Safeguards
2. Breaking Down the Privacy Act Review Report #3: Removal of the Small Business Exemption
3. New Privacy Enforcement Act commences in Australia
4. Attorney-General Mark Dreyfus pledges sweeping data privacy reforms
5. New concerns over China’s ability to access user data on WeChat
6. Queen’s speech heralds UK GDPR overhaul
7. What is Required under The PIPL: A PRC-Based Representative or a Personal Information Protection Officer?
8. EU-Republic of Korea Adequacy Decisions Finalized
9. Mask Off: Social Media Giants to Unmask Trolls or Risk Themselves Becoming Liable for Defamation Payouts
10. Privacy Pandemic: Australians Losing Trust in Institutions’ Use of Their Data

FTC Bans Rite Aid from Using AI Facial Recognition Without Reasonable Safeguards

By Whitney E. McCollum and Eric F. Vicente Flores

The Federal Trade Commission (FTC) issued a first-of-its-kind proposed order prohibiting Rite Aid Corporation from using facial recognition technology for surveillance purposes for five years.

The FTC alleged that Rite Aid’s facial recognition technology generated thousands of false-positive matches that incorrectly indicated a consumer matched the identity of an individual who was suspected or accused of wrongdoing. The FTC alleged that false-positive matches were more likely to occur in Rite Aid stores located in “plurality-Black”, “plurality-Asian” and “plurality-Latino” areas. Additionally, Rite Aid allegedly failed to take reasonable measures to prevent harm to consumers when deploying its facial recognition technology. Reasonable measures include: inquiring about the accuracy of the technology before using it; preventing the use of low-quality images; training or overseeing employees tasked with operating the facial recognition technology; and implementing procedures for tracking the rate of false-positive matches.

Read More

Breaking Down the Privacy Act Review Report #3: Removal of the Small Business Exemption

By Cameron Abbott, Rob Pulham and Stephanie Mayhew

Currently, most small businesses (with some exceptions) are not covered by the Privacy Act – with the threshold defining a small business being an annual turnover of $3 million or less. However, the Attorney-General’s Department recognises that Australians want their privacy protected and that small businesses shouldn’t be exempt from this.

In the long term, proposal 6.1 seeks to remove the small business exemption but only after:

  • an impact analysis has been undertaken
  • appropriate support is developed
  • in consultation with small businesses, the most appropriate way for small businesses to meet their obligations is determined (proportionate to the risk) – e.g. through a code, and
  • small businesses are in a position to comply with these obligations.

Proposal 6.2, in the shorter term, seeks to ensure that small businesses comply with the Privacy Act in relation to the collection of biometric information, and to remove the exemption from the Privacy Act for small businesses that obtain consent to trade in personal information (trading in personal information will mean the Privacy Act applies).

Read More

New Privacy Enforcement Act commences in Australia

By Cameron Abbott, Rob Pulham and Stephanie Mayhew

As of yesterday, the Privacy Legislation Amendment (Enforcement and Other Measures) Act 2022 (Privacy Enforcement Act) is in effect, having received Royal Assent on 12 December 2022.

As we have previously shared, the Privacy Enforcement Act increases the maximum penalties for serious or repeated privacy breaches. For bodies corporate/organisations, this increases the penalty from the current $2.22 million to whichever is the greater of:

Read More

Attorney-General Mark Dreyfus pledges sweeping data privacy reforms

By Cameron Abbott, Rob Pulham and Hugo Chow

Newly sworn-in Attorney-General Mark Dreyfus has announced that a range of “sweeping reforms” needs to be made to Australia’s privacy laws, and that he is committed to making these changes during the government’s first term in parliament.

Mr Dreyfus’ department is currently reviewing the feedback it has received from its discussion paper around the current review of the Privacy Act 1988 (Cth) (Privacy Act). Mr Dreyfus said that “Everyone agrees that the Commonwealth Privacy Act is out of date and in need of reform for the digital age”, and that he is hoping to bring a final report of reform proposals into the public domain in the coming months.

Privacy practitioners have for years been anticipating some level of reform as the winds of change have been blowing, but it has not been easy to predict what may change, or when. Proposed changes include strengthening individuals’ privacy rights, including creating a direct cause of action or statutory right for breaches of privacy laws; introducing specific codes for certain industries; and increasing maximum penalties, which are significantly out of step with international jurisdictions and with other key Australian business laws.

However, such changes are not likely to be welcomed by all, even if “everyone agrees” the Privacy Act is out of date and in need of reform, with business groups opposing areas of proposed reform such as allowing individuals to bring claims directly against companies.

It is a fascinating precursor to what may become hotly contested reforms with significant impact on how businesses engage with their customers. It may be hard to tell but privacy nerds are on the edge of our seats as the reforms, much talked about, move a step closer to taking shape. There’s never been a better time to start paying attention.

New concerns over China’s ability to access user data on WeChat

By Cameron Abbott and Hugo Chow

A recent report by cybersecurity firm Internet 2.0 has raised concerns about the Chinese Communist Party’s ability to access the data of millions of users of the social media and payment application WeChat around the world.

WeChat is significant as it is the application that nearly all citizens in China use on a daily basis for communication, payments for services and connecting through social media. Although the majority of WeChat’s more than 1 billion users are located in China, there are approximately 600,000 users in Australia, 1.3 million users in the UK, and 1.5 million users in the United States.

One of the concerns the report outlines is that although WeChat states that its servers are kept outside mainland China, all user data that WeChat logs and posts to its logging server goes directly to Hong Kong. The report argues that under Hong Kong’s new National Security Legislation, there is little difference between servers located in Hong Kong and servers in mainland China.

As a result, because China’s National Intelligence Law requires organisations and citizens to “support, assist and cooperate with the state intelligence work”, there are concerns that the WeChat logging data sent to servers in Hong Kong may be accessed by the Chinese Government upon request. The report states that the data sent to Hong Kong is log data, which includes the user’s mobile network, device information, GPS information, phone ID and the device’s operating system version, but does not include information such as the content of a conversation.

Another concern the report outlines is that although there was no evidence that chats were stored outside the user’s device, the report found that WeChat had the potential to access all the data in a user’s clipboard. This means WeChat could potentially access any data that users copy and paste, which is a risk for people using password managers that rely on the clipboard to copy and paste their passwords.

We expect to hear more about these sorts of concerns from a range of jurisdictions.

Queen’s speech heralds UK GDPR overhaul

By Claude-Étienne Armingaud and Nóirín McFadden

In the Queen’s speech at the state opening of parliament on 10 May 2022, the UK government announced its intention to change the UK’s data protection regime in a new Data Reform Bill. This follows a consultation last autumn on how the UK GDPR could be reformed after the UK’s exit from the European Union (EU).

The government claims that the new Bill would:

  • Create a data protection framework focused on “privacy outcomes” that would reduce the burdens on businesses, and a “clearer regulatory environment” to encourage “responsible innovation”.
  • Ensure that citizens’ data is “protected to a gold standard”, while enabling more efficient sharing of data between public bodies.
  • Modernise the Information Commissioner’s Office and require it to be “more accountable to Parliament and the public”.

The Queen’s speech also announced plans to replace the Human Rights Act 1998, which incorporated the European Convention on Human Rights into UK law. According to the government a new “Bill of Rights” would “end the abuse of the human rights framework and restore some common sense to [the] justice system”. This would be achieved by “establishing the primacy of UK case law”, which means that UK courts would no longer be required to follow the case law of the European Court of Human Rights.

Taken together, both of these proposed new legislative measures could change the balance of protection of individuals’ rights in the UK, both generally and in the specific area of personal data regulation. Their development will be closely watched by data protection professionals, because any significant changes in the UK data protection regime could prompt the EU to review its post-Brexit UK adequacy decision, potentially leading to the end of decades of seamless transfers of personal data from the EU to the UK.

What is Required under The PIPL: A PRC-Based Representative or a Personal Information Protection Officer?

By Dr. Amigo L. Xie, Xiaotong Wang, Grace Ye and Yibo Wu

Multinational entities with operations in or having businesses with the People’s Republic of China (PRC) should take note of the PRC’s new Personal Information Protection Law (PIPL), which took effect on 1 November 2021 and is extraterritorial in scope and effect. 

This alert lays out the differences between the requirements under Article 52 PIPL (PIPO appointment) and Article 53 PIPL (PRC-based representative appointment / establishment of an agency in the PRC). It also examines statutory obligations under PIPL upon designated personnel and highlights important sector-specific regulations and provincial and municipal government practices.

Click here to read the full alert.

EU-Republic of Korea Adequacy Decisions Finalized

By Claude-Etienne Armingaud, Andrew L. Chung, Camille Scarparo and Eric Yoon

Following the conclusion of the adequacy talks in March 2021, the European Commission adopted, on 17 December 2021, an adequacy decision addressing the transfers of personal data to the Republic of Korea under the General Data Protection Regulation (GDPR) and the Law Enforcement Directive.

Both texts prohibit the transfer of personal data to “third countries” unless (a) the destination country benefits from (i) an adequacy decision or (ii) appropriate safeguards, such as standard contractual clauses (see our alert here) or codes of conduct (see our alert here); or (b) one of the limited derogations under Article 49 GDPR applies.

In the course of the adequacy talks, the Republic of Korea agreed to implement additional safeguards. Accordingly, the August 2020 reform of the Republic of Korea’s data protection framework (the Personal Information Protection Act) implemented several additional safeguards, including transparency provisions and strengthened enforcement powers for the Personal Information Protection Commission (§70).

The Republic of Korea adequacy decision complements the Free Trade Agreement (FTA) of July 2011 and allows a seamless flow of personal data between the Republic of Korea and the European Union.

Unlike the UK adequacy decision, which contains a sunset clause (see our alert here), the Republic of Korea adequacy decision is not limited in time. However, pursuant to Article 45(3) GDPR, the European Commission will carry out a first review of the decision after three years to evaluate any evolution in the Republic of Korea’s data protection framework that could lead to divergence from EU regulations (§220).

The Republic of Korea now belongs to the increasing group of third countries benefiting from an adequacy decision (including, since GDPR’s entry into force, Japan and the UK).

The firm’s global data protection team (including in each of our European offices) remains available to assist you in achieving compliance for your data transfers at a global level.

Mask Off: Social Media Giants to Unmask Trolls or Risk Themselves Becoming Liable for Defamation Payouts

By Cameron Abbott, Rob Pulham, Warwick Andersen, Max Evans and James Gray

In a significant development in online regulatory oversight, the Australian government announced over the weekend that it will introduce new laws handing Australian courts the power to order social media companies to reveal the identities of anonymous trolls or risk themselves being liable for defamation payouts.

The so-called “social media anti-trolling legislation”, which the government has said will be introduced into parliament this week, proposes to require social media companies to stand up a functional and easy-to-use complaints and takedown process for users who, upon suspecting they are being defamed, bullied or attacked, may file a complaint with the social media platform requesting that the relevant content be removed.

If that request is denied, the complainant can ask the social media company to provide the details of the “troll” so as to enable the complainant to commence an action. If this request is further denied, or if the social media platform is “unable to do this”, complainants may apply to obtain a court order requiring the social media company to release the identification details of the anonymous user so that a defamation action may be pursued. Failure to comply with such a court order will render the social media company itself liable for the defamation claim.

Significantly, the reports indicate that these new laws will shift legal responsibility for defamatory content from the author or page manager to the social media company that runs the platform. This represents a key move away from treating social media platforms as mere distributors of content towards, in the eyes of online safety law, deeming them publishers themselves. We will keep you posted as these proposed laws progress.

Privacy Pandemic: Australians Losing Trust in Institutions’ Use of Their Data

By Cameron Abbott, Rob Pulham, Max Evans and James Gray

In the age of QR code check-ins and vaccination certificates, as Australia edges towards a post-pandemic (or mid-pandemic, it increasingly seems) “normal”, new research from the Australian National University (ANU) has revealed that Australians have become less trusting of institutions with regards to data privacy.

The ANU researchers said that the decrease in public trust between May 2020 and August 2021 was small but “statistically significant”. A key reason for this decrease, according to the researchers, was concern around “how their private data from check-in apps might be used by major institutions” as lockdowns and the use of apps for contact tracing intensified.

The institutions which experienced the greatest loss of trust were social media companies (10.1% decline), telecommunications companies, and federal, state and territory governments. This echoes sentiment from the OAIC following its recent ‘community attitudes to privacy’ survey that Australians trust social media companies the least when it comes to handling personal information, followed by the government.

While it remains to be seen whether this loss of trust becomes a permanent trend, one way to make Australians more comfortable with an organisation’s data practices – as reinforced by the OAIC – is to ensure the purpose of the collection and use of personal information is clearly understood. The OAIC has found that Australians are increasingly questioning data practices where the purpose for collecting personal information is unclear.

With increased penalties for privacy non-compliance looming, there’s never been a better time to revisit your privacy policies and collection statements to make sure that these are clear, so your organisation can stand out against this trend and build consumer trust.

Copyright © 2024, K&L Gates LLP. All Rights Reserved.