Category: Legal & Regulatory Risk

1. California’s answer to the GDPR – the California Consumer Privacy Act kicks in on 1 Jan 2020
2. Double-Edged Sword: Cambridge Analytica Whistle-Blower exposes the dual nature of Technology
3. AI (Adverse Inferences): AI Lending Models may show unconscious bias, according to Report
4. You can be anonymised, but you can’t hide
5. The OAIC engages in more in-depth investigations and stronger exercise of its power
6. Canada proposes to increase penalties for tech giants in its Digital Charter
7. Sharing of ‘abhorrent violent material’ now an offence under new laws
8. Consumer Data Right Draft Rules – submissions closing soon
9. Proposal to increase penalties for privacy breaches
10. IoT (internet of things) legislation makes an appearance in the U.S. Senate

California’s answer to the GDPR – the California Consumer Privacy Act kicks in on 1 Jan 2020

By Cameron Abbott, Tan Xin Ya and John ReVeal

In just a few short weeks, a monumental change in privacy regulation will kick in for US businesses. On 1 January 2020, the California Consumer Privacy Act (CCPA) comes into effect, with a compliance deadline at the end of January 2020. The Act signifies a shift in tone in the US privacy sphere: a move closer to global privacy norms, and away from the view that personal data is a company asset.

A series of data disasters such as Facebook’s Cambridge Analytica scandal and the massive Equifax breach left many Americans feeling powerless. Regulators stepped in after the fact to punish the companies, but at the time there was little that U.S. consumers could do to prevent data breaches. Under the CCPA, Americans (well, Californians, mostly) move a step closer to general privacy protection. However, the Act only targets larger companies and those with prolific data use, so there is still a long way to go before protection is truly general.

In October, the California Governor signed five bills amending the CCPA to provide some regulatory relief for businesses when the CCPA comes into effect. For a detailed analysis of the amendments, we refer you to Volume 2 of our colleagues’ The Privacists, available at the K&L Gates Hub.

Double-Edged Sword: Cambridge Analytica Whistle-Blower exposes the dual nature of Technology

By Cameron Abbott, Max Evans and James Gray

In his cautionary tale 1984, author George Orwell described a paradigm in which the unregulated use of powerful technology, the “telescreens”, produced a society beholden to the ethics of its controller. That paradigm is perhaps more real than ever, according to an article by Reuters.

Drawing on the views of Cambridge Analytica whistle-blower Christopher Wylie, the article suggests that the deep, multifaceted involvement of big tech companies in consumers’ lives, the dependence that arises from such involvement, and the overwhelming vulnerability of those consumers render tech companies “too big to fail”. Wylie argues that the vast imbalance of power and information in favour of these companies over their users leaves regulators constantly scrambling to control the rapid adoption of these technologies.

Read More

AI (Adverse Inferences): AI Lending Models may show unconscious bias, according to Report

By Cameron Abbott and Max Evans

We live in an era where the adoption and use of Artificial Intelligence (AI) is at the forefront of business advancement and social progression. Facial recognition software is in use, or being piloted, across a variety of government sectors, whilst voice recognition assistants are becoming the norm in both personal and business contexts. However, as we have previously blogged, the AI ‘bandwagon’ inherently comes with legitimate concerns.

This is no different in the banking world. The use of AI-based phishing detection applications has strengthened cybersecurity safeguards for financial institutions, whilst the use of “Robo-Advisers” and voice and language processors has improved efficiency by increasing the pace of transactions and reducing service times. However, this may be too good to be true: according to a report by CIO Drive, algorithmic lending models may show an unconscious bias.

Read More

You can be anonymised, but you can’t hide

By Cameron Abbott, Michelle Aggromito and Karla Hodgson

If you think there is safety in numbers when it comes to the privacy of your personal information, think again. A recent study in Nature Communications found that, given a large enough dataset, anonymised personal information is only an algorithm away from being re-identified.

Anonymised data refers to data that has been stripped of any identifiable information, such as a name or email address. Under many privacy laws, anonymising data allows organisations and public bodies to use and share information without infringing an individual’s privacy, or having to obtain necessary authorisations or consents to do so.

But what happens when that anonymised data is combined with other data sets?

Read More

The OAIC engages in more in-depth investigations and stronger exercise of its power

By Cameron Abbott, Rob Pulham and Jacqueline Patishman

Following two key incidents concerning how the Commonwealth Bank of Australia (CBA) handled customer data, the OAIC has successfully taken court action binding the banking heavyweight to “substantially improve its privacy practices”.

As a quick summary of the incidents, the first involved the loss of two magnetic storage tapes (which are used to print account statements) containing historical customer data, including statements, for up to 20 million bank customers. In 2016, the CBA was unable to confirm that the two magnetic tapes had been securely disposed of following their scheduled destruction by a supplier.

Read More

Canada proposes to increase penalties for tech giants in its Digital Charter

By Cameron Abbott and Rebecca Gill

The Canadian federal government has proposed introducing fines for companies that violate privacy laws, in an effort to rein in the growing power of Silicon Valley tech giants.

Canada’s Innovation Minister recently announced a 10-point Digital Charter that aims to provide more transparency into how companies collect and use personal information and stronger rights for consumers to consent to the use of their data. Key principles of the Charter include giving Canadians control over their data, promoting ethical use of data, ensuring that the online marketplace is competitive to facilitate growth of Canadian businesses, and implementing “meaningful penalties” for violations of privacy laws.

Read More

Sharing of ‘abhorrent violent material’ now an offence under new laws

By Cameron Abbott, Michelle Aggromito and Rebecca Gill

Governments around the world are imposing more responsibilities on tech providers to deal with online harms. In response to the recent attacks in Christchurch, in which a gunman livestreamed on Facebook his attack on a mosque, the Australian Government recently enacted the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth) (Act). The Act, which commenced on 6 April 2019, was pushed through swiftly and has a broad reach.

Under the Act, internet, content and hosting service providers must refer details of any ‘abhorrent violent material’ to the Australian Federal Police. Abhorrent violent material is audio, visual or audio-visual material that records or streams ‘abhorrent violent conduct’, which includes acts of terrorism, murder, attempted murder, torture, rape and kidnapping.

Read More

Consumer Data Right Draft Rules – submissions closing soon

By Cameron Abbott, Rob Pulham and Rebecca Gill

The deadline for submissions on the ACCC’s draft Competition and Consumer (Consumer Data) Rules 2019 (Draft Rules) is fast approaching. The ACCC is seeking feedback from community organisations, businesses and consumers on the approach and positions of the Draft Rules for the Consumer Data Right (CDR) regime until this Friday, 10 May 2019.

Key aspects of the Draft Rules (which are available on the ACCC’s website) include:

  • the three ways in which CDR data may be requested;
  • the requirements for consent to collect CDR data;
  • rules relating to the accreditation process; and
  • rules relating to the thirteen privacy safeguards for CDR data.
Read More

Proposal to increase penalties for privacy breaches

By Cameron Abbott and Rebecca Gill

In light of concerns over how personal data is being used by social media platforms and tech companies, the Commonwealth Government has proposed amendments to the Privacy Act in order to more harshly penalise companies for privacy breaches. The new regime, which aims to update Australia’s privacy laws in line with increased social media use, will see tougher penalties for all entities that are subject to the Privacy Act, not just the headline companies like Google and Facebook.

The Commonwealth Government proposes to increase the penalty for serious or repeated breaches by such entities from $2.1 million to $10 million, or three times the value of any benefit obtained through the misuse of information, or 10 per cent of a company’s annual domestic turnover – whichever is the greatest.

Read More

IoT (internet of things) legislation makes an appearance in the U.S. Senate

By Cameron Abbott and Ella Richards

For those not familiar with the acronym, IoT or ‘Internet of things’ refers to the interconnection of networked devices and everyday objects for increased control and ease of use.

The US Government has been steadily increasing the number of IoT devices used in day-to-day business. In response to mounting concerns about this, a bipartisan group in the Senate has unveiled legislation that would govern the use of IoT devices in the government context.

Read More

Copyright © 2019, K&L Gates LLP. All Rights Reserved.