Category: Legal & Regulatory Risk

1. AI (Adverse Inferences): AI Lending Models may show unconscious bias, according to Report
2. You can be anonymised, but you can’t hide
3. The OAIC engages in more in-depth investigations and stronger exercise of its power
4. Canada proposes to increase penalties for tech giants in its Digital Charter
5. Sharing of ‘abhorrent violent material’ now an offence under new laws
6. Consumer Data Right Draft Rules – submissions closing soon
7. Proposal to increase penalties for privacy breaches
8. IoT (internet of things) legislation makes an appearance in the U.S. Senate
9. Ratings agency starting to factor in Cyber risk profile
10. To encrypt or not encrypt? That is the question

AI (Adverse Inferences): AI Lending Models may show unconscious bias, according to Report.

By Cameron Abbott and Max Evans

We live in an era where the adoption and use of Artificial Intelligence (AI) is at the forefront of business advancement and social progression. Facial recognition technology is in use, or being piloted for use, across a variety of government sectors, whilst voice recognition assistants are becoming the norm in both personal and business contexts. However, as we have previously blogged, the AI ‘bandwagon’ inherently comes with legitimate concerns.

This is no different in the banking world. The use of AI-based phishing detection applications has strengthened cybersecurity safeguards for financial institutions, whilst the use of “Robo-Advisers” and voice and language processors has improved efficiency by increasing the pace of transactions and reducing service times. However, this may be too good to be true: according to a report by CIO Dive, algorithmic lending models may show an unconscious bias.

Read More

You can be anonymised, but you can’t hide

By Cameron Abbott, Michelle Aggromito and Karla Hodgson

If you think there is safety in numbers when it comes to the privacy of your personal information, think again. A recent study in Nature Communications found that, given a large enough dataset, anonymised personal information is only an algorithm away from being re-identified.

Anonymised data refers to data that has been stripped of any identifiable information, such as a name or email address. Under many privacy laws, anonymising data allows organisations and public bodies to use and share information without infringing an individual’s privacy, or having to obtain necessary authorisations or consents to do so.

But what happens when that anonymised data is combined with other data sets?
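
To see why that question matters, here is a minimal sketch of a linkage attack, using entirely invented data: if an “anonymised” record still carries enough quasi-identifiers (postcode, date of birth, gender), those fields can be matched against a second dataset that still carries names. All names, values and field choices below are hypothetical and purely illustrative.

```python
# Illustrative linkage attack: re-identifying "anonymised" records by joining
# them with a second, name-bearing dataset on shared quasi-identifiers.
# All records below are invented for this sketch.

# "Anonymised" dataset: names removed, but quasi-identifiers retained.
anonymised_records = [
    {"postcode": "3000", "dob": "1985-03-14", "gender": "F", "diagnosis": "asthma"},
    {"postcode": "2010", "dob": "1990-07-02", "gender": "M", "diagnosis": "diabetes"},
]

# Public dataset (e.g. a membership list or social profile export) that still
# carries names alongside the same quasi-identifiers.
public_records = [
    {"name": "Jane Citizen", "postcode": "3000", "dob": "1985-03-14", "gender": "F"},
    {"name": "John Smith", "postcode": "2010", "dob": "1990-07-02", "gender": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "dob", "gender")

def key(record):
    """Build a join key from the quasi-identifiers shared by both datasets."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the public dataset by quasi-identifiers, then link each "anonymised"
# record back to a name wherever the combination matches.
names_by_key = {key(r): r["name"] for r in public_records}
for record in anonymised_records:
    name = names_by_key.get(key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")
```

The larger and richer the auxiliary dataset, the more combinations of quasi-identifiers become unique to a single person, which is the effect the Nature Communications study quantifies.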

Read More

The OAIC engages in more in-depth investigations and stronger exercise of its power

By Cameron Abbott, Rob Pulham and Jacqueline Patishman

Following two key data incidents concerning how the Commonwealth Bank of Australia (CBA) handled data, the OAIC has successfully taken court action binding the banking heavyweight to “substantially improve its privacy practices”.

As a quick summary of the incidents, the first involved the loss of magnetic storage tapes (which are used to print account statements). These contained historical customer data, including statements for up to 20 million bank customers. In 2016, CBA was unable to confirm that the two magnetic tapes had been securely disposed of after their scheduled destruction by a supplier.

Read More

Canada proposes to increase penalties for tech giants in its Digital Charter

By Cameron Abbott and Rebecca Gill

The Canadian federal government has proposed introducing a range of fines for companies that violate privacy laws, in order to rein in the growing power of Silicon Valley tech giants.

Canada’s Innovation Minister recently announced a 10-point Digital Charter that aims to provide more transparency into how companies collect and use personal information and stronger rights for consumers to consent to the use of their data. Key principles of the Charter include giving Canadians control over their data, promoting ethical use of data, ensuring that the online marketplace is competitive to facilitate growth of Canadian businesses, and implementing “meaningful penalties” for violations of privacy laws.

Read More

Sharing of ‘abhorrent violent material’ now an offence under new laws

By Cameron Abbott, Michelle Aggromito and Rebecca Gill

Governments around the world are imposing more responsibilities on tech providers to deal with online harms. In response to the recent attacks in Christchurch, in which a gunman livestreamed on Facebook his attack on a mosque, the Australian Government recently enacted the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth) (Act). The Act, which commenced on 6 April 2019, was pushed through swiftly and has a broad reach.

Under the Act, internet, content and hosting service providers must refer details of any ‘abhorrent violent material’ to the Australian Federal Police. Abhorrent violent material is audio, visual or audio-visual material that records or streams ‘abhorrent violent conduct’, which includes acts of terrorism, murder, attempted murder, torture, rape and kidnapping.

Read More

Consumer Data Right Draft Rules – submissions closing soon

By Cameron Abbott, Rob Pulham and Rebecca Gill

The deadline for submissions on the ACCC’s draft Competition and Consumer (Consumer Data) Rules 2019 (Draft Rules) is fast approaching. The ACCC is seeking feedback from community organisations, businesses and consumers on the approach and positions of the Draft Rules for the Consumer Data Right (CDR) regime until this Friday, 10 May 2019.

Key aspects of the Draft Rules (which are available on the ACCC’s website) include:

  • the three ways in which CDR data may be requested;
  • the requirements for consent to collect CDR data;
  • rules relating to the accreditation process; and
  • rules relating to the thirteen privacy safeguards for CDR data.

Read More

Proposal to increase penalties for privacy breaches

By Cameron Abbott and Rebecca Gill

In light of concerns over how personal data is being used by social media platforms and tech companies, the Commonwealth Government has proposed amendments to the Privacy Act in order to more harshly penalise companies for privacy breaches. The new regime, which aims to update Australia’s privacy laws in line with increased social media use, will see tougher penalties for all entities that are subject to the Privacy Act, not just the headline companies like Google and Facebook.

The Commonwealth Government proposes to increase the penalties for serious or repeated breaches by such entities from $2.1 million to $10 million, or three times the value of any benefit obtained through the misuse of information, or 10 per cent of a company’s annual domestic turnover – whichever is the greater value.
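
Purely for illustration, the “whichever is the greater value” comparison amounts to taking the maximum of the three figures. A minimal Python sketch with invented numbers, not a representation of how a regulator or court would actually assess a penalty:

```python
# Illustrative calculation of the proposed maximum penalty: the greater of
# AUD 10 million, three times the benefit obtained through the misuse of
# information, or 10% of annual domestic turnover. All figures are invented.

def proposed_max_penalty(benefit_from_misuse: float, annual_domestic_turnover: float) -> float:
    return max(
        10_000_000,                       # flat AUD 10 million figure
        3 * benefit_from_misuse,          # three times the benefit obtained
        0.10 * annual_domestic_turnover,  # 10% of annual domestic turnover
    )

# Example: a company with AUD 500 million domestic turnover that gained
# AUD 2 million from the misuse would face a maximum of AUD 50 million
# (10% of turnover is the greatest of the three values here).
print(proposed_max_penalty(benefit_from_misuse=2_000_000,
                           annual_domestic_turnover=500_000_000))
```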

Read More

IoT (internet of things) legislation makes an appearance in the U.S. Senate

By Cameron Abbott and Ella Richards

For those who are not familiar with the acronym, IoT, or the ‘Internet of Things’, refers to the interconnection of networked devices and everyday objects for increased control and ease of use.

The US Government has been steadily increasing the number of IoT devices used in day-to-day business. In response to mounting concerns surrounding this, a bipartisan group in the Senate has unveiled legislation that would govern the use of IoT devices in the government context.

Read More

Ratings agency starting to factor in Cyber risk profile

By Cameron Abbott and Wendy Mansell

A recent report released by Moody’s Investors Service has shed some light on which business sectors are most at risk of cyberattacks.

After assessing 35 broad sectors, Moody’s concluded that banks, hospitals, security firms and market infrastructure providers face the highest risk, based on their level of vulnerability and the potential impact an attack would have.

The key determinative factor for these sectors is that they all rely heavily on technology and on confidential information in their operations.

The financial repercussions following a cyberattack in each of these sectors are extremely significant, considering the costs of insurance, penalties, consumer impact, potential litigation, R&D and technological impact, to name a few.

The financial market is particularly high risk because of the financial and commercial data it holds, and because its services are increasingly offered digitally across multiple platforms, such as mobile banking and smart watch apps.

Similarly, because medical records are primarily collected and held in electronic form, hospitals are very attractive to hackers given the sensitive nature of the data.

While these industries should come as no shock to the reader, it is important for participants in those industries, and for their suppliers, to recognise the risk profile that attaches to them and to have procedures in place that reflect those risk levels. How one manages these risks is now likely to have indirect cost implications, as ratings agencies like Moody’s begin assessing these sorts of areas.

To encrypt or not encrypt? That is the question

By Cameron Abbott and Ella Richards

In response to the new controversial anti-encryption laws, Australian tech heavyweights have banded together to kick and scream over the restrictive implications the laws are already having on their industry.

A quick history lesson: the Assistance and Access Bill permits law enforcement to demand that companies running applications such as WhatsApp allow “lawful access to information”. This can be achieved either by decrypting encrypted technology or by providing access to communications before they are encrypted. These ‘backdoors’ are intended to give the good guys the opportunity to fight serious crime, but there is serious fear that, in reality, these doors could throw out privacy or let in unwanted guests.

While the legislation states that backdoors should only be created if they do not result in any ‘systemic weakness’, that term is yet to be defined in a concrete and informative way. Industry points out that, once created, any such measure has the potential to be exploited by others. There is no such thing as a “once-only” back door.

There is little doubt that this will end up in litigation as larger industry players challenge the abstract concepts in the legislation against the reality of their technology.

StartupAUS, an industry group of tech executives, has made several recommendations to amend the legislation. Even though they are not holding their breath for any significant changes, they are demanding more transparency around the requirements. Their recommendations include scrapping the requirement for an employee to build capabilities to intercept communications, tightening the scope of ‘designated communication providers’, providing oversight of how companies will be targeted, and raising the threshold for what constitutes a ‘serious offence’.

Australia’s legislative response to the problem faced by law enforcement is one of the most heavy-handed in the democratic world, and technology companies, with their significant impact on our economy, are now watching the latest debate on reforms with great concern.

Copyright © 2024, K&L Gates LLP. All Rights Reserved.