Keeping Pace: The First Tranche of Australian Privacy Reforms versus the GDPR Regimes
In September 2024, Australia introduced into Parliament the Privacy and Other Legislation Amendment Bill 2024 (Cth) (Privacy Bill) which would enact the first tranche of long-awaited reforms to the Privacy Act 1988 (Cth) (Privacy Act). These reforms were recommended as part of the Privacy Act Review Report released in 2022 (PAR Report).
For investors and corporates with interests across Australia, the EU, and the UK, this post addresses the reforms proposed in the Privacy Bill, and how and to what extent the changes will align the Privacy Act more closely with the EU General Data Protection Regulation (EU GDPR) and its essentially similar UK equivalent (UK GDPR, and together with the EU GDPR, GDPR).
The highly anticipated Privacy Bill was introduced to Parliament by the Attorney-General on 12 September 2024. It seeks to implement 23 of the 25 legislative proposals that were agreed to by the Australian Government in its response to the PAR Report dated 28 September 2023 (Government Response).
The Privacy Bill seeks to provide stronger privacy protections with respect to individuals' personal information and, among other things, covers the following:
- requires information in privacy policies regarding certain automated decision making;
- facilitates overseas data flows;
- clarifies that the steps an entity needs to take to protect personal information includes the implementation of technical and organisational measures;
- facilitates sharing of personal information to reduce the risk of harm to individuals where there is an eligible data breach;
- introduces new civil penalty enforcement powers;
- requires the Office of the Australian Information Commissioner (OAIC) to develop a children's online privacy code for services that are likely to be accessed by children;
- provides for the development of new Australian Privacy Principles (APPs);
- introduces a new statutory tort for serious invasions of privacy, allowing individuals to bring actions in Court; and
- introduces targeted criminal offences for doxxing.
Many of the Privacy Bill's changes were influenced by international examples, including the GDPR. Whilst some commentators have flagged that the Privacy Bill takes a step towards 'harmonisation' with the GDPR, on closer examination the regime in the Privacy Bill (and the Privacy Act) is foundationally dissimilar to the GDPR and is perhaps unlikely ever to be completely 'harmonised' with the GDPR standard. The Privacy Act is designed to be principles-based, flexible and indirectly aligned with human rights law, whilst the GDPR, although also principles-based, is prescriptive, codified, and strongly aligned with human rights law. The Privacy Bill does not, for example, introduce "accountability" requirements equivalent to those in the GDPR – e.g., requirements to maintain records of processing operations, to appoint data protection officers or to conduct privacy impact assessments.
In this post, we discuss the Privacy Bill and comparisons against the GDPR, including which amendments may be influenced by the GDPR.
1. Automated decision-making
The Privacy Bill requires that where an APP entity uses a computer program to make a decision which could “reasonably be expected to significantly affect the rights or interests” of an individual and the individual’s personal information is used to make the decision, the privacy policy must be updated to contain certain information. Privacy policies will need to include information regarding:
The Privacy Bill gives the following examples of decisions that may affect the rights or interests of an individual:
APP entities that use automated decision making to make decisions that could reasonably be expected to significantly affect the rights or interests of an individual should consider whether they need to update their privacy policies to disclose this. High-level examples were included in the Privacy Bill in response to criticism that the GDPR concept of 'legal or similarly significant effect' was uncertain – although in practice it has not proved a difficult concept to work with under the GDPR. However, the Privacy Bill's approach of disclosing this information in a privacy policy places the ultimate responsibility on individuals to manage their privacy by reading and understanding the risks of ADM. The GDPR requirements (discussed below) place more of the burden on controllers to manage their use of ADM, including conducting risk assessments. The PAR Report also sought to address a risk it identified in the GDPR approach – under which the threshold is reached only where decisions are made solely on the basis of automated processing – by instead requiring a substantial and direct link between the computer program and the decision. This captures a broader range of automated decisions, but it also avoids the incentive to engineer token human involvement in ADM in order to escape regulatory requirements, which could arguably occur under the GDPR system.
The GDPR includes information-provision obligations which, although slightly less specific than the Privacy Bill in relation to automated decision-making, are similar in substance. However, the GDPR also imposes other restrictions on automated decision-making which go well beyond the proposed requirements of the Privacy Bill. The GDPR (article 22) specifically regulates decision-making based solely on automated processing – without relevant human involvement – which produces a legal effect concerning the individual or similarly significantly affects them. The use of such techniques is prohibited unless it is:
- necessary for entering into, or performing, a contract between the individual and the controller;
- authorised by EU or member state (or, under the UK GDPR, UK) law to which the controller is subject; or
- based on the individual's explicit consent.
In most circumstances, a controller proposing to use ADM techniques will also be obliged by the GDPR to conduct a prior risk assessment. Even stricter requirements apply if the decision-making relies on data in certain particularly sensitive categories (for example health data). It appears that the scope of the proposed Australian requirements will be broader than that of the GDPR regime, because their application does not depend on decisions being made solely on the basis of automated processing. On the other hand, the proposed Australian requirements are considerably less onerous. It is worth noting that the UK, which is of course no longer within the European Union, recently proposed to reduce the burden imposed by the UK GDPR to a level closer to that proposed in the Privacy Bill – the transparency requirements would still apply, but the qualified prohibition would only apply to ADM involving the use of health or other sensitive personal data. This proposal fell away with the UK's recent change in government, however, and it is not yet clear whether it will be revived.
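As a practical illustration of the different triggers described above, the sketch below (in Python, using hypothetical field and function names of our own) shows how an organisation might record, for each automated decision, whether personal information is used, whether a human is meaningfully involved, and whether the decision significantly affects the individual, and then work out which regime's requirements are engaged. It is a simplified sketch of the thresholds discussed in this section, not a statement of either law.

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    # Hypothetical compliance record for a single automated decision.
    description: str
    uses_personal_information: bool          # personal information feeds the decision
    significantly_affects_individual: bool   # e.g. refusal of credit or insurance
    meaningful_human_review: bool            # a human can and does review the outcome
    program_substantially_and_directly_related: bool

def privacy_bill_disclosure_required(d: AutomatedDecision) -> bool:
    """Privacy Bill (as described above): privacy policy disclosure is triggered
    where personal information is used and the computer program is substantially
    and directly related to a decision that could significantly affect the
    individual's rights or interests."""
    return (d.uses_personal_information
            and d.program_substantially_and_directly_related
            and d.significantly_affects_individual)

def gdpr_article_22_engaged(d: AutomatedDecision) -> bool:
    """GDPR article 22: the qualified prohibition applies only to decisions based
    solely on automated processing (no meaningful human involvement) with legal
    or similarly significant effects."""
    return (not d.meaningful_human_review) and d.significantly_affects_individual

if __name__ == "__main__":
    decision = AutomatedDecision(
        description="Automated credit-limit reduction",
        uses_personal_information=True,
        significantly_affects_individual=True,
        meaningful_human_review=True,   # human sign-off takes it outside article 22
        program_substantially_and_directly_related=True,
    )
    print("Privacy Bill disclosure:", privacy_bill_disclosure_required(decision))
    print("GDPR article 22 engaged:", gdpr_article_22_engaged(decision))
```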
2. Overseas data flows
Australian Privacy Principle (APP) 8.1 requires that where an APP entity discloses personal information about an individual to an overseas recipient it must take such steps as are reasonable in the circumstances to ensure that the overseas recipient does not breach the APPs (other than APP 1) in relation to the information. However, an exception to the requirement under APP 8.1 is where the APP entity reasonably believes that the overseas recipient is subject to a law or binding scheme that has the effect of protecting the information in a way substantially similar to the APPs, and the individual can take action to enforce the protection of the law or binding scheme. The Privacy Bill introduces a mechanism for regulations to prescribe countries and binding schemes as providing substantially similar protection to the APPs, to assist APP entities to determine whether to disclose personal information to an overseas recipient. Before the Governor-General makes regulations for this purpose, the Minister must be satisfied that:
- the law or binding scheme has the effect of protecting the information in a way that, overall, is at least substantially similar to the way in which the APPs protect the information; and
- there are mechanisms that the individual can access to take action to enforce that protection.
This amendment aligns with the 'accountability' approach under the Privacy Act, which places responsibility for breaches by third parties on APP entities. The 'prescribed countries' mechanism will provide greater certainty for APP entities that rely on this exception to APP 8.1 when disclosing personal information overseas. This reform will be particularly relevant to organisations involved in international trade and digital service models. A benefit of the reform is that it will likely reduce the costs incurred by organisations in assessing the laws or schemes to which overseas recipients are subject. Future reforms will likely include the introduction of standard contractual clauses for APP entities to use to facilitate cross-border data transfers, similar to the "appropriate safeguards" concept in the GDPR.
The GDPR, similarly, restricts international transfers of personal data (GDPR, chapter 5) – in this case, transfers outside (under the EU GDPR) the European Economic Area (EEA) or (under the UK GDPR) the UK. The rationale behind these restrictions is to ensure that the level of protection of the affected individuals remains the same when their data have been transferred outside the EEA or UK. The GDPR provides for the following tiered approach. Transfers are only allowed:
- to countries (or international organisations) covered by an 'adequacy' decision confirming that the destination ensures an adequate level of protection;
- in the absence of an adequacy decision, where the exporter has put in place appropriate safeguards – such as standard contractual clauses or binding corporate rules – and enforceable rights and effective remedies are available to individuals; or
- failing either of the above, where one of a limited set of derogations applies (for example, the individual's explicit consent to the transfer, or necessity for the performance of a contract).
It appears that the Privacy Bill would move the Privacy Act closer to the GDPR position, but without introducing the more specific GDPR regime that applies if no adequacy decision has been reached.
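To illustrate how the two transfer regimes described above could be operationalised in practice, the sketch below uses a hypothetical prescribed-country list (empty until regulations are made) and a simplified mirror of the GDPR chapter 5 tiers; the country codes and function names are illustrative assumptions, not part of either regime.

```python
# Hypothetical pre-transfer check; the prescribed-country list is illustrative only,
# pending regulations made under the mechanism described above.
PRESCRIBED_COUNTRIES: set[str] = set()        # to be populated from future regulations
GDPR_ADEQUACY_DECISIONS = {"NZ", "JP", "CH"}  # examples of existing EU adequacy decisions

def may_rely_on_prescribed_country(destination: str) -> bool:
    """APP 8 (as proposed): the exporter may rely on the exception where the
    destination country or binding scheme has been prescribed by regulations."""
    return destination in PRESCRIBED_COUNTRIES

def eu_transfer_path(destination: str, safeguards_in_place: bool,
                     derogation_applies: bool) -> str:
    """Rough mirror of the GDPR chapter 5 tiers described above."""
    if destination in GDPR_ADEQUACY_DECISIONS:
        return "adequacy decision"
    if safeguards_in_place:
        return "appropriate safeguards (e.g. standard contractual clauses)"
    if derogation_applies:
        return "derogation (narrow, case-by-case)"
    return "transfer not permitted"

print(may_rely_on_prescribed_country("NZ"))
print(eu_transfer_path("US", safeguards_in_place=True, derogation_applies=False))
```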
3. Security, retention and destruction of personal information
The Privacy Bill clarifies that the steps that entities are required to take to keep personal information secure (specifically, to protect, destroy and/or de-identify personal information) include the implementation of technical and organisational measures. These include, for example, encryption of data, securing access to systems and premises, and staff training to address information security risks. Specifically, APP 11 currently states that an entity holding personal information must take "such steps as are reasonable in the circumstances" to protect the information from misuse, interference, loss, unauthorised access, modification or disclosure. Additionally, if an entity holds personal information about an individual and such entity no longer requires the information (and is not required by Australian law or a court/tribunal order to retain the information), such entity must take "such steps as are reasonable in the circumstances" to destroy the information or to ensure that the information is de-identified. The Privacy Bill will add a new APP 11.3, which clarifies that "such steps as are reasonable in the circumstances" include technical and organisational measures. The purpose of this reform is to clarify what 'reasonable steps' an APP entity must take to comply with APP 11. The PAR Report explained that this would enhance consistency with the GDPR, which similarly requires appropriate technical and organisational measures from entities to secure personal information without prescribing the use of particular measures in particular circumstances (or generally). APP entities will now be expected to implement technical and organisational measures to comply with APP 11.
The GDPR also includes an integrity and confidentiality principle (article 5.1(f)) and the obligation to adopt appropriate technical and organisational measures aimed at protecting the confidentiality, integrity and availability of personal data (article 32). Although the GDPR gives some examples of security measures that may be appropriate (e.g., pseudonymisation, encryption), it is for businesses to decide which are the most appropriate measures in any given situation, taking into account several specified circumstances (the state of the art, costs of implementation, data processed, etc.).
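By way of a concrete (and purely illustrative) example of one such technical measure, the sketch below encrypts a record of personal information at rest using the open-source `cryptography` Python package; in practice this would sit alongside key management, access controls, staff training and the other organisational measures discussed above.

```python
# Illustrative only: encryption at rest as one example of a technical measure.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service, not in code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name=Jane Citizen; email=jane@example.com"
token = cipher.encrypt(record)      # store only the ciphertext
restored = cipher.decrypt(token)    # decrypt only when access is authorised

assert restored == record
print(token[:16], b"...")
```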
4. Eligible data breaches
The Privacy Bill grants the Minister (being the Attorney-General) the power to declare certain data breaches an "eligible data breach", if the Minister is satisfied that such a declaration is necessary or appropriate to prevent or reduce a risk of harm arising from a misuse of personal information about one or more individuals following unauthorised access to or disclosure of the personal information in the eligible data breach of the entity. Such a declaration will permit the sharing of personal information following a data breach for the purpose of preventing or reducing the risk of harm to individuals. Where such a declaration is made, the entity (which was the subject of the data breach) may handle personal information in a way that would otherwise not be permitted under the APPs, insofar as the handling is for the purpose of preventing or reducing the risk of harm to individuals. This follows the temporary amendments that were made to the Telecommunications Regulations 2021 (Cth) following a large-scale data breach, which facilitated greater information sharing from a telecommunications entity to banks than would otherwise have been permitted. This amendment will allow for declarations to be made across a greater range of sectors following a data breach and will improve responsiveness to large-scale fraud and identity theft following a data breach. Note also that reforms in relation to reporting of cybersecurity breaches (a standalone Cyber Security Act) are also before Parliament, which propose to require reporting of ransomware payments to authorities.
The GDPR does not include powers of governments to make similar declarations – any disclosure of personal data between entities affected by security breaches would need to meet the GDPR's general requirements. It does (articles 33 and 34) require personal data breaches to be documented, and to be reported to data protection supervisory authorities (where the breach is likely to result in a risk to the rights and freedoms of individuals) and notified to the affected data subjects (where the data breach is likely to result in a high risk to their rights and freedoms). The European Data Protection Board (EDPB) and various national supervisory authorities have issued guidelines on these requirements.
5. Civil penalty provisions
The Privacy Bill amends section 13G of the Privacy Act, which deals with serious or repeated interferences with privacy, to remove the reference to 'repeated' and limit the section to 'serious' interferences with privacy. This enables greater levels of enforcement from the OAIC in response to interferences with privacy, as what constitutes 'repeated' interferences with privacy was untested in the courts and was largely unclear. Whether the act was 'repeated' will no longer be relevant for determining if an interference with privacy has occurred. Whether the act was repeated or continuous will now be one of many factors to consider when assessing whether the act was 'serious'. This amendment also provides a non-exhaustive list of matters a court can take into account in determining if the interference with privacy is "serious". These include:
The Privacy Bill amends the civil penalty tiers. A body corporate may be liable for a penalty that is an amount not more than the greater of (i) AUD 50 million, (ii) an amount three times the value of the benefit the body corporate (and any related bodies corporate) have obtained through the interference with privacy, and (iii) 30% of the adjusted turnover of the body corporate during the breach turnover period for the contravention. With the amendment, this penalty tier will only be applicable to "serious" interferences with privacy (it was previously applicable to "serious" or "repeated" interferences, but now, with the amendment (see above), whether or not an interference is repeated will form part of the assessment of seriousness). In addition, the Privacy Bill establishes tiers of civil penalty provisions to allow for targeted and proportionate regulatory responses to breaches that may be less than 'serious'. In particular, it introduces the following two new tiers:
This amendment was inspired by the proportionate enforcement mechanisms available to the Australian Securities and Investments Commission (ASIC), which can respond to lower and higher-level breaches of corporations and financial services laws with appropriate penalties. This will empower the OAIC to take on a more proactive enforcement posture, as there will be greater certainty that the OAIC can successfully enforce less significant breaches of the Privacy Act.
The GDPR (article 83) similarly envisages the imposition of administrative fines for breaches of the GDPR (without a specific "seriousness" condition). These fines are subject to limits, but they are considerably higher than is contemplated by the Privacy Bill. In the case of an "undertaking" (that is, a business or group of businesses), the limit is set at either 2% or 4% of the undertaking's total worldwide turnover in the preceding financial year (or, if higher, either EUR 10M or 20M), depending on the GDPR provision breached.
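To make the respective penalty ceilings concrete, the following sketch (with illustrative figures only) computes the maximum exposure under the Privacy Bill's top tier and under the GDPR's higher fining tier for a hypothetical corporate group; the function names and input values are our own assumptions, not prescribed formulas.

```python
def privacy_bill_max_penalty(benefit_obtained: float, adjusted_turnover: float) -> float:
    """Top tier described above: the greater of AUD 50m, three times the benefit
    obtained, and 30% of adjusted turnover for the breach turnover period."""
    return max(50_000_000, 3 * benefit_obtained, 0.30 * adjusted_turnover)

def gdpr_max_fine_higher_tier(worldwide_turnover: float) -> float:
    """GDPR higher tier: the greater of EUR 20m and 4% of total worldwide
    annual turnover of the preceding financial year."""
    return max(20_000_000, 0.04 * worldwide_turnover)

# Hypothetical group: AUD 2m benefit from the conduct, AUD 400m adjusted turnover,
# EUR 600m worldwide turnover.
print(privacy_bill_max_penalty(2_000_000, 400_000_000))   # 120,000,000 (30% of adjusted turnover)
print(gdpr_max_fine_higher_tier(600_000_000))             # 24,000,000 (4% of worldwide turnover)
```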
6. OAIC's enforcement and review powers
Under the Privacy Bill, the OAIC is provided with the ability to carry out public inquiries into specified matters directed or approved by the Minister. These public inquiries could be used to examine systemic or industry-wide issues regarding individuals' privacy and assist the OAIC in determining where further education or guidance is required for entities to comply with the Privacy Act. Furthermore, the OAIC is provided with additional general investigation and monitoring powers, including to search premises and to seize materials. This brings the OAIC in line with other Australian regulators, including ASIC. The OAIC can also issue a determination requiring a respondent to a privacy matter to perform any reasonable act or course of conduct to prevent or reduce reasonably foreseeable future loss or damage. This would allow the OAIC to compel a respondent to be more proactive following a privacy breach. In view of the OAIC's stronger enforcement powers, entities should ensure that they are compliant with their obligations under the Privacy Act and address any compliance gaps.
Under the GDPR, the EDPB (EU GDPR) or Information Commissioner's Office (UK GDPR) can issue guidelines, recommendations and best practice advice on the interpretation and application of the GDPR. The EDPB's tasks also include the launch of coordinated actions among national supervisory authorities, which typically address more problematic matters for which common guidance is required. The GDPR also provides for the national supervisory authority/ies in each EU member state (or the UK) to take responsibility for monitoring the application of the GDPR. The national supervisory authorities have extensive powers to monitor and enforce the application of the GDPR, including conducting audits of particular organisations and wider investigations, and ordering organisations to provide information or take steps to comply with the GDPR.
7. Children's online privacy
The Privacy Bill requires the OAIC to develop and register a Children's Online Privacy Code (COP Code) within 2 years of Royal Assent. It must set out how one or more of the APPs are to be applied or complied with in relation to the privacy of children. The COP Code is to align with international approaches, including the UK's Age Appropriate Design Code. The OAIC has not yet stated when it expects to develop and register the COP Code; noting that the Privacy Bill remains at the committee stage, the OAIC may have until at least the start of 2027 to do so. The Privacy Bill also defines a child as an individual who has not reached 18 years of age. The PAR Report explained this as more clearly aligning with the Online Safety Act 2021 (Cth), the UK's Age Appropriate Design Code, and Ireland's Data Protection Act. The COP Code may be relevant to entities that provide online or electronic services which are likely to be accessed by children. Organisations to which the COP Code may apply should monitor its development and registration. Although the Privacy Bill does not explicitly address a child's capacity to consent, the PAR Report explained that the OAIC's guidance on consent can continue to be relied upon. This guidance explains that the minimum age for presumption of capacity is 15 years of age or, if it is practicable, an entity can evaluate a child's capacity on a case-by-case basis. The PAR Report further argued for the codification of a higher-level consent requirement in line with Canada's Personal Information Protection and Electronic Documents Act (evaluating consent without codifying a particular age of consent), which would be less prescriptive than the GDPR (although note that the specific GDPR age thresholds do not apply outside the online services context).
The GDPR (article 40) encourages the creation of codes of conduct governing particular kinds of processing of personal data. In the UK, for example, the Information Commissioner's Office has published the Children's Code (or Age Appropriate Design Code), dealing with processing of children's personal data in relation to various categories of online app. For the most part, however, the GDPR regulates the processing of children's personal data in the same way as the processing of adults' data, although the effect of the GDPR's requirements will often be stricter when applied to the processing of children's data because of the relative sensitivity of the processing. One specific respect in which the GDPR does deal differently with children's data is in relation to the direct offer of information society services to children based on their consent. Here the EU GDPR provides (article 8) that a child's consent will not be effective if they are less than 16 years old (consent would instead need to be sought from a parent or other person with parental responsibility) – although member states are able to reduce the threshold below 16 as long as it is at least 13. The UK GDPR sets the threshold at 13. Concern over protection of children on the Internet has significantly increased and this has crystallised in multiple actions and initiatives in the EU and UK. For instance, the EDPB is currently working on the development of guidelines for age verification systems on the Internet; and the UK has passed the Online Safety Act 2023 which, while regulating to protect users of online services generally, includes specific provisions protecting children (with the threshold for these purposes set at 18).
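Purely to illustrate how the various age thresholds mentioned above interact, the sketch below encodes them in a small lookup; the regime labels and function names are this post's own shorthand, not statutory terms, and the OAIC figure reflects guidance on presumed capacity rather than a hard consent age.

```python
# Illustrative only: the age figures are taken from the discussion above, and the
# regime labels are this post's own shorthand, not statutory terms.
CONSENT_AGE = {
    "UK_GDPR_ONLINE_SERVICES": 13,   # UK GDPR threshold for online services
    "EU_GDPR_DEFAULT": 16,           # EU default; member states may lower it to 13
    "AU_OAIC_GUIDANCE": 15,          # OAIC guidance: presumption of capacity from 15
}

def presumed_able_to_consent(age: int, regime: str) -> bool:
    """Whether an individual of this age is presumed able to consent themselves.
    Under the OAIC guidance a younger child's capacity can still be assessed
    case by case, so a False result is a prompt for further steps, not a bar."""
    return age >= CONSENT_AGE[regime]

for regime in CONSENT_AGE:
    print(regime, presumed_able_to_consent(14, regime))
```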
8. Developing additional APPs
Under the Privacy Bill, the Minister may direct the OAIC to develop an APP Code if the Minister is satisfied that it is in the public interest to develop the code and for the OAIC to do so. The direction may specify the matters to be dealt with by the code, and the entities or class of entities to be bound by the code. The Minister may also direct the OAIC to develop a temporary APP Code. Investors and corporates should be mindful of any future sector-specific codes that are announced by the Minister under this mechanism, as these APP Codes will be largely developed by government rather than directly by industry representatives.
As discussed above, the GDPR (article 40) encourages the development of codes of conduct, approved by data protection supervisory authorities. These codes are not legally binding but compliance with them is a strong indicator of compliance with the GDPR.
9. Statutory tort for serious invasions of privacy
The Privacy Bill introduces a statutory tort for serious invasions of privacy in accordance with the recommendations in the PAR Report, which is designed to provide protection against a broader range of interferences with privacy, in line with Australia's international obligations. The balance between the right to privacy and the right to freedom of expression is expressly acknowledged, and there are defences and exemptions in place to address the right to freedom of expression (e.g., for journalists and the importance of a free press). A plaintiff has a cause of action in tort against a defendant if:
The question of introducing a statutory cause of action for serious invasion of privacy has been considered for many years. The lack of human rights jurisprudence in Australia has led to a general preference for a statutory solution to privacy invasions concerning personal, family or household affairs (which are exempted under the Privacy Act). This tort was heavily inspired by the UK's equivalent general law tort of misuse of private information. The Privacy Act's general journalism exemption will likely be considered in future tranches of reforms. However, the explicit shift from media organisations to journalists in the Privacy Bill reflects a tightening of the exemption under the new Schedule 2 of the Privacy Act. The more prescriptive concepts of 'journalist' and 'journalistic material', as well as the delinking of whether the act breached journalistic codes of practice from whether the act is a serious invasion of privacy, will likely prevent 'clickbait', 'grief-knocking', and 'private gossip' about public figures from being granted the benefit of the exemption that would otherwise be available under the general journalism exemption in the Privacy Act. Further, this tranche of privacy reforms does not include a direct right of action for individuals for breaches of privacy generally under the Privacy Act, a development that will be considered in future tranches of reforms. Instead, this proposal creates a cause of action that only applies under a new Schedule 2 of the Privacy Act for serious invasions of privacy.
The GDPR does not create any equivalent tort. However, it does (article 82) contain a somewhat similar data subject right to compensation (to be awarded through the civil courts) for damage (including intangible damage) suffered as a result of any breach of the GDPR. Torts of privacy also exist in some member states independently of the GDPR, typically focusing more specifically on invasion of privacy rather than on the processing of personal data. UK law includes the common law tort of misuse of private information, which (as noted above) is the inspiration for this element of the Privacy Bill.
10. Doxxing offences
The Privacy Bill amends the Criminal Code Act 1995 (Cth) (Commonwealth Criminal Code) to introduce new offences which target the use of carriage services to make available, publish or distribute personal data of individuals where the person engages in the conduct in a way that reasonable persons would regard as being, in all the circumstances, menacing or harassing towards the individuals. This amendment was not part of the reforms outlined in the PAR Report, but rather reflects contemporary events in Australian politics leading up to the introduction of the Privacy Bill in 2024. The offences will be located alongside the telecommunications offences in the Commonwealth Criminal Code, rather than within the Privacy Act. This development is more relevant for individuals, but it may also need to be addressed in internal employee conduct policies governing the use of work email and other communication channels.
The GDPR does not specifically regulate doxxing, although, depending on the circumstances, doxxing may well breach the GDPR or other EU or member state laws. |
Where to from here?
The Privacy Bill has been referred to the Legal and Constitutional Affairs Legislation Committee, for inquiry and report by 14 November 2024.
A second tranche of reforms is under preparation by the Attorney-General and is likely to be seen in 2025 (noting that there is a federal election expected by no later than 17 May 2025). Many of the proposals that were not included in the Privacy Bill were agreed in-principle in the Government Response and are being further consulted on, subject to cost-benefit and impact analyses. We will closely follow any further developments with respect to the Privacy Bill and further reforms.