Australia's Privacy Act amendments take effect December 10, 2026 - mandating transparency for every automated decision using personal data. Penalties reach AUD $50 million or 30% of turnover. Areebi delivers every technical control the legislation demands, from immutable audit trails to an organisation-wide AI kill switch.
The Australian Privacy Act automated decision-making amendments require every organisation using a computer program to make or influence decisions about individuals using personal information to disclose those practices in their privacy policy, effective December 10, 2026, with penalties reaching AUD $50 million or 30% of adjusted annual turnover.
The Privacy and Other Legislation Amendment Act 2024 (Cth), which received Royal Assent on December 10, 2024, represents the most significant reform to Australia's Privacy Act 1988 in over a decade. The first tranche of reforms introduces new automated decision-making (ADM) transparency obligations under APP 1.7, 1.8, and 1.9, which commence on December 10, 2026. These provisions emerged from the Attorney-General's comprehensive Privacy Act Review, which recommended over 100 reforms to modernise Australia's privacy framework for the digital age.
What makes this legislation uniquely consequential is its scope: it covers every computer program, not just artificial intelligence. The Explanatory Memorandum explicitly states it encompasses “a wide spectrum of automation, ranging from sophisticated machine learning to basic rule-based logic.” This captures Excel spreadsheets generating scores, RPA bots, legacy rule engines, and LLM-powered tools equally. Australia's approach is deliberately broader than GDPR Article 22, which only covers “solely automated” decisions. The Robodebt catastrophe - a $1.7 billion class action settlement affecting 430,000 individuals - directly informed these provisions, demonstrating the devastating consequences of unchecked automated decision-making.
The Act has extraterritorial reach, applying to foreign corporations doing business in Australia that handle Australians' personal data. Penalties have been restructured into three tiers, reaching AUD $50 million, three times the benefit obtained, or 30% of adjusted annual turnover - whichever is greatest. A new statutory tort for serious privacy invasions (effective June 2025) enables individuals to sue directly, opening the door to class actions. The OAIC has adopted an “enforcement-led regulatory approach” and has already begun compliance sweeps targeting 60 businesses across six sectors. This is not aspirational guidance - this is law with teeth.
The ADM transparency obligations under APP 1.7 apply when all three conditions are met simultaneously.
APP 1.7 defines an automated decision as one where a computer program makes or substantially assists a decision that significantly affects an individual's rights or interests and uses their personal information - all three conditions must be met for the transparency obligations to apply.
The entity has arranged for a computer program to make, or do a thing that is substantially and directly related to making, a decision. This includes AI systems, rule-based logic, scoring algorithms, RPA, and even Excel macros generating scores.
The Explanatory Memorandum clarifies: 'Substantially' means the program's output is a key factor in facilitating the human's decision making. 'Directly' means a clear connection with the making of the decision.
The decision could reasonably be expected to significantly affect the rights or interests of an individual. Per the Explanatory Memorandum, the impact must be 'more than trivial' and have 'the potential to significantly influence the circumstances of the individual.'
Examples: credit decisions, employment decisions, insurance underwriting, government benefits, access to services, healthcare decisions.
Personal information about the individual is used in the operation of the computer program to make the decision or do the thing substantially and directly related to making it.
Personal information under the Privacy Act 1988 includes any information or opinion about an identified individual, or an individual who is reasonably identifiable.
Important: All three conditions must be met. Critically, APP 1.9 clarifies that “making a decision” includes refusing or failing to make a decision - algorithmic inaction is captured. The obligations apply regardless of whether the outcome is beneficial or adverse to the individual.
The obligations apply to all APP entities - currently covering over 200,000 organisations in Australia.
All Commonwealth agencies and departments. The OAIC’s October 2025 report found only 17% of agencies clearly disclosed ADM use — enforcement is coming.
Any organisation with annual turnover exceeding AUD $3 million, including subsidiaries of foreign companies operating in Australia.
All private health service providers regardless of turnover. Health information is ‘sensitive information’ under the Act with stricter handling obligations.
Foreign companies carrying on business in Australia that collect or hold personal information of Australian residents. Extraterritorial reach mirrors GDPR.
Coming soon: The government has committed to removing the small business exemption in the second tranche of reforms - bringing approximately 2.3 million additional businesses (95% of all Australian businesses) under the Privacy Act. Organisations below the current $3M threshold should prepare now.
Six core compliance requirements mapped to Areebi's platform capabilities.
Privacy policies must disclose: the kinds of personal information used in automated programs (APP 1.8(a)), the kinds of decisions made solely by computer programs (APP 1.8(b)), and the kinds of decisions where a program substantially assists human decision-making (APP 1.8(c)). The OAIC expects organisations to maintain audit logs and monitoring of ADM outputs.
Areebi: Immutable audit logging captures every AI interaction with full context — user identity, prompt content, model used, response generated, policy decisions applied, and timestamps. Complete visibility of every employee’s prompt and response. Exportable audit reports generate the evidence base for privacy policy disclosures and regulator inquiries.
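Conceptually, an immutable audit trail can be implemented as a hash-chained log: each record embeds the hash of its predecessor, so any later alteration breaks the chain and is detectable on verification. The sketch below is illustrative only - the field names are assumptions for this example, not Areebi's actual schema.

```python
import hashlib
import json
import time

def append_entry(log, user, prompt, model, response, policy_decision):
    """Append an audit record whose hash chains to the previous entry,
    making after-the-fact tampering detectable."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {
        "timestamp": time.time(),
        "user": user,
        "prompt": prompt,
        "model": model,
        "response": response,
        "policy_decision": policy_decision,
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    log.append(record)
    return record

def verify_chain(log):
    """Recompute every hash in order; return False if any entry was altered."""
    prev_hash = "0" * 64
    for record in log:
        body = {k: v for k, v in record.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True
```

A verified chain of this kind is the sort of evidence base that supports APP 1.8 disclosures: the log shows exactly which personal information entered which automated program, and when.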
APP 1.8’s distinction between ‘solely automated’ and ‘substantially assisted’ decisions incentivises human oversight. The OAIC expects organisations to maintain ability to override or stop automated systems. The second tranche will introduce a right to human review of automated decisions.
Areebi: Organisation-wide AI kill switch — disable all AI across the company instantly from a single admin console. Granular controls to pause specific models, workspaces, or user groups. Real-time monitoring dashboards give compliance teams full oversight of every AI interaction, with configurable alert rules and policy override capabilities.
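The control pattern behind a kill switch is a central switchboard consulted before every AI call, with a global flag that overrides all finer-grained scopes. A minimal sketch, with names and scopes invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class AIGovernanceState:
    """Central gate consulted before every AI interaction.
    The global kill switch overrides all narrower pause scopes."""
    global_enabled: bool = True
    paused_models: set = field(default_factory=set)
    paused_workspaces: set = field(default_factory=set)

    def is_allowed(self, model: str, workspace: str) -> bool:
        if not self.global_enabled:            # org-wide kill switch
            return False
        if model in self.paused_models:        # pause one model
            return False
        if workspace in self.paused_workspaces:  # pause one workspace
            return False
        return True
```

Because every request passes through a single gate, flipping `global_enabled` to `False` stops all AI use instantly - the "ability to override or stop automated systems" the OAIC expects.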
APP 11 requires ‘reasonable steps’ to protect personal information. New APP 11.3 (effective Dec 2024) requires ‘technical and organisational measures’ proportionate to data sensitivity and volume. For AI systems processing personal data, the standard for ‘reasonable steps’ has been materially raised.
Areebi: Real-time DLP engine with 50+ PII detectors automatically masks and redacts personal information before it reaches AI models. Configurable redaction rules per data category — names, addresses, Medicare numbers, TFNs, bank accounts, and more. Data never leaves your environment unprotected.
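In-line redaction of this kind typically works by matching known PII formats and masking them before the prompt leaves the organisation. The patterns below are deliberately simplified sketches (real detectors add checksums, context analysis, and many more categories), not Areebi's actual detection rules:

```python
import re

# Illustrative patterns only - production detectors are far stricter.
PII_PATTERNS = {
    "TFN": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{3}\b"),   # 9-digit Tax File Number
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Mask detected personal information before the prompt
    reaches any AI model."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text
```

The key design point is that redaction happens at the platform boundary, so personal information never reaches the model regardless of what an individual user types.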
Sensitive information — health, biometric, racial/ethnic origin, political opinions, religious beliefs, sexual orientation, criminal records, trade union membership — requires explicit consent for collection and has the strictest handling obligations. AI systems must not process sensitive data without proper authorisation.
Areebi: Configurable data classification policies block entire sensitive data categories from entering AI interactions. Visual policy builder lets compliance teams create no-code rules that prevent sensitive information from being processed — without requiring developer intervention. Enforce at the platform level, not the prompt level.
The Notifiable Data Breach scheme requires notification to both affected individuals and the OAIC when a breach is likely to cause serious harm. New penalty tiers include infringement notices up to AUD $66,000 per contravention for listed corporations. Organisations must be able to detect and assess breaches rapidly.
Areebi: Real-time alerting when sensitive data is detected in AI interactions — before it becomes a breach. Configurable alert rules trigger notifications to compliance teams, security operations, and management. SIEM integration feeds events directly into your existing incident response workflows. Detect, alert, and respond in seconds, not days.
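The detect-and-alert flow can be sketched as structured events evaluated against configurable rules, then forwarded to a SIEM. Field names and rule shapes here are assumptions for illustration:

```python
from datetime import datetime, timezone

def build_siem_event(user, category, workspace, action="blocked"):
    """Emit a structured event the moment a sensitive-data
    detector fires - before the data becomes a breach."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": "sensitive_data_detected",
        "severity": "high",
        "user": user,
        "data_category": category,   # e.g. "TFN", "health_information"
        "workspace": workspace,
        "action_taken": action,
    }

def should_alert(event, rules):
    """Evaluate configurable alert rules against an event;
    any matching rule triggers a notification."""
    return any(
        event["severity"] in rule["severities"]
        and event["data_category"] in rule["categories"]
        for rule in rules
    )
```

Feeding these events into an existing SIEM means AI-related incidents enter the same response workflow used to meet the Notifiable Data Breach scheme's rapid-assessment expectations.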
APP 1.3 requires privacy policies to be ‘clearly expressed and up to date.’ APP 1.4 requires they be freely available. The new APP 1.8 disclosures about ADM must integrate seamlessly into existing privacy policies. Organisations must document the logic of their automated decision-making processes.
Areebi: Compliance templates and audit exports generate documentation frameworks covering system architecture, data flows, policy configurations, and control effectiveness. Export audit reports in formats regulators and auditors accept. Your privacy policy disclosures are backed by the actual technical evidence in the platform.
The legislation phases in its obligations. December 10, 2026 is the critical deadline for ADM transparency.
Privacy and Other Legislation Amendment Act 2024 passes Parliament. Most Tranche 1 provisions commence immediately, including new APP 11.3 technical security measures and restructured penalties.
New private right of action for serious invasions of privacy. Individuals can sue for intentional or reckless privacy breaches, with damages up to AUD $478,550. Enables class actions against organisations.
OAIC targets approximately 60 businesses across 6 sectors for APP 1.3 and 1.4 compliance audits. Signals ‘enforcement-led regulatory approach’ — a significant shift from guidance-based compliance.
OAIC releases exposure draft of the Children’s Online Privacy Code for consultation. Registration deadline aligns with ADM obligations on December 10, 2026.
APP 1.7, 1.8, and 1.9 take effect. All APP entities must have updated privacy policies, audit systems, and technical controls in place. Applies to any decision made on or after this date, regardless of when the underlying system was deployed.
Expected to include: fair and reasonable test, right to erasure, right to human review of ADM, small business exemption removal (2.3M additional businesses), employee records exemption removal, mandatory PIAs for high-risk AI.
Weekly insights on enterprise AI security, compliance updates, and governance best practices.
The 2024 amendments restructured penalties into three tiers with significantly increased maximums. Combined with the new statutory tort, the enforcement landscape is the most consequential in Australian privacy history.
Australian Privacy Act penalties for non-compliance reach AUD $50 million, three times the benefit obtained, or 30% of adjusted annual turnover - whichever is greatest - making them among the highest privacy penalties globally, alongside a new statutory tort enabling individuals to sue directly for serious privacy invasions.
| Tier | Maximum penalty | Details |
|---|---|---|
| Tier 1: Serious interference | Greatest of AUD $50M, 3x the benefit obtained, or 30% of adjusted annual turnover | Serious interference with privacy under section 13G. Applies to bodies corporate. |
| Tier 2: Non-serious interference | AUD $3.3M per contravention (bodies corporate) | Includes inadequate privacy policies, failure to implement reasonable security measures, or non-compliance with ADM transparency obligations. |
| Tier 3: Infringement notices | Up to AUD $66,000 per contravention (listed corporations) | OAIC can issue infringement notices without court proceedings. Up to AUD $19,800 for other entities. Low-tier violations carry penalties up to AUD $330,000. |
Statutory tort (effective June 2025): A separate enforcement vector. Individuals can sue directly for serious privacy invasions, with damages up to AUD $478,550 and exemplary damages in exceptional circumstances. This enables class actions - Robodebt's $1.7 billion settlement for 430,000 individuals demonstrates the scale of potential liability.
Areebi provides the complete technical infrastructure the Privacy Act demands - from audit trails to kill switches to real-time data protection.
Areebi maps directly to every Australian Privacy Act ADM requirement - immutable audit logging for APP 1.8 transparency disclosures, organisation-wide kill switch for human oversight, real-time DLP with 50+ PII detectors for APP 11 security obligations, and sensitive data blocking across all 12 categories defined under the Act.
Follow this checklist to bring your organisation into compliance with the APP 1.7-1.9 automated decision-making transparency obligations before December 10, 2026.
Get all 45 controls across 10 compliance domains as a downloadable PDF. Mapped to specific APP provisions with industry-specific guidance for healthcare, financial services, government, and technology.
Australia's approach is broader than GDPR but lighter than the EU AI Act. Understanding the differences helps multinational organisations build unified compliance.
| Aspect | Australia (APP 1.7-1.9) | GDPR (Article 22) | EU AI Act |
|---|---|---|---|
| Scope | ALL computer programs | Solely automated decisions | AI systems by risk tier |
| Trigger | “Significantly affect rights” | “Legal or similarly significant effect” | Risk classification |
| Obligation | Transparency / disclosure | Generally prohibited | Documentation + conformity |
| Individual Rights | Disclosure only (Tranche 1) | Explanation + human review | Varies by risk tier |
| Max Penalty | AUD $50M / 30% turnover | EUR 20M / 4% global turnover | EUR 35M / 7% global turnover |
| Enforcement | Dec 10, 2026 | In effect since May 2018 | Full enforcement Aug 2, 2026 |
| Private Right of Action | Yes (statutory tort, June 2025) | Limited | No |
Key insight: Australia captures a broader range of decisions than GDPR - not just “solely automated” but also decisions where a computer program “substantially and directly” assists human decision-making. Organisations compliant with GDPR should not assume they are automatically compliant with Australia's requirements.
The Australian Privacy Act works alongside international regulations. See how Areebi supports comprehensive compliance.
Answers to the most common questions about the Privacy Act amendments and what they mean for organisations using automated decision-making.
The Privacy and Other Legislation Amendment Act 2024 (Cth) amends Australia’s Privacy Act 1988 to introduce automated decision-making transparency obligations, restructure penalties (up to AUD $50 million or 30% of turnover), create a statutory tort for serious privacy invasions, and strengthen security requirements. The ADM provisions under new APP 1.7, 1.8, and 1.9 commence December 10, 2026.
No. The Act deliberately uses ‘computer program’ rather than ‘artificial intelligence.’ The Explanatory Memorandum clarifies this encompasses ‘a wide spectrum of automation, ranging from sophisticated machine learning to basic rule-based logic.’ This includes AI/ML models, scoring algorithms, RPA bots, legacy rule engines, and even Excel spreadsheets generating scores that influence decisions about individuals.
Yes. The Privacy Act applies to organisations with an ‘Australian link’ — including foreign corporations that carry on business in Australia or collect personal information from Australian sources. If your organisation processes personal information of Australian residents through automated systems, the ADM transparency obligations apply regardless of where you are headquartered.
Penalties are tiered: serious interference carries maximums of AUD $50 million, three times the benefit obtained, or 30% of adjusted annual turnover (whichever is greatest). Mid-tier violations face up to AUD $3.3 million for corporations. The OAIC can issue infringement notices up to AUD $66,000 per contravention without court proceedings. Additionally, the statutory tort (effective June 2025) enables individuals to sue directly, opening the door to class actions.
Australia’s approach is broader in scope but lighter in obligation. GDPR Article 22 generally prohibits solely automated decisions with legal effect unless specific conditions are met. Australia’s APP 1.7 covers both fully automated AND substantially assisted decisions (a wider net), but only requires transparency/disclosure rather than prohibition. GDPR provides explicit rights to explanation and human intervention; Australia’s Tranche 1 focuses on disclosure, with individual rights expected in the second tranche.
APP 1.8(b) requires disclosure of decisions made solely by a computer program - where no human is involved in the final determination. APP 1.8(c) covers decisions where a computer program does something substantially and directly related to making the decision, but a human makes the final call. The distinction matters because organisations must separately identify and disclose both categories in their privacy policies. A credit scoring algorithm that auto-approves loans falls under APP 1.8(b), while the same algorithm generating a recommendation that a loan officer reviews falls under APP 1.8(c).
The small business exemption (which currently excludes organisations with annual turnover under AUD $3 million) is expected to be removed in the second tranche of Privacy Act reforms. When this happens, approximately 2.3 million additional businesses - representing 95% of all Australian businesses - will become subject to the Privacy Act, including the ADM transparency obligations. Organisations below the threshold should begin preparing now, as the second tranche timeline has not been confirmed but is anticipated to follow shortly after December 2026.
An APP 1.8 disclosure must be included in your APP privacy policy and should cover three elements: the kinds of personal information used in automated programs (APP 1.8(a)), the kinds of decisions made solely by computer programs (APP 1.8(b)), and the kinds of decisions where programs substantially assist human decision-making (APP 1.8(c)). Use plain language per APP 1.3, describe each system in terms individuals can understand, and update disclosures whenever systems change. Areebi’s audit exports provide the technical evidence base to accurately describe what each system does.
The ADM transparency obligations are approaching. Areebi provides the governance infrastructure the Privacy Act demands. Explore our audit logging, DLP controls, and platform controls. Review our EU AI Act compliance guide, check pricing, take our AI risk assessment, or visit our Trust Center.
Or download our free compliance checklist - 45 controls across 10 domains, mapped to APP provisions