Tags: Privacy Act, Automated Decision-Making, AI Compliance, Healthcare Privacy, OAIC, APP 1, General Practice, Allied Health

Automated Decision-Making and the Privacy Act: Healthcare Practices' December 2026 Deadline

ClinicComply Team
13 min read

Key Takeaways

  • From 10 December 2026, all APP entities must update their privacy policies to disclose any use of automated decision-making (ADM) that could significantly affect the rights or interests of individuals. This obligation is introduced by the Privacy and Other Legislation Amendment Act 2024 (Cth) as new Australian Privacy Principles 1.7, 1.8, and 1.9.
  • All health service providers are APP entities under section 6D(4) of the Privacy Act 1988 (Cth), regardless of annual turnover. The small business exemption does not apply to health practices. Every GP clinic, allied health practice, and registered NDIS provider must comply.
  • ADM is triggered when a computer program uses personal information about an individual to make, or substantially and directly assist in making, a decision that could significantly affect the individual's rights or interests. Access to healthcare is specifically identified in OAIC guidance as a significant right or interest.
  • Healthcare systems that commonly fall within the ADM definition include AI triage tools, appointment booking algorithms, AI scribes that auto-suggest codes or diagnoses, billing algorithms that select MBS item numbers, and chronic disease risk stratification platforms.
  • Privacy policies must now disclose: the kinds of personal information used in ADM programs; the kinds of decisions made solely by automated means; and the kinds of decisions where automation substantially and directly assists a human decision-maker.
  • Failure to have a compliant privacy policy is a civil penalty provision under the low-tier regime: up to $330,000 for corporations and up to $66,000 for other entities per contravention. The OAIC can issue infringement notices without commencing court proceedings.
  • The OAIC named healthcare as a high-risk enforcement sector in its 2025-26 regulatory priorities and has new powers under the Act to issue compliance notices alongside infringement notices.

From 10 December 2026, every health service provider that uses a computer program to make or assist in making decisions about patients or services must update its privacy policy to disclose that use. The obligation comes from new Australian Privacy Principles 1.7, 1.8, and 1.9, inserted into the Privacy Act 1988 (Cth) by the Privacy and Other Legislation Amendment Act 2024. Healthcare is among the least prepared sectors for this change, and the OAIC has explicitly named health practices as a high-risk enforcement priority.

Most GP practices and allied health providers do not think of themselves as organisations that use automated decision-making. Many do. AI scribes that auto-generate structured clinical notes, appointment triage tools that sort patients by urgency, billing systems that auto-select MBS item numbers, and risk stratification platforms that determine which patients are recalled for chronic disease management all fit within the definition, depending on how they are configured and used. If your practice uses any of these, and patient personal information flows through them into decisions that affect the care those patients receive, you may have an ADM disclosure obligation you are not currently meeting.

Who Must Comply

Most small businesses can rely on the Privacy Act's small business exemption, which excludes entities with annual turnover below $3 million from the Act's requirements. Health service providers cannot. Under section 6D(4) of the Privacy Act 1988 (Cth), health service providers are explicitly excluded from the small business exemption and are APP entities regardless of their size, revenue, or corporate structure.

That means every GP practice in Australia, whether a sole-practitioner clinic or a large corporate group, must comply with the ADM transparency obligations from 10 December 2026. The same applies to physiotherapy clinics, psychology practices, occupational therapy services, dental clinics, and registered NDIS providers.

What Counts as Automated Decision-Making

Under APP 1.7, the ADM disclosure obligation is triggered when all three of the following conditions are met:

  1. A computer program is used to make a decision, or to do something substantially and directly related to making a decision.
  2. Personal information about the individual is used in the operation of that program to make the decision or to do the thing related to making it.
  3. The decision could reasonably be expected to significantly affect the rights or interests of the individual.

The third condition is where the healthcare context matters most. The OAIC's APP 1 guidelines specifically identify decisions affecting an individual's access to a significant service or support, including healthcare, as decisions that significantly affect rights or interests. A decision about whether a patient is seen, when they are seen, what their consultation is coded as, or whether they are recalled for follow-up is a decision about access to a significant healthcare service.

ADM does not require full automation. A program that "substantially and directly assists" a human to make a decision qualifies, even if the final call rests with a clinician or administrator. If an AI tool generates a risk score that a practice nurse uses to determine whether to escalate a patient to a GP, and the score substantially drives that determination, the arrangement is likely to meet the ADM definition.

The definition also covers the situation where the automated output is accepted without material review. Making a decision includes refusing or failing to make a decision, so the test applies as much to systems that exclude patients from services as to those that grant access.
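For practices auditing their technology stack, the three APP 1.7 conditions can be applied as a simple screening checklist across a system inventory. The sketch below is illustrative only and is not legal advice: the system names and fields are hypothetical, and whether a real system meets the statutory test depends on its configuration and how its output is used in your workflow.

```python
# Illustrative screening sketch for the three-part APP 1.7 test.
# Not legal advice; system names and answers are hypothetical examples.

ADM_CONDITIONS = (
    "computer_program",    # 1. a computer program makes or assists the decision
    "uses_personal_info",  # 2. personal information is used in its operation
    "significant_effect",  # 3. could significantly affect rights or interests
)

def screen(system: dict) -> bool:
    """Return True if all three APP 1.7 conditions appear to be met."""
    return all(system.get(condition, False) for condition in ADM_CONDITIONS)

inventory = [
    {"name": "AI triage tool", "computer_program": True,
     "uses_personal_info": True, "significant_effect": True},
    {"name": "Transcription-only scribe", "computer_program": True,
     "uses_personal_info": True, "significant_effect": False},
]

for system in inventory:
    status = "review for ADM disclosure" if screen(system) else "likely out of scope"
    print(f"{system['name']}: {status}")
```

A checklist like this is only a first pass to decide which systems warrant closer legal review; a "likely out of scope" result for a transcription-only tool, for example, can flip if the same product is later configured to suggest billing codes.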

Common Healthcare Systems That May Trigger the Obligation

AI Scribes and Clinical Documentation Tools

AI scribes such as Heidi Health, Lyrebird Health, and Dragon Medical One transcribe consultations and generate structured clinical notes. Some go further: they auto-suggest diagnoses, populate problem lists, recommend follow-up actions, or generate billing codes from the clinical record. Where an AI scribe's output is used to determine what a consultation is coded as, and therefore what MBS item is billed and what the patient pays, that output substantially and directly relates to a financial and clinical decision affecting the patient. Whether a specific product triggers the ADM obligation depends on how it is configured and how its output is used in your practice's workflow.

Appointment Booking and Triage Tools

Online booking platforms that use patient-reported symptoms and history to assess urgency, allocate appointment types, or filter patients to particular clinicians are using personal information to make or assist a decision about access to healthcare. If a triage algorithm determines that a patient is not urgent and books them in several weeks' time, when the patient believes they need to be seen sooner, that is a decision that could significantly affect the patient's health. Platforms that use risk scoring or natural language processing to sort booking requests are the clearest examples.

Billing and Coding Algorithms

Some practice management systems use algorithms that suggest or auto-complete MBS item numbers based on clinical note content. These systems use sensitive health information to make a decision that affects what a patient is billed and what Medicare pays. That meets both the personal information element and the significant effect element of the ADM test.

What Your Privacy Policy Must Include

Under APPs 1.8 and 1.9, where the ADM threshold in APP 1.7 is met, your privacy policy must state:

  • The kinds of personal information used in the operation of each automated decision-making program. For example: "Consultation audio recordings and clinical notes are used by AI scribe software to generate structured clinical documentation and billing codes."
  • The kinds of decisions made solely by automated means, without any human review before the decision takes effect.
  • The kinds of decisions where an automated program performs a function substantially and directly related to making the decision. For example: "Appointment urgency categorisation is substantially assisted by automated triage tools that analyse patient-reported symptoms."

The disclosure must be accurate and specific. A vague statement that "we may use technology to assist in some decisions" is unlikely to satisfy APP 1.8. Patients must be able to understand, from reading the policy, what personal information is being used and for what category of decision.

Your broader privacy policy obligations, including APP 11 security safeguards and the notifiable data breach scheme requirements, are covered in our guide to healthcare data breach obligations. The wider implications of the Privacy and Other Legislation Amendment Act 2024 for health practices are in our AI privacy compliance guide for healthcare.

A separate change introduced by the same Act, the new statutory tort for serious invasion of privacy, is covered in our guide to Australia's new privacy tort. That cause of action allows individuals to sue in court independently of the OAIC complaint mechanism and may be relevant where an automated tool produces an output that a patient argues was used to make an adverse decision about their care.

Use the compliance calendar tool to map the 10 December 2026 ADM deadline against your other privacy, accreditation, and workforce compliance obligations.

The Enforcement Risk

The OAIC received new enforcement powers under the Privacy and Other Legislation Amendment Act 2024. The most significant for ADM compliance is the power to issue infringement notices for failure to have a compliant privacy policy. Infringement notices allow the OAIC to impose civil penalties administratively, without commencing Federal Court proceedings. The amount is set within the low-tier penalty framework.

The low-tier civil penalty for failure to have a compliant privacy policy is up to $330,000 per contravention for corporations and up to $66,000 per contravention for individuals and other entities. For contraventions that constitute a serious interference with privacy, penalties escalate to the larger of $50 million, three times the benefit obtained from the contravention, or 30% of the entity's adjusted annual turnover during the breach period. The Federal Court imposed the first penalty under the reformed regime in 2025, ordering $5.8 million against an entity for serious privacy breaches, signalling that the OAIC will pursue enforcement action where warranted.

Healthcare is named in the OAIC's 2025-26 regulatory priorities as a high-risk sector for privacy enforcement alongside pharmacies and other health service providers. An ADM disclosure that is absent or materially inadequate in a healthcare privacy policy after 10 December 2026 is the kind of failure the OAIC has indicated it will investigate.

How ClinicComply Helps

ClinicComply's document library lets you store your updated privacy policy against the specific APP obligations it addresses, with version history and review dates tracked against the policy itself. You can assign the ADM technology audit as a task to the person responsible, set a completion date well before the December 2026 deadline, and track progress through the dashboard.

Automated reminders can flag the 10 December 2026 deadline at 90, 60, and 30 days out, giving your practice enough lead time to conduct the audit, draft the updated disclosure language, and have the policy reviewed before it must be published.

ClinicComply consolidates your privacy framework, clinical governance documents, and workforce records in one place. A single compliance review covers ADM disclosure requirements alongside APP 11 security obligations, notifiable data breach procedures, and consent documentation.

For downloadable privacy policy templates and compliance checklists, see the ClinicComply template library. For the full feature set, visit cliniccomply.com.au/features or start a free 30-day trial at cliniccomply.com.au/signup.

Frequently Asked Questions

Does the automated decision-making obligation apply to small GP practices?

Yes. Health service providers are explicitly excluded from the Privacy Act's small business exemption under section 6D(4) of the Privacy Act 1988 (Cth), regardless of annual turnover or practice size. A sole-practitioner GP clinic has the same ADM transparency obligations as a large hospital group.

What is the deadline for updating my privacy policy to disclose automated decision-making?

APPs 1.7, 1.8, and 1.9 commence on 10 December 2026. Privacy policies must include the required ADM disclosures from that date. Because identifying which systems qualify and drafting precise disclosure language takes time, practices should begin the assessment process no later than mid-2026.

Does an AI scribe count as automated decision-making?

It depends on how the scribe is used in your practice. If the AI scribe transcribes a consultation and produces a draft note that a clinician reviews and amends before accepting, the transcription function alone is unlikely to constitute ADM. If the scribe auto-suggests billing codes, diagnosis labels, or referral recommendations that the clinician routinely accepts without material review, the output substantially and directly assists decisions that affect the patient's care and financial position. That arrangement is likely to meet the ADM definition. The answer depends on the specific product and workflow.

What must my privacy policy say about automated decision-making?

Your privacy policy must identify: the kinds of personal information used in the operation of your automated decision-making programs; the kinds of decisions made solely by automated means; and the kinds of decisions where an automated program performs a function substantially and directly related to making the decision. Generic statements are insufficient. The disclosure must be specific enough that a patient can understand what data is being used and for what category of decision.

Does the obligation apply if a human reviews the automated output before acting?

It can. Decisions made solely by automated means must be disclosed, and so must decisions where an automated program substantially and directly assists a human to make a decision under APPs 1.8 and 1.9, even if the final decision rests with a person. The obligation is not limited to fully automated decisions. If a clinician's or administrator's decision is substantially driven by an automated output, the arrangement falls within the scope of the new provisions.

What are the penalties for failing to have a compliant privacy policy after December 2026?

The low-tier civil penalty for failure to have a compliant privacy policy is up to $330,000 for corporations and up to $66,000 for individuals and other entities per contravention. The OAIC can issue infringement notices to impose these penalties without court proceedings. For contraventions that constitute a serious interference with privacy, penalties can reach the larger of $50 million, three times the benefit obtained, or 30% of adjusted annual turnover for the breach period.

Which healthcare tools are most likely to trigger the obligation?

The most common candidates in a general practice or allied health setting are AI triage tools that score urgency and determine booking allocation; clinical decision support software that flags high-risk patients or recommends treatment pathways; AI scribes that auto-generate billing codes or diagnosis suggestions; and billing algorithms that auto-select MBS item numbers from clinical note content. The test is whether patient personal information is used and whether the output substantially influences a decision that could significantly affect the patient.

Is this obligation the same as the new privacy tort introduced by the same Act?

No. The statutory tort for serious invasion of privacy and the ADM transparency obligations are distinct changes introduced by the same legislation. The privacy tort creates a civil cause of action individuals can bring in court, separate from the OAIC complaint mechanism. The ADM obligations require APP entities to update their privacy policies. Each has different triggers, different enforcement mechanisms, and different consequences for non-compliance.

30-day free trial, no credit card

Your next accreditation visit starts today.

Join Australian GP clinics and medical practices that have replaced spreadsheets and email threads with a single healthcare compliance platform. Your free trial starts the moment you sign up.

No credit card required
Australian data residency (Sydney)
Cancel anytime