AI automation in Canberra sits at the heart of Australia's federal AI governance stack. The DTA's Policy for the responsible use of AI in government (Version 2.0, effective 15 December 2025) and the APS AI Plan 2025 (released 12 November 2025) together set the operational obligations for any Canberra automation that touches government.
Mandatory requirements: under the policy, every agency must identify Accountable Officials (and notify the DTA within 90 days of the policy taking effect), publish a public AI transparency statement within six months, develop a strategic approach to AI adoption, establish operational governance, ensure designated accountability for each AI use case, and undertake risk-based actions at the use-case level. The Australian Government AI Assurance Framework, with its AI Impact Assessment tool, is the instrument for those use-case-level actions.
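The "risk-based, use-case-level actions" pattern can be sketched in code. This is a minimal illustration only: the risk factors, thresholds, and outcomes below are invented for the sketch and are not the DTA's published criteria, which live in the AI Impact Assessment tool itself.

```python
from dataclasses import dataclass

@dataclass
class AIUseCase:
    name: str
    affects_public: bool        # outputs directly affect members of the public
    automated_decision: bool    # system makes or materially informs a decision
    uses_personal_info: bool    # personal information is processed

def triage(use_case: AIUseCase) -> str:
    """Illustrative triage: route higher-impact use cases to a fuller
    assessment. Factors and cut-offs are assumptions, not DTA criteria."""
    score = sum([use_case.affects_public,
                 use_case.automated_decision,
                 use_case.uses_personal_info])
    if score >= 2:
        return "full-assessment"   # complete the AI Impact Assessment
    if score == 1:
        return "screening"         # lighter-touch review, log in register
    return "register-only"         # record the use case, no further action

chatbot = AIUseCase("citizen-facing chatbot",
                    affects_public=True,
                    automated_decision=False,
                    uses_personal_info=True)
print(triage(chatbot))  # full-assessment
```

The point of the sketch is the shape of the obligation: every use case gets recorded, and the depth of assurance work scales with its risk profile.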
The APS AI Plan 2025 adds Chief AI Officers, mandatory foundational AI literacy training for all APS staff, the sovereign GovAI Platform and GovAI Chat, and a new AI Review Committee that provides non-binding advice on sensitive and high-risk AI deployments.
Federally, Australia's AI Ethics Principles and the Privacy Act 1988 (with APP 1.7 commencing 10 December 2026) also apply. The ACCC's AI transparency statement confirms that the Australian Consumer Law applies to AI outputs, with maximum penalties of A$50 million per contravention.
Every Mindiam Canberra automation ships with full documentation against the DTA AI Assurance Framework, an AI use-case register entry, an Accountable Official briefing pack, and, for defence-adjacent clients, DISP-compliant security documentation. Automation in Canberra is a compliance activity as much as a technical one, which is why we embed compliance in the build rather than bolting it on afterwards.
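To make the register deliverable concrete, here is a hypothetical shape for a single use-case register entry. The field names are assumptions chosen for this sketch; agencies define their own register schemas, and the DTA has not published a mandated format.

```python
import json

# Illustrative AI use-case register entry. All field names and values
# are invented for this example, not a published DTA schema.
entry = {
    "use_case": "invoice-processing automation",
    "accountable_official": "nominated per DTA policy",
    "risk_rating": "medium",
    "impact_assessment_completed": True,
    "transparency_statement_ref": "agency public statement",
    "next_review_date": "2026-12-15",
}
print(json.dumps(entry, indent=2))
```

Keeping the register as structured data rather than a document makes it easy to report against, audit, and hand to an Accountable Official in a briefing pack.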