AI automation in Brisbane runs under two regulatory frames — federal and Queensland — both of which shape what data your automations can touch, how decisions get documented, and what you're liable for when something fails.
Federally, the relevant rules are Australia's AI Ethics Principles (eight voluntary principles — human, societal and environmental wellbeing; human-centred values; fairness; privacy protection and security; reliability and safety; transparency and explainability; contestability; accountability) and the Privacy Act 1988 as amended by the Privacy and Other Legislation Amendment Act 2024. The 2024 amendment is particularly relevant to automation: new APP 1.7 transparency obligations for automated decision-making commence on 10 December 2026. If your automation touches decisions that significantly affect customers — loan approvals, claims triage, eligibility assessments, dynamic pricing — APP 1.7 will require your privacy policy to disclose what kinds of decisions are automated and what personal information they use. For a Brisbane insurer, banker or lender, that's not optional.
The ACCC's AI transparency statement confirms that Australian Consumer Law prohibitions apply regardless of whether misleading output came from a human or an AI. Chatbot hallucinations in customer-facing automations can therefore breach the ACL's misleading-conduct provisions, with maximum penalties of A$50 million per contravention — or more, where three times the benefit obtained or 30% of adjusted turnover is greater. That exposure is why our builds include explicit guardrails, fallbacks and human-escalation paths for every customer-facing interaction.
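To make the guardrail pattern concrete, here is a minimal sketch of a reply filter that sits between a language model and the customer. All names, topics and the confidence threshold are illustrative assumptions, not Mindiam's actual implementation — the point is the shape: a raw model answer never reaches a customer when it touches a regulated decision or the model is unsure.

```python
from dataclasses import dataclass
from typing import Optional

# Topics where an automated answer could amount to a decision about the
# customer (ACL / APP 1.7 exposure). Hypothetical list for illustration.
REGULATED_TOPICS = {"loan approval", "claim decision", "eligibility", "pricing"}

@dataclass
class BotReply:
    text: str
    escalated: bool                 # True => routed to a human agent
    reason: Optional[str] = None    # why the guardrail fired

def guard_reply(model_text: str, confidence: float, topic: str) -> BotReply:
    """Apply guardrails before a chatbot reply reaches the customer."""
    if topic in REGULATED_TOPICS:
        # Decisions about the customer are always handed to a human.
        return BotReply(
            text="I'll connect you with a team member who can help with that.",
            escalated=True,
            reason=f"regulated topic: {topic}",
        )
    if confidence < 0.75:  # threshold is an assumed tunable, not a standard
        return BotReply(
            text="I'm not certain about that — let me check with a colleague.",
            escalated=True,
            reason="low model confidence",
        )
    return BotReply(text=model_text, escalated=False)
```

In practice the escalation branch would also write to an audit log, so every handoff is evidence for the governance documentation described below.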
At Queensland state level, Queensland signed on to the National Framework for the Assurance of AI in Government (June 2024). The Queensland Audit Office's 2024 AI ethics report sets public-sector expectations on AI risk management. Sector-specific strategies like the Department of Transport and Main Roads AI Strategy flow through procurement. If your Brisbane business sells to or partners with Queensland government, expect these frameworks in contract clauses — and Queensland private-sector enterprises are increasingly using them as their internal governance baseline too.
Every Mindiam automation build includes governance documentation: what the automation does, what data it processes, which decisions it makes autonomously versus hands off to humans, how failures are logged, and how it maps to the National AI Assurance Framework. For financial-services clients — the sector of Brisbane-headquartered institutions like Suncorp and BOQ — we layer on sector-specific compliance documentation, because APRA prudential standards apply alongside the general AI frameworks.
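The governance record described above can be captured as a simple structured document. The sketch below uses hypothetical field names — this is not a published Mindiam or framework schema, just one way to keep the five items machine-readable so they can be exported into a contract pack or audit response.

```python
import json
from dataclasses import dataclass, field, asdict
from typing import Dict, List

@dataclass
class AutomationGovernanceRecord:
    """One governance record per automation build (illustrative schema)."""
    name: str
    purpose: str                      # what the automation does
    data_processed: List[str]         # categories of data it touches
    autonomous_decisions: List[str]   # made without a human in the loop
    human_handoffs: List[str]         # decisions escalated to people
    failure_logging: str              # where and how failures are recorded
    # Assurance-framework principle -> evidence of how it's addressed.
    assurance_mapping: Dict[str, str] = field(default_factory=dict)

    def to_json(self) -> str:
        """Serialise for a contract pack or audit response."""
        return json.dumps(asdict(self), indent=2)

record = AutomationGovernanceRecord(
    name="invoice-triage-bot",
    purpose="Routes inbound supplier invoices to the right approver",
    data_processed=["supplier names", "invoice amounts", "ABNs"],
    autonomous_decisions=["routing to approver queue"],
    human_handoffs=["any payment approval", "disputed invoices"],
    failure_logging="errors written to central audit log with timestamps",
    assurance_mapping={"transparency": "routing rules documented in runbook"},
)
```

Keeping the record in code (or YAML/JSON) rather than a standalone Word document means it can be versioned alongside the automation it describes.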