Positioning Australia as a Leader in Digital Economy Regulation - Automated Decision Making and AI Regulation
The Law Council is grateful for the opportunity to make a submission in relation to ‘Positioning Australia as a leader in digital economy regulation – Automated decision making and AI regulation – Issues Paper’ (Issues Paper), published by the Digital Technology Taskforce in the Department of the Prime Minister and Cabinet.
The Law Council’s submission is divided into two parts:
- it first addresses the issues which arise from regulating the use of artificial intelligence (AI) and automated decision making (ADM) by government agencies performing functions and powers under Commonwealth legislation; and
- then considers issues that arise from regulation which applies or is relevant to the use of AI and ADM by private entities.
In relation to public sector use of AI and ADM, new and amended Commonwealth Acts increasingly empower a senior Commonwealth official to arrange for the use of computer programs to exercise statutory powers and functions, including to make, and assist in making, administrative decisions in the stead of officials.
The Law Council considers that there are evident public benefits in the increased deployment of AI and ADM, where this is done appropriately and judiciously. Those benefits can include increased efficiency, consistency, and accountability in decision-making by government agencies. However, the Law Council considers that the threshold objective of regulation in relation to public sector use of AI and ADM is to ensure that it is employed consistently with administrative law principles which underpin lawful decision making – lawfulness, fairness, rationality, and transparency.
The Law Council considers that comprehensive regulatory reform is required to ensure that Commonwealth legislation which authorises the use of ADM and AI to exercise statutory powers is consistent with administrative law principles. Specifically, the Law Council recommends that:
(a) the Australian Government commission an audit of all current or proposed use of AI and automation to make or assist in making administrative decisions by or on behalf of Government agencies;
(b) legislative amendments be made to ensure that where it is intended that a statutory power be exercised by using ADM or AI, the statute expressly authorises the use of ADM or AI;
(c) all legislation which authorises the use of ADM and AI to exercise, or assist in the exercise of, statutory powers should:
(i) be consistent with regard to the types of powers which may be exercised by ADM or AI, and employ standard statutory language for expressing the power to use ADM or AI;
(ii) require that an assessment be undertaken of the suitability of the proposed automated system to exercise the statutory power, as a precondition to making arrangements for the use of AI or ADM;
(iii) require that all arrangements for the use of ADM be subject to ongoing governance by a multidisciplinary team to ensure they remain lawful and up to date, including auditing, testing and reporting obligations;
(iv) require that officials publish all arrangements for the use of ADM and any suitability assessment which underpins them, including sufficient information to enable a broad understanding of how the AI or ADM operates to produce lawful administrative decisions;
(v) require that any affected individual must be notified where there is significant use of automation, including AI, in making an administrative decision;
(vi) require that an automated decision be capable of being reduced to a statement of reasons explicable by a human, supported by a full audit trail of the decision-making path, for the purpose of enabling it to be reviewed by a tribunal or court, and that the person affected by the decision have a right to request such reasons; and
(vii) provide for the automated decision to be subject to review, preferably review by a human internal to the agency, and the person affected by the decision must be informed of that review avenue; and
(d) the Digital Technology Taskforce should:
(i) investigate the possible use of an algorithmic impact assessment to perform the suitability assessment referred to in paragraph (c)(ii) above; and
(ii) consider models for a regulatory body to oversee the proposed regulatory framework.
In relation to the regulation of private sector use of AI and ADM, the Law Council makes the following observations, assisted by the submissions of its constituent bodies and specialist committees:
- it generally supports technology neutrality as a key principle to underpin any new regulation governing the use of AI by private entities;
- it notes that there are opportunities to make clear the obligations and risks arising from commercially dealing with data in the review of the Privacy Act 1988 (Cth) (Privacy Act), the ‘consumer data right’ and legislative reform associated with electronic communication and data protection;
- it emphasises the critical importance of a harmonised approach being taken across regulators to ensure consistency, avoid duplication, and avoid fragmentation of regulation;
- it suggests, as a general principle, that regulatory frameworks applied to the use of AI by private entities should require any use of AI which affects individual rights to be transparent and subject to human oversight and review; and
- it considers that any approach to regulation of private sector use of AI must be consistent with internationally recognised guidelines and frameworks, where appropriate in Australia’s circumstances.