AI Act: What Changes for Accountants and Consultants
A practical guide to EU Regulation 2024/1689 for professional firms. Obligations, deadlines and how to prepare.
With the AI Act and national implementing legislation, every professional engagement letter should include an AI clause. Here is why, what to write, and a ready-to-adapt template.
2026 is the year in which transparency about AI use stops being a professional courtesy and becomes a regulatory obligation. The AI Act is fully operational. National implementing legislation is in force across EU member states. And every professional engagement letter that does not mention AI use is increasingly an incomplete letter.
This is not alarmism. It is the logical consequence of a regulatory shift that many firms have monitored but not yet translated into concrete documents.
Risk 1 — Violation of Art. 50 AI Act (transparency). Art. 50 of the AI Act, applicable from 2 August 2026, strengthens transparency obligations for deployers: persons must be told when they are interacting with an AI system, and AI-generated content must be marked as such.
Risk 2 — Violation of national implementing legislation. National laws implementing the AI Act (e.g. Art. 13 of Italy's Law 132/2025, in force since 10 October 2025) already add specific requirements for professionals who use AI in professional services. Client disclosure requirements under these laws typically call for written notification identifying the AI systems used, the activities for which they are employed, and the manner of human oversight. No clause in the engagement letter means no evidence of compliance.
Risk 3 — Civil liability for AI errors. The most concrete risk: AI output that is not adequately supervised leads to a professional error. The client challenges the work, and the professional cannot demonstrate that the client was informed of the use of AI. Without that disclosure, the professional's defensive position is significantly weakened, regardless of who is right on the substance.
The clause below is adaptable to accountants, lawyers and HR consultants. The placeholders in square brackets must be replaced with the firm's specific details.
Use of artificial intelligence tools
In carrying out this engagement, [FIRM NAME] may use artificial intelligence (AI) tools to support activities such as [SPECIFIC DESCRIPTION: e.g. "regulatory and case-law research", "drafting document templates", "analysis of legal and financial data"]. AI tools currently in use include: [LIST: e.g. Microsoft Copilot, ChatGPT Enterprise].
Every output produced by AI systems is subject to review and validation by a qualified professional of the Firm, who assumes full professional responsibility in accordance with the applicable ethical and regulatory rules. The use of AI does not modify the fiduciary relationship between Firm and Client or the professional liability regime.
The Client has the right to request at any time specific information on the use of AI in their matter, including the systems employed and the manner of human oversight. Such information is provided within 10 working days of written request. The Client also has the right to request that specific activities be performed without AI assistance by notifying the Firm in writing.
Documentation relating to the use of AI is retained for [5] years as a matter of professional best practice and in line with the firm's general document retention policy. For any queries, the Client may contact [EMAIL OR FIRM CONTACT].
This clause covers all the elements required by regulation: identification of tools, description of the activity, human oversight, client's right to information, documentation retention.
Where to insert it: the clause fits best in the "Method of service delivery" section or as a standalone article before the liability section. It should not be relegated to footnotes or general terms and conditions — the regulations require that it be intelligible and accessible.
How to present it to the client: if the client asks questions, the answer is direct — "we use AI for some operational activities, but every output is reviewed by a professional. You are informed and can always ask for details." There is no need to be defensive about using AI: it is normal, transparent and governed.
If the client asks you not to use AI: this request must be assessed case by case. For some activities (regulatory research, case-law searches) AI is now integrated into the workflow in a way that is difficult to separate. For others (drafting highly personalised high-value documents) it is possible to exclude AI on request. Documenting the response is essential.
Contracts already signed will not include the AI clause. But the disclosure obligation applies regardless, from the moment AI starts being used on those clients' matters.
The practical solution is a separate communication sent to all active clients, with content similar to the clause. Email is sufficient, provided that evidence of sending is retained. The text can be concise:
"Dear Client, we are writing to inform you that [FIRM NAME] uses artificial intelligence tools to support some operational activities. Every AI output is supervised by a qualified professional. You have the right to request specific information. For details, please contact [EMAIL]."
No need to renew the contract. What is needed is documented evidence that the disclosure was provided.
National laws implementing the AI Act generally introduce for regulated professionals the obligation of written disclosure to the client when AI is used in delivering professional services. The disclosure must typically indicate: (a) the AI systems used, (b) the activities for which they are employed, (c) the manner of human oversight, (d) the client's rights in this regard.
This obligation applies to all regulated professionals — accountants, lawyers, HR consultants, notaries — and to any non-regulated professional who carries out reserved or para-reserved activities with AI.
Failure to disclose is not only a regulatory risk: it is an element that can be invoked in a professional dispute before the relevant regulatory body.
Updating the standard engagement letter template is a 30-minute task if you have a template to hand. The real work is taking an inventory of the AI tools actually in use, because the clause must be precise, not generic, and verifying that the listed tools are covered by adequate supplier contracts.
In our guides collection you will find the complete AI clause template with variants for accountants, lawyers and HR consultants, plus a communication template for existing clients.
Download the templates and explore the AI Act Ready service.
To understand the full regulatory framework that makes these contractual disclosures compulsory — including the risk classification system and the obligations that differ by AI system type — read our AI Act guide.
Is an AI clause in the engagement letter legally required?
Technically, the law requires the disclosure (national implementing legislation such as Art. 13 of Law 132/2025 in Italy, already in force, plus Art. 50 AI Act from August 2026), not necessarily a formal contractual clause. But including the disclosure in the engagement letter is the most practical and defensible way to comply: it creates written evidence, obtains the client's acceptance and covers potential future disputes.
What about contracts already signed?
Contracts already signed do not include the clause, but the disclosure obligation applies regardless. The practical solution is to send a separate communication to all active clients, informing them of the use of AI in the services relating to them. No need to renew the contract — a documented letter or email is sufficient.
Can the client prohibit the use of AI?
The client has the right to be informed, not the right to prohibit the professional from using AI (unless the system in question is high-risk and directly impacts the client). If a client objects to the use of AI on their data, the professional must assess case by case whether it is possible to deliver the service without AI or whether the engagement is incompatible.
Who should draft the clause?
Generally the professional themselves, adapting a standard template. No lawyer is needed for standard disclosure clauses. However, for firms handling particularly sensitive data (e.g. law firms on disputes, tax firms on complex corporate restructurings), legal review is advisable.