Florida Bar Opinion 24-1 and AI: What Every Florida Attorney Needs to Know
Florida attorneys are asking a single question more often than any other in 2026: can we use generative AI without violating our ethical duties? The Florida Bar's Board of Governors answered that question in January 2024 with Florida Bar Ethics Opinion 24-1, the first comprehensive ethics guidance on generative AI from a state bar in the country. The opinion does not prohibit AI. It defines the conditions under which an attorney may use it.
This article walks through the four ethical duties Opinion 24-1 addresses, the practical compliance steps each one requires, and the architectural distinction (cloud vs. on-premise) that materially changes the analysis. Citations to the underlying Florida Bar Rules of Professional Conduct appear throughout.
What Opinion 24-1 Actually Says
Opinion 24-1 is structured around four core ethical obligations under the Florida Rules of Professional Conduct. The opinion treats generative AI not as a categorically prohibited technology, but as a tool whose ethical use depends on how the attorney deploys it.
The four obligations addressed are:
- Confidentiality of information under Rule 4-1.6
- Competence under Rule 4-1.1
- Candor toward the tribunal under Rule 4-3.3
- Supervision of nonlawyer assistance under Rule 4-5.3
The opinion concludes that an attorney may ethically use generative AI provided the attorney complies with each of these duties. The opinion is permissive in tone but specific in its requirements.
Confidentiality: Rule 4-1.6 and Third-Party AI Tools
Rule 4-1.6(a) prohibits a lawyer from revealing information relating to the representation of a client without the client's informed consent, unless an exception applies. Rule 4-1.6(e) requires lawyers to make reasonable efforts to prevent inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation.
Opinion 24-1 applies these rules to generative AI in a direct way. Inputting client information into a third-party AI tool that retains, uses, or trains on that input is a disclosure under Rule 4-1.6. The attorney must obtain the client's informed consent before doing so.
The practical implication is significant. Most cloud-based generative AI products, including general-purpose tools like ChatGPT and legal-specific tools like CoCounsel and Harvey, transmit user inputs to third-party servers. The terms of service for these products vary on the question of data retention and training. An attorney using one of these tools must, at minimum:
- Read the vendor's data retention and training terms before any client information is entered
- Verify that the vendor will not train models on the firm's inputs
- Obtain informed consent from the client if the tool will receive any information that could identify the client or the matter
- Consider whether the vendor's confidentiality posture is sufficient for the sensitivity of the data involved
The opinion does not require client consent for every conceivable use of AI. It requires consent when the use involves the disclosure of confidential client information to a third party. The threshold question is whether disclosure occurs.
The On-Premise Distinction
Opinion 24-1 is silent on the architecture of the AI system, but the rule it interprets, Rule 4-1.6, turns on the fact of disclosure. An AI system that processes data entirely on the firm's own hardware, without transmitting any inputs or outputs to a third-party server, does not effect a disclosure. No third party receives the data. No external retention occurs.
This is the operational distinction between cloud AI and on-premise AI. Cloud AI sends data outside the firm. On-premise AI does not. For purposes of Rule 4-1.6 analysis, the on-premise architecture removes the disclosure event that would otherwise trigger the consent requirement.
This is not a loophole. It is a straightforward application of the rule. Rule 4-1.6 governs disclosure to others. If no disclosure to others occurs, the rule's consent requirement is not triggered for that act of processing.
Competence: Rule 4-1.1 and the Duty to Understand the Tool
Rule 4-1.1 requires that a lawyer provide competent representation, which includes the legal knowledge, skill, thoroughness, and preparation reasonably necessary for the representation. The 2012 Ethics 20/20 amendments to Comment 8 of the corresponding ABA Model Rule 1.1 added that competent lawyers should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology. Florida adopted similar language.
Opinion 24-1 reads this duty as requiring an attorney using generative AI to understand:
- What the tool does and does not do
- The tool's known limitations, including hallucination (the generation of fabricated case citations or factual content)
- The need to verify any AI-generated work product before relying on it or filing it with a tribunal
- The data flow of the tool (where inputs go, where outputs come from, what is retained)
The competence duty is not satisfied by treating AI output as presumptively correct. Cases like *Mata v. Avianca, Inc.* (S.D.N.Y. 2023), in which counsel filed a brief containing fabricated citations generated by ChatGPT and were sanctioned, demonstrate the failure mode the competence duty is designed to prevent.
For document search systems specifically (the category Mi Assist Legal occupies), the competence inquiry is narrower. A document search system that returns answers grounded in the firm's own documents with source citations on every answer is verifiable in a way that a generative tool drawing on the broader internet is not. The attorney can confirm each cited source. But the duty to verify before relying remains.
Candor: Rule 4-3.3 and Disclosure to Tribunals
Rule 4-3.3(a)(1) prohibits a lawyer from knowingly making a false statement of fact or law to a tribunal. Opinion 24-1 connects this to AI output: if the attorney uses AI to generate work product that contains hallucinated case citations or factual statements, and the attorney files that work product without verification, the duty of candor is at risk.
Florida courts have begun to formalize this duty through standing orders. The 11th Judicial Circuit (Miami-Dade) and 17th Judicial Circuit (Broward) have issued orders requiring attorneys to disclose when AI was used in the preparation of filings and to certify that any AI-generated content has been verified for accuracy. Attorneys practicing in those circuits have an additional explicit disclosure obligation that supplements Rule 4-3.3.
This means a Florida attorney using AI in litigation practice now has three overlapping duties:
- Verify factual and legal accuracy of AI-assisted work product (Rule 4-3.3)
- Disclose AI use in filings where local court orders require (11th and 17th Circuit standing orders)
- Maintain professional responsibility for the work product regardless of how it was produced (Rule 4-1.1)
Disclosure rules are evolving. Other Florida circuits may adopt similar requirements. Attorneys should monitor circuit-level standing orders in addition to bar-level guidance.
Supervision: Rule 4-5.3 and AI as Nonlawyer Assistance
Rule 4-5.3 requires that a lawyer with managerial or supervisory authority make reasonable efforts to ensure that the conduct of nonlawyer assistants is compatible with the lawyer's professional obligations. Opinion 24-1 treats generative AI as a form of nonlawyer assistance for purposes of this rule.
The practical effect: an attorney who delegates a research, drafting, or document review task to an AI system has the same supervisory obligation as if a paralegal had performed the task. The attorney must:
- Review the AI's output for accuracy and completeness
- Take responsibility for the final work product
- Ensure that other firm personnel who use AI tools are trained on the same ethical obligations
For firms using AI document search, this means firm policy should designate which personnel can use the system, what oversight is required for AI-assisted output that leaves the firm, and how training records are maintained.
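The supervision points above can be encoded as a machine-checkable policy rather than left to memory. The sketch below is purely illustrative, assuming a simple in-house policy record; the user names, dates, and field names are invented, and a real firm would adapt the fields to its own procedures.

```python
# Hypothetical firm AI-use policy: who may use the system, whether an
# attorney review step is required before AI-assisted work leaves the
# firm, and a training record. All names and values are illustrative.
FIRM_AI_POLICY = {
    "authorized_users": {"jdoe", "asmith"},        # designated personnel
    "review_required_before_sending": True,        # Rule 4-3.3 / 4-1.1 verification step
    "training_completed": {"jdoe": "2026-01-15"},  # Rule 4-5.3 training record
}

def may_send_ai_assisted_work(user: str, reviewed_by_attorney: bool) -> bool:
    """Gate outbound AI-assisted work product on policy compliance."""
    policy = FIRM_AI_POLICY
    if user not in policy["authorized_users"]:
        return False  # not designated to use the system
    if user not in policy["training_completed"]:
        return False  # no training record on file
    if policy["review_required_before_sending"] and not reviewed_by_attorney:
        return False  # verification step not yet performed
    return True

print(may_send_ai_assisted_work("jdoe", reviewed_by_attorney=True))    # trained, reviewed
print(may_send_ai_assisted_work("asmith", reviewed_by_attorney=True))  # no training record
```

The design point is that each of the three gates maps to one of the duties discussed above, so a failed check identifies which obligation is unmet.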
A Plain Summary for Florida Attorneys
Opinion 24-1 establishes the following operational framework. An attorney using generative AI in Florida should be able to answer yes to each of the following questions before deployment:
| Question | Rule | Yes/No |
|---|---|---|
| Have I obtained client consent if my AI tool transmits client data to a third party? | 4-1.6 | Yes / Not applicable for on-premise |
| Do I understand what the tool does and its known limitations? | 4-1.1 | Yes |
| Have I established a verification step before any AI-assisted work is filed or sent? | 4-3.3, 4-1.1 | Yes |
| If I practice in the 11th or 17th Circuit, have I followed the AI disclosure orders? | Local rule | Yes |
| Have I supervised any firm personnel who use the AI tool? | 4-5.3 | Yes |
The framework is straightforward. The compliance burden depends primarily on the architecture of the tool. Cloud tools require client consent, vendor diligence, and ongoing monitoring of vendor practices. On-premise tools require the same competence, candor, and supervision discipline, but eliminate the disclosure question that drives the consent requirement under Rule 4-1.6.
What Mi Assist Legal Does
Mi Assist Legal is an on-premise AI document search system installed on a Mac Mini or server inside the firm's office. The system indexes the firm's own case files, contracts, pleadings, and correspondence. Attorneys ask questions in natural language and receive answers grounded in the firm's documents, with source citations on every answer.
Because the system processes data entirely on the firm's own hardware, no client information is transmitted to a third party in the course of routine use. This addresses the Rule 4-1.6 disclosure question directly. The system does not eliminate the attorney's competence, candor, or supervision duties under Opinion 24-1, but it removes the consent requirement that cloud AI tools introduce.
For Florida firms evaluating AI tools, the architectural question is the first question. The pages on how Mi Assist Legal works on-premise and on its security architecture describe the deployment in detail.
Frequently Asked Questions
Q: Does Florida Bar Opinion 24-1 prohibit attorneys from using ChatGPT or similar tools?
No. Opinion 24-1 does not prohibit any specific tool. It requires that an attorney using any generative AI tool comply with the four ethical duties identified: confidentiality, competence, candor, and supervision. For tools that transmit client information to a third party, this includes obtaining informed client consent under Rule 4-1.6.
Q: Do I need client consent to use an on-premise AI system?
The Rule 4-1.6 consent requirement is triggered by disclosure of confidential information. An on-premise system that processes data entirely on the firm's own hardware does not effect a disclosure to a third party. The other duties under Opinion 24-1 (competence, candor, supervision) still apply, but the consent requirement under Rule 4-1.6 is not triggered for that processing.
Q: What does the 11th Judicial Circuit's AI disclosure order require?
The 11th Judicial Circuit (Miami-Dade) issued a standing order requiring attorneys to certify on the face of any filing whether AI was used in its preparation, and if so, to certify that any AI-generated content has been verified for accuracy. The 17th Judicial Circuit (Broward) issued a substantially similar order. Attorneys practicing in those circuits should review the current standing orders before filing.
Q: Does Opinion 24-1 apply to AI-assisted document search, or only to generative AI used for drafting?
Opinion 24-1 addresses generative AI broadly. The four ethical duties apply to any AI use that involves client information or work product. Document search systems that retrieve and summarize content from the firm's own documents fall within the scope, though the practical compliance posture differs depending on whether the system is cloud or on-premise.
Q: Where can I read the full text of Opinion 24-1?
The Florida Bar publishes ethics opinions at floridabar.org under the Ethics Opinions section. Opinion 24-1 is available at floridabar.org/etopinions/opinion-24-1/.
---
This article is intended for general educational purposes for Florida attorneys evaluating AI tools. It does not constitute legal advice. Attorneys should consult the current text of Florida Bar Opinion 24-1 and the Florida Rules of Professional Conduct before making compliance decisions.
Mi Assist Legal
Private AI document search for Florida law firms.
Mi Assist Legal installs on a Mac Mini or server inside your firm. No cloud. No third-party access. Designed for Florida Bar Rule 4-1.6 and ABA Model Rule 1.6 compliance by architecture.
Book a Consultation