The Arkansas Supreme Court issued a formal reprimand to attorney Dana McClain on Feb. 5 after she filed legal pleadings containing case law and statutes generated by artificial intelligence, marking another instance of AI-related misconduct in the legal profession.
The court's action stems from a show-cause order issued Dec. 11, 2025, in *Arkansas Department of Human Services v. April Ward and Minor Child* (No. CV-25-758). During the court's review of the expedited matter, justices discovered that McClain's response brief contained citations to cases that could not be located, including direct quotes from non-existent legal precedents.
McClain, who served as an attorney ad litem in the case, admitted to using Microsoft Office Copilot to generate legal research. She specifically entered queries such as "can you provide case law to support this argument with citations from Arkansas dependency-neglect law" and "can you provide statutes and case law that state this with citations." The AI tool generated citations to cases that did not exist.
In her response to the court's show-cause order, McClain acknowledged falling short of her professional obligations. She explained that she intended to verify the accuracy of the AI-generated citations but failed to do so due to what she described as "extraordinary personal circumstances." McClain characterized her actions as "a regrettable decision and lack of oversight made under significant personal strain."
The attorney took immediate corrective action following the court's inquiry. McClain resigned from her position as an attorney ad litem and self-reported the incident to the Arkansas Supreme Court's Office of Professional Conduct. She also accepted self-imposed sanctions, which the court approved in its Feb. 5 order.
The court emphasized that attorneys must ensure their legal arguments are "warranted by existing law or a good faith argument for the extension, modification, or reversal of existing law." Although only partial text of the court's opinion is available, it indicates the justices were applying fundamental professional responsibility standards that require lawyers to verify the accuracy of their legal citations.
McClain confirmed that she did not upload any sealed juvenile records to the AI platform, addressing potential confidentiality concerns in the dependency-neglect case. The underlying proceeding, brought by the Arkansas Department of Human Services concerning a minor child, is the kind of matter that typically involves sensitive family law records.
This incident reflects a growing trend of AI-related misconduct cases in courts across the United States. Legal professionals have increasingly turned to artificial intelligence tools for research assistance, but several high-profile cases have emerged where attorneys submitted AI-generated content without proper verification.
The Arkansas Supreme Court's handling of the McClain case demonstrates judicial awareness of AI technology's limitations and potential for generating false information. The court's decision to issue a formal reprimand, rather than pursue more severe sanctions, appears to reflect McClain's cooperation and immediate remedial actions.
The case highlights critical ethical considerations for attorneys using AI tools in legal practice. Professional responsibility rules require lawyers to exercise competence and diligence in representing clients, which includes verifying the accuracy of legal research and citations before filing documents with courts.
McClain's use of Microsoft Office Copilot illustrates how mainstream AI tools have become among legal practitioners. However, the incident underscores that these technologies, while potentially useful for generating initial research directions, cannot replace traditional verification methods and professional judgment.
The court's order serves as a warning to the legal profession about the risks of relying on AI-generated content without proper verification. Attorneys remain personally responsible for the accuracy of all pleadings and briefs filed with courts, regardless of the tools used in their preparation.
The Arkansas Supreme Court's action in this matter provides an early indication of how courts may address similar AI-related misconduct. The combination of a formal reprimand and acceptance of self-imposed sanctions suggests that cooperation and prompt corrective action can shape judicial responses to such incidents.
Legal practitioners across Arkansas and other jurisdictions will likely view this case as a cautionary tale about AI use in legal practice. The incident emphasizes the continuing importance of traditional research verification methods and professional oversight, even as technology transforms legal practice.
The court's swift action in addressing the AI-generated citation issue demonstrates judicial commitment to maintaining the integrity of legal proceedings and ensuring that technological advancement does not compromise professional standards.
