Professional Judgment
AI can draft a memo, summarize a return, and categorize a transaction. It cannot sign your return, defend you in an audit, or take fiduciary responsibility for what it produces. Here is what still requires a human.
I use AI every day. ChatGPT, Claude, Intuit Assist, Sage’s automation features, the categorization engine inside QuickBooks — all of it. The tools have improved dramatically over the past two years, and any CPA who claims they don’t use them is either being dishonest or being left behind. AI now drafts client emails, summarizes long contracts, surfaces footnote disclosures from 200-page 10-Ks, and proposes journal entries faster than I can. That is genuinely useful, and the productivity gains are real.
And yet, the question keeps coming up from business owners: do I still need a CPA, or can I just use ChatGPT?
The short answer is yes, you still need a CPA. The longer answer — the one that actually matters — is about understanding the line between what AI can do and what AI fundamentally cannot do, no matter how good the model gets. That line is not about intelligence or speed. It is about responsibility, judgment, and accountability.
Modern AI is excellent at pattern recognition, summarization, drafting, and the mechanical execution of well-defined tasks. It can categorize the vast majority of bank transactions correctly. It can read a vendor invoice and pull out the line items. It can draft a memo explaining the difference between a Section 179 deduction and bonus depreciation. It can summarize a 50-page partnership agreement in two paragraphs and flag the unusual provisions. It can take a financial statement and write a coherent management discussion. For the routine, the repeatable, and the previously documented, AI is a force multiplier — and a CPA who is not using it is delivering less value for the same fee.
It is also genuinely useful for first-pass research. A question about how a specific Texas franchise tax provision applies to a particular industry can be answered in thirty seconds, with citations, instead of an hour of manual research. The first draft of a tax planning memo, an engagement letter, or a board presentation can be generated and refined in a fraction of the time it used to take. The economic impact of this is meaningful, especially for smaller firms that previously could not afford the staff hours these tasks required.
The list of what AI cannot do is shorter, but everything on it matters more than what AI does well.
AI cannot sign your tax return. The IRS requires a human preparer with a Preparer Tax Identification Number to sign the return and accept legal responsibility for its accuracy. ChatGPT does not have a PTIN. Intuit Assist does not have a PTIN. The AI inside any tax software you use is a tool the human preparer uses — not the preparer itself. When the IRS sends a notice asking why a particular deduction was claimed, the agency expects to hear from the human who signed the return, not from a chatbot. That signature is the embodiment of fiduciary responsibility, and no software can carry it for you.
AI cannot exercise judgment under ambiguity. Tax law and accounting standards are full of gray areas where the right answer depends on facts and circumstances that are not in any database. Is this consultant a contractor or an employee? Should this lease be capitalized? Is this expense ordinary and necessary, or is it a personal benefit? Does this revenue meet the criteria for recognition this period or next? AI will give you an answer to every one of these questions — confidently, in beautiful prose — but the answer is a probabilistic guess based on training data. It is not a judgment call by someone who understands the consequences of being wrong. A CPA weighs the materiality, the audit risk, the client’s broader situation, the precedent it sets, and the reasonableness of the position. AI weighs none of that, because it does not have a license at stake or a client to protect.
AI cannot defend you in an audit. If the IRS, a state tax authority, or an outside auditor raises a question about a position taken on your return or in your financial statements, you need a credentialed human who can speak to the rationale, produce contemporaneous workpapers, and stand behind the methodology. AI cannot represent you. It cannot file a Power of Attorney. It cannot sit across from a revenue agent and explain why the position taken was reasonable under the facts as they existed. A CPA can. That representation right is statutory, and it requires a license that AI does not have and cannot acquire.
AI cannot detect fraud or intentional misstatement. AI is good at flagging anomalies in well-defined data — a transaction that does not match historical patterns, a balance that does not reconcile, a journal entry posted on a weekend. It is not good at the broader skepticism that defines professional judgment. A CPA reviewing a set of books knows to ask why receivables jumped 40% but cash collections did not move. To question why a related-party transaction was structured the way it was. To notice that the timing of a large vendor payment lines up suspiciously with a quarterly tax filing. AI can produce a list of unusual items. A trained accountant knows which unusual items matter and what they might mean.
AI cannot take fiduciary responsibility. This is the deepest issue. When a CPA accepts an engagement, they take on a duty of care to the client that is enforceable under state law and the AICPA Code of Professional Conduct. That duty is the foundation of trust in the profession. It is why an audit opinion means something, why a tax return signed by a CPA carries weight with the IRS, and why a financial statement prepared by a credentialed professional is treated differently than one assembled by anyone else. AI cannot accept fiduciary responsibility because it cannot be held accountable. There is no malpractice insurance, no licensing board, no civil liability, no professional ethics review. If the AI is wrong, the consequences fall entirely on the user.
The most worrying pattern I see is business owners using AI for tax planning and bookkeeping decisions without understanding that the output is unverified. ChatGPT will tell you, with complete confidence, that a particular deduction is allowable. It will cite a Code section. It will explain the reasoning in clear prose. And occasionally, it will be wrong — either because the model hallucinated, because the citation refers to a provision that was repealed, or because the analysis missed a material fact that changes the answer. The owner, having received an authoritative-sounding response, files the return and discovers the error two years later when the IRS sends a notice. By then, the penalties and interest often exceed what a CPA would have charged to do it correctly the first time.
The same failure mode shows up in bookkeeping platforms that promise full automation. The AI categorization works most of the time, until a vendor changes its name, a refund posts as income, or a bank transfer gets recorded as revenue on both ends. The system does not know it has made an error. It confidently assigns the transaction to a plausible-looking account and moves on. By the time anyone notices, twelve months of compounding misclassifications have produced financial statements that look reasonable and are quietly wrong. Decisions made on bad data are worse than decisions made on no data, because false confidence is worse than honest uncertainty.
AI is a tool. A powerful one. The right model is to use AI to do everything it does well — mechanical work, first drafts, pattern recognition, research summaries — and to have a credentialed human review the output, exercise judgment on the ambiguous calls, and accept responsibility for the result. That is exactly how I work, and it is how every responsible CPA practice will work going forward. The question is not AI or CPA. The question is whether the human reviewing the AI output has the training, the license, and the accountability to catch what the AI gets wrong.
If the answer is no — if you are asking ChatGPT directly, or relying on a fully automated bookkeeping platform with no human oversight, or filing a return generated by software with no professional review — you are taking on a risk that the savings do not justify. The cost of a CPA review is small relative to the cost of a tax notice, an audit adjustment, or a year of misstated financials. The CPA does not exist to do the mechanical work. The CPA exists to know when the mechanical work is wrong, and to take responsibility when it matters.
The age of AI does not eliminate the need for a CPA. It changes what the CPA spends time on. Less data entry, more judgment. Less first-draft writing, more review and signoff. Less manual reconciliation, more skepticism about what the system is producing. The skill set is shifting, but the responsibility remains where it has always been: with the human who signs the work product and stands behind it.
If you are running a growing business and wondering whether your current setup — AI bookkeeping, online tax software, occasional check-ins with a CPA at year-end — is enough, the honest answer is that it depends on what you are doing. For a small, simple business with predictable transactions and no audit exposure, it might be. For a business with multiple entities, complex revenue recognition, payroll across states, lender covenants, or a transaction on the horizon, it is almost certainly not. The line between “automation is fine” and “you need professional oversight” is exactly where the consequences of being wrong stop being trivial.
If you want a straight answer about which side of that line your business is on, call us at 817-415-5563 or schedule a 30-minute consultation. We will tell you honestly — including whether you actually need us. AI is good. It is not a substitute for a human who can sign their name to the work.