What GauntletScore checks
Case Citations
Verify case names, citations, and holding summaries against the CourtListener case database.
Quoted Holdings
Validate that quoted case holdings match the original case text rather than an inaccurate paraphrase.
Regulatory References
Check regulatory citations against eCFR and statute databases. Verify section numbers and subsections.
Statute Numbers
Ensure statute references (e.g., 18 U.S.C. § 1001) are accurate and currently valid.
Procedural Claims
Verify procedural requirements, jurisdictional assertions, and court rules compliance.
Mathematical Claims
Check damage calculations, statutory multipliers, and other quantified legal arguments.
How it works
Submit a Brief or Filing
Upload your legal document via API or web interface. We accept PDFs, Word documents, and plain text.
Adversarial Verification
Seven agents debate case citations, holdings, regulatory references, and procedural claims across four rounds.
Receive Verdicts
Get per-claim verdicts (VERIFIED / DEBUNKED / INCONCLUSIVE) with source citations and an overall score.
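A verdict payload like this can be consumed programmatically. Below is a minimal sketch in Python; the response shape and field names are illustrative assumptions, not GauntletScore's documented schema:

```python
# Hypothetical per-claim verdict records; the keys ("claim", "verdict",
# "confidence", "sources") are assumed for illustration only.
verdicts = [
    {"claim": "Cites Smith v. Jones, 512 F.3d 101 (9th Cir. 2008)",
     "verdict": "DEBUNKED", "confidence": 0.97,
     "sources": ["CourtListener query: no matching record"]},
    {"claim": "18 U.S.C. § 1001 criminalizes false statements",
     "verdict": "VERIFIED", "confidence": 0.99,
     "sources": ["uscode.house.gov"]},
]

# Triage: surface debunked claims first, highest confidence at the top.
flagged = sorted(
    (v for v in verdicts if v["verdict"] == "DEBUNKED"),
    key=lambda v: -v["confidence"],
)
for v in flagged:
    print(f'{v["verdict"]} ({v["confidence"]:.2f}): {v["claim"]}')
```

A reviewing attorney would typically start from this flagged list rather than reading verdicts in document order.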
Sample Analysis Walkthrough
The following walkthrough describes a representative analysis from the GauntletScore validation study. Company identity has been anonymized. All findings are drawn from actual tool-verified results.
Document Submitted
A 47-page legal brief was submitted for verification. GauntletScore's Gemini Flash extraction pass identified 127 verifiable claims in the document, including:
- 23 case citations (federal and state courts)
- 8 regulatory references (CFR sections)
- 4 damage calculations (arithmetic figures with stated methodology)
- 12 factual assertions about named corporate officers and their attributed roles
Total processing time: 7 minutes.
What the Engine Found
Citation Failures — 3 of 23 case citations
CourtListener queries returned no matching records for three cited cases. Two citations were structurally plausible — correct reporter format, plausible court and year — but did not correspond to any actual decision in the federal case database. The third used a valid reporter format but cited a volume and page number that predate the court's establishment. Emmy (Evidence Analyst) flagged all three as DEBUNKED; Pyrrho confirmed through an independent query. Verdict: DEBUNKED with 0.97 confidence.
These are the errors that have drawn Rule 11 sanctions in documented federal cases. They are invisible to proofreading, invisible to spell-check, and invisible to a single AI asked to review the brief — because the citations are grammatically and structurally correct. The case simply does not exist.
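This kind of check can be approximated externally: parse the citation into volume, reporter, and page, then query CourtListener's public REST API. A minimal sketch follows; the search endpoint and parameters shown are assumptions and should be verified against CourtListener's current API documentation:

```python
import re
import urllib.parse

# Parse a federal reporter citation like "512 F.3d 101" into parts.
CITE_RE = re.compile(
    r"(?P<volume>\d+)\s+(?P<reporter>[A-Za-z0-9.\s]+?)\s+(?P<page>\d+)"
)

def parse_citation(cite: str):
    """Return {'volume', 'reporter', 'page'} or None if unparseable."""
    m = CITE_RE.fullmatch(cite.strip())
    return m.groupdict() if m else None

def courtlistener_query_url(cite: str) -> str:
    # Assumed endpoint shape; check CourtListener's API docs before use.
    base = "https://www.courtlistener.com/api/rest/v4/search/"
    return base + "?" + urllib.parse.urlencode({"q": cite, "type": "o"})

parsed = parse_citation("512 F.3d 101")
print(parsed, courtlistener_query_url("512 F.3d 101"))
```

An empty result set from such a query is what turns a structurally plausible citation into a DEBUNKED verdict.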
Superseded Regulatory Reference — 1 of 8 CFR citations
One regulatory reference cited a CFR section that had been amended and superseded prior to the date the brief was filed. The eCFR query confirmed the current regulation differs materially from what the brief represented. Thomas (CFO/Compliance) flagged this as a potentially material misrepresentation in the context of the underlying regulatory argument.
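A point-in-time comparison like this can be sketched against the eCFR versioner API, which serves regulation text as of a given date. The endpoint shape below is an assumption based on eCFR's public documentation and should be confirmed before use:

```python
from datetime import date

def ecfr_full_url(as_of: date, title: int, part: str, section: str) -> str:
    # Assumed eCFR versioner endpoint: full title XML as of a date,
    # narrowed to one part and section. Verify against ecfr.gov docs.
    return (
        f"https://www.ecfr.gov/api/versioner/v1/full/"
        f"{as_of.isoformat()}/title-{title}.xml"
        f"?part={part}&section={part}.{section}"
    )

# Fetch the regulation as of the filing date, then as of today,
# and diff the two texts. Title/part/section here are illustrative.
filing_date_url = ecfr_full_url(date(2024, 3, 1), 17, "240", "10b-5")
current_url = ecfr_full_url(date.today(), 17, "240", "10b-5")
print(filing_date_url)
```

Diffing the two responses is what distinguishes "superseded" from merely "renumbered."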
Arithmetic Error — 1 of 4 damage calculations
The brief stated that total damages of $19.07M consisted of $13.05M in direct damages plus $2.81M in consequential damages. The mathematical proof engine calculated the actual sum as $15.86M — a $3.21M discrepancy. The claimed total was not supported by the stated components. Thomas flagged the error as Critical severity; Ada identified a consistent pattern in which the inflated total figure was used four additional times in the brief's damages argument.
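The arithmetic check is straightforward to reproduce from the figures above (amounts in millions of dollars):

```python
# Recompute the brief's damages total from its stated components.
direct = 13.05
consequential = 2.81
claimed_total = 19.07

actual_total = round(direct + consequential, 2)
discrepancy = round(claimed_total - actual_total, 2)

# The stated components do not support the claimed total.
assert actual_total != claimed_total
print(f"claimed {claimed_total}M, computed {actual_total}M, "
      f"off by {discrepancy}M")
```

A deterministic recomputation like this is why the proof engine can flag the error with certainty rather than a confidence score.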
Executive Attribution Errors — 2 of 12 factual assertions
Two factual assertions attributed specific corporate titles to named individuals. SEC EDGAR queries confirmed both individuals were associated with the named company, but their current titles differed from those stated in the brief. One had been promoted and held a different role; one had departed the organization prior to the events described.
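A title check of this kind can be sketched against EDGAR's public submissions endpoint, which returns a company's filing history as JSON keyed by CIK. The CIK used here is illustrative:

```python
def edgar_submissions_url(cik: int) -> str:
    # EDGAR's submissions endpoint expects the CIK zero-padded
    # to 10 digits.
    return f"https://data.sec.gov/submissions/CIK{cik:010d}.json"

url = edgar_submissions_url(320193)
print(url)
# A live check would fetch this URL (EDGAR requires a descriptive
# User-Agent header) and scan recent filings for the officer's
# current title, comparing it against the title asserted in the brief.
```

Titles drift faster than training data, which is why a live filing lookup beats any model's recollection here.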
Summary Output
| Category | Count | Verified | Debunked | Inconclusive |
|---|---|---|---|---|
| Case citations | 23 | 20 | 3 | 0 |
| Regulatory references | 8 | 7 | 1 | 0 |
| Damage calculations | 4 | 3 | 1 | 0 |
| Executive assertions | 12 | 10 | 2 | 0 |
| Other claims | 80 | 74 | 2 | 4 |
| Total | 127 | 114 | 9 | 4 |
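The summary table's internal arithmetic, per-row sums and column totals, can be checked mechanically:

```python
# Each row: (count, verified, debunked, inconclusive),
# taken from the summary table above.
rows = {
    "Case citations":        (23, 20, 3, 0),
    "Regulatory references": (8,  7,  1, 0),
    "Damage calculations":   (4,  3,  1, 0),
    "Executive assertions":  (12, 10, 2, 0),
    "Other claims":          (80, 74, 2, 4),
}
totals = (127, 114, 9, 4)

# Every row's verdicts must account for its full claim count.
for name, (count, verified, debunked, inconclusive) in rows.items():
    assert count == verified + debunked + inconclusive, name

# Column sums must match the Total row.
column_sums = tuple(sum(col) for col in zip(*rows.values()))
assert column_sums == totals
print("table internally consistent:", column_sums)
```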
A full audit transcript — 47 pages of structured debate across four rounds, with source citations for every finding — was returned alongside the score and a downloadable Ed25519-signed certificate.
Why This Matters
Each of the findings above is the kind of error that a competent attorney reviewing the brief under time pressure could miss. Case citations that look real are not cross-referenced against live databases in manual review. Arithmetic in a complex damages section is rarely independently recalculated. Executive titles for third-party companies are rarely verified against EDGAR filings.
GauntletScore does not replace the attorney's judgment about what these errors mean strategically or legally. It surfaces the factual discrepancies so that judgment can be applied to verified ground truth rather than AI-generated text accepted at face value.
GauntletScore provides assistive verification and is not a substitute for professional judgment.