Case Study: How a Multi-Site SUD Treatment Organization Achieved CARF Three-Year Accreditation Under the 2025 MIC Standards

Last updated: April 2026

Client details are presented in anonymized form consistent with IHS confidentiality obligations. Bracket placeholders indicate where client-specific data will be inserted prior to publication.

Client Overview

  • Organization type: [Multi-site SUD treatment organization / Community Mental Health Center / CCBHC]
  • Location: [State]
  • Programs in scope: [e.g., Intensive Outpatient Program (IOP), Medication-Assisted Treatment (MAT), Residential Treatment]
  • Number of sites: [X sites]
  • Patients served annually: [X]
  • Reason for pursuing CARF: [e.g., Ohio HB 33 state mandate / Medicaid contracting requirement / opioid settlement grant eligibility / competitive differentiation]
  • Prior accreditation status: [None / Previous accreditation lapsed / First-time applicant]
  • Engagement start date: [Month, Year]
  • Survey date: [Month, Year]
  • Outcome: CARF Three-Year Accreditation awarded

The Challenge

[Organization name] came to IHS [X months] before its target survey date facing [describe the specific compliance challenges — e.g., "a documentation infrastructure that had grown organically over 12 years without a unifying compliance framework"]. The organization's [X] programs across [X] sites each maintained separate documentation systems, with no centralized quality management function and no standardized treatment planning process.

Three specific challenges defined the engagement:

1. Measurement-Informed Care Implementation (Standard 2.A.12)

[Organization name] had no existing measurement-based care (MBC) infrastructure. The PHQ-9, GAD-7, and DAST-10 were not routinely administered, and the organization's EHR [system name] had no fields configured for systematic outcome data collection. Clinical staff had no training on MBC workflows, and the quality manager was not aware of the 2025 standard change when the engagement began.

CARF's 2025 Standard 2.A.12 is non-negotiable: organizations must demonstrate real-time use of validated psychometric tools to dynamically adjust treatment plans. A gap of this magnitude, discovered [X months] before survey, required immediate triage.

2. Treatment Planning Documentation Quality

A chart audit of [X] randomly selected patient files revealed that [X%] of treatment plans used boilerplate language that did not reflect the patient's individual voice, circumstances, or goals. Goals were not consistently written to SMART criteria. Treatment plan revision timelines were inconsistently documented, with [X%] of files showing revisions more than [X days] past required intervals.

3. Personnel Records and Competency Documentation

HR files for [X] of [X total] clinical staff were missing one or more required elements: primary source verification of licensure in [X] files; background check documentation in [X] files; competency-based training records (as distinct from attendance logs) in the majority of files reviewed. CARF surveyors audit HR files directly — incomplete records are a reliable source of conditions.

IHS's Approach

Phase 1: Gap Assessment and Triage (Weeks 1–3)

IHS conducted a comprehensive gap analysis against the 2025 CARF Behavioral Health Standards Manual across all applicable programs and sites. The gap report identified [X] deficiency categories, rated by severity and remediation timeline. The most urgent finding — the MBC infrastructure gap — was immediately escalated to organizational leadership with a remediation timeline that would allow the minimum six months of operational data collection required before survey.

Phase 2: MBC Infrastructure Build (Months 1–3)

IHS worked with [organization name]'s clinical leadership and IT team to configure [EHR system] data fields for PHQ-9, GAD-7, and DAST-10 administration. IHS developed clinical staff training materials explaining Standard 2.A.12's requirements — not just the mechanics of instrument administration, but the clinical reasoning that connects outcome data to treatment plan revision decisions. [X] clinical staff completed competency-based MBC training over [X weeks], with demonstration requirements that produced the documentation CARF surveyors would review.

Phase 3: Treatment Planning Remediation (Months 2–4)

IHS developed individualized treatment planning templates for each program type in scope — IOP, MAT, and residential — that embedded patient-voice language requirements and SMART criteria checklists at the structural level. Rather than relying on training alone to change documentation habits, IHS built the compliance requirement into the template architecture so that completing the template correctly produced compliant documentation automatically. [Organization name]'s clinical supervisors were trained on chart review protocols to identify non-compliant plans before they accumulated.

Phase 4: HR and Competency Documentation Remediation (Months 2–5)

IHS conducted a 100% audit of all [X] clinical staff HR files against the CARF personnel records checklist. Every deficiency was documented and assigned to a responsible staff member with a remediation deadline. Primary source verification was obtained for [X] licenses. Background check documentation gaps were resolved for [X] files. Competency-based training frameworks were implemented for [X] clinical procedure categories, replacing attendance-based training records with demonstrated competency documentation.

Phase 5: Mock Survey (Month [X])

IHS conducted a [X]-day mock survey across [X] sites, interviewing [X] staff members across clinical, administrative, HR, and leadership roles. The mock survey identified [X] remaining deficiencies requiring remediation before the formal survey. The most significant finding was [describe finding]. IHS produced a written remediation report with prioritized action items and provided [X weeks] of targeted support to close each identified gap.

Phase 6: Survey Preparation (Final 60 Days)

In the final 60 days, IHS completed application review (with final sign-off by Dr. Goddard before submission), prepared leadership for the surveyor entrance conference, and organized document production. Emergency drill documentation was completed across all shifts at all sites, all outstanding HR file deficiencies were confirmed resolved, and six months of MBC operational data were confirmed documented and accessible for surveyor review.

Outcome

[Organization name] received CARF Three-Year Accreditation following its [Month Year] survey. The survey outcome included:

  • [X] commendations from CARF surveyors, including specific recognition of the organization's [MBC implementation / treatment planning framework / other]
  • [X] Quality Improvement Plan items (all minor / none / describe) — below the [industry average / prior survey baseline]
  • No conditions requiring corrective action prior to accreditation award

Operational Impact

  • Medicaid contracting: [Organization name] secured [describe contract outcome — e.g., "eligibility for the state's 1115 SUD Waiver contracting program, adding X new patients to their caseload within 90 days of accreditation"]
  • Opioid settlement funding: [Organization name] qualified for [describe grant outcome]
  • State inspection frequency: [If Florida — reduced from annual to triennial DCF inspection]
  • Clinical quality: [Describe measurable MBC-driven outcome improvements, if available]
  • Staff competency: [Describe training infrastructure improvement]

Key Lessons for Organizations Pursuing CARF Accreditation in 2025–2026

Start MBC Implementation Before Everything Else

Standard 2.A.12's six-month minimum operational data requirement means MBC infrastructure must be the first thing built in any CARF engagement. Organizations that delay EHR configuration and staff training while working on other documentation domains will be unable to produce the operational data CARF surveyors will request, no matter how strong every other element of the application is.

Build Compliance Into Templates, Not Just Training

Training alone does not produce sustainable documentation change in high-volume clinical environments. The most effective intervention is structural: build SMART criteria requirements, patient-voice language prompts, and revision timeline alerts into the treatment planning template itself. When the template cannot be completed incorrectly, compliance becomes the path of least resistance rather than an additional burden.

Audit HR Files 90 Days Before Survey — Not 30

Primary source verification for [X] clinical licenses takes time that cannot always be compressed, and retrieving background check documentation from vendors or prior employers adds further delay. Starting the HR file audit 90 days before survey gives organizations enough runway to resolve every deficiency before the surveyor opens the first file.

Mock Survey Is the Most Accurate Predictor of Survey Outcome

The [X] deficiencies identified in IHS's mock survey were the same findings the CARF survey team would otherwise have found and cited in the organization's accreditation outcome. Mock survey investment translates directly into survey outcome improvement.

Is Your Organization Preparing for CARF Accreditation?

Schedule a no-obligation gap assessment with Thomas G. Goddard, JD, PhD. IHS will assess your current compliance posture against the 2025 CARF standards and give you a clear phased roadmap to Three-Year Accreditation.

Schedule Your CARF Gap Assessment