Case Study: How a [STATE] [CLIENT TYPE] Achieved CARF Crisis Intervention Accreditation and Strengthened Its Position in the [STATE] Crisis System

Last updated: April 2026

A [SIZE: e.g., mid-sized community behavioral health center / regional crisis services organization / CCBHC] in [STATE] engaged IHS to guide their [PROGRAM DESCRIPTION: e.g., mobile crisis team / psychiatric emergency service / urgent care behavioral health program] through CARF Crisis Intervention Accreditation. Within [TIMELINE] months, they achieved [ONE-YEAR / TWO-YEAR / THREE-YEAR] CARF accreditation on their first attempt — satisfying [STATE AGENCY / FUNDER / 988 NETWORK] requirements and positioning the organization as the quality leader in [STATE / REGION]'s behavioral health crisis system.

Background

[CLIENT TYPE] is a [DESCRIPTION: e.g., non-profit community behavioral health organization / CCBHC / behavioral health authority / health system] serving [POPULATION / GEOGRAPHY: e.g., a five-county rural region / an urban metro area with approximately X residents]. Their crisis services portfolio includes [PROGRAM DESCRIPTION: e.g., a 24/7 mobile crisis team operating across three counties, a co-located urgent care behavioral health walk-in program, and an emergency department partnership with the regional hospital providing embedded psychiatric emergency services].

[CLIENT TYPE] had been operating crisis services for [X] years and had [ACCREDITATION HISTORY: e.g., no prior CARF accreditation history / held a prior CARF accreditation that had lapsed / held state certification but not external CARF accreditation]. The decision to pursue CARF Crisis Intervention Accreditation was driven by [DRIVER: e.g., a state behavioral health authority requirement tied to contract renewal / 988 Lifeline network participation requirements / a board-directed quality initiative / an opportunity to differentiate the organization in a competitive contract environment].

The Challenge

[CLIENT TYPE] faced a specific accreditation challenge: [SUMMARY OF CHALLENGE — e.g., their mobile crisis team was clinically strong but had never built the formal compliance infrastructure CARF accreditation requires. Frontline staff were skilled at crisis response, but the organization had no documented competency assessment system, no functioning quality improvement program with outcome data, and no individualized safety planning documentation process — crisis encounters were managed with clinical skill but documented inconsistently]. Their challenges included:

  • No individualized safety planning documentation: [DESCRIPTION — e.g., Mobile crisis team clinicians completed safety planning in the field, but documentation was inconsistent — some encounters had detailed notes, others had brief entries. There was no standardized format, and plans were not consistently individualized to each person's specific situation, history, and support network. CARF's requirement for individualized safety plan documentation for every person served could not be met with the existing approach.]
  • Quality improvement program in name only: [DESCRIPTION — e.g., The organization collected basic output data — number of calls, response times, disposition — but had no formal QI structure. There was no designated QI committee, no documented leadership review cycle, and no evidence that data had ever driven a specific program change. CARF's expectation of an active improvement cycle with documented outcomes was not being met.]
  • Staff competency documentation gaps: [DESCRIPTION — e.g., Crisis team clinicians had strong clinical training histories but individual competency records were scattered across HR files and supervisor notes. There was no consolidated competency documentation system, no consistent assessment of crisis-specific skills (suicide risk assessment methods, de-escalation techniques, field safety protocols), and no annual competency review cycle aligned to CARF's requirements.]
  • Transition and linkage process gaps: [DESCRIPTION — e.g., Mobile team clinicians made follow-up referrals routinely, but the process was not documented in a trackable way. There was no warm handoff documentation standard, no tracking of whether referred persons actually connected with follow-up services, and no data on 30-day follow-up contact rates — a key CARF quality improvement metric.]
  • Mobile team operational policy gaps: [DESCRIPTION — e.g., Dispatch protocols, field staff safety check-in procedures, and law enforcement coordination protocols existed informally as shared practice among experienced staff, but were not documented in formal policies. If CARF surveyors asked to see written protocols for how the team manages high-acuity field encounters, there was nothing to show.]
  • Timeline pressure: [DESCRIPTION — e.g., The state behavioral health authority had notified all mobile crisis providers that CARF accreditation or equivalent external credentialing would be required for contract renewal in [TIMEFRAME]. Failure to achieve accreditation status would put the contract — and the organization's crisis services revenue — at risk.]

The IHS Approach

IHS deployed a [TIMELINE]-month engagement structured around four phases of CARF Crisis Intervention Accreditation preparation. Here is what we delivered in each phase.

Phase 1: Gap Assessment Against CARF Standards (Months 1–2)

IHS conducted a comprehensive gap assessment against all applicable Section 1 and Crisis Intervention program-specific CARF standards. The assessment included structured interviews with [CLIENT TYPE]'s leadership, program director, clinical supervisors, and frontline mobile crisis staff; review of existing policies, procedures, and documentation systems; examination of quality improvement data; and direct review of a sample of person-served encounter records. The output was a prioritized remediation roadmap organized into three tiers: immediate priorities requiring new infrastructure before any other work could proceed, standards requiring policy development or revision, and standards where [CLIENT TYPE] was already substantially conformant and needed only documentation refinement.

The gap assessment identified [NUMBER] standards requiring new or substantially revised documentation, [NUMBER] areas requiring operational process changes, and [NUMBER] areas where existing practice was strong but undocumented. Key findings included:

  • [SPECIFIC GAP FINDING 1 — e.g., Rights of persons served: informed consent process existed verbally in field encounters but was not documented consistently; no written rights acknowledgment was provided to persons served during mobile crisis encounters, creating a gap against CARF's rights standards]
  • [SPECIFIC GAP FINDING 2 — e.g., Safety planning documentation: 43% of sampled encounter records had individualized safety plan elements; the remaining 57% had generic or absent safety plan documentation]
  • [SPECIFIC GAP FINDING 3 — e.g., Quality improvement: no committee structure, no documented leadership review, no improvement-cycle evidence across any metric]
  • [SPECIFIC GAP FINDING 4 — e.g., Staff competency: clinical training records existed for 8 of 12 mobile team clinicians; field safety competency had never been formally assessed or documented for any staff member]
  • [SPECIFIC GAP FINDING 5 — e.g., Financial sustainability: operating budget was documented but multi-year financial projections and sustainability evidence required by Section 1 financial management standards were absent]

Phase 2: Policy and Program Infrastructure Development (Months 2–7)

IHS provided policy templates and program development frameworks across all applicable standard areas and advised [CLIENT TYPE]'s team on customizing each to their specific program model, geography, and population. Key deliverables and development areas included:

  • Individualized safety planning system: [DESCRIPTION — e.g., IHS provided a structured safety planning documentation framework — including evidence-based safety plan components, individualization requirements, documentation format standards for mobile encounters, and supervisor review protocols. The framework was designed for integration with [CLIENT TYPE]'s existing electronic health record system. Implementation included policy language, staff training guidance, and documentation audit protocols for supervisors.]
  • Quality improvement program design: [DESCRIPTION — e.g., IHS designed a functioning QI program structure for [CLIENT TYPE]'s crisis services — including QI committee charter and composition, quarterly data review cycle, specific crisis-relevant metrics (response times, linkage rates, 30-day follow-up rates, safety incidents, ED diversion), dashboard format, meeting agenda templates, and minutes framework with root cause analysis and follow-up accountability elements. The program was designed to be sustainable with existing staff rather than dependent on a dedicated QI director, a position [CLIENT TYPE] did not have.]
  • Staff competency framework: [DESCRIPTION — e.g., IHS developed a comprehensive staff competency documentation system covering all mobile crisis team roles — clinicians, peer specialists, supervisors. The system included role-specific competency assessment tools for suicide risk assessment methods (specifying the standardized tools in use), trauma-informed care practice, de-escalation techniques, field safety protocols, documentation standards, and co-responder coordination procedures. Annual competency review cycle and documentation requirements were built into the framework.]
  • Transition and linkage protocols: [DESCRIPTION — e.g., IHS developed a warm handoff documentation standard and referral tracking system for all mobile crisis encounters — specifying documentation requirements at time of referral, follow-up contact procedures within 24 and 72 hours, and outcome tracking. The protocol was designed to interface with [CLIENT TYPE]'s care coordination team and existing community partner relationships.]
  • Mobile team operational policies: [DESCRIPTION — e.g., IHS developed written policies for dispatch protocols and response time standards, field staff safety check-in procedures, vehicle and equipment standards, law enforcement and EMS coordination protocols, after-hours coverage documentation, and high-acuity encounter management — formalizing the practices that experienced staff had been applying informally.]
  • Rights of persons served infrastructure: [DESCRIPTION — e.g., IHS revised [CLIENT TYPE]'s rights policies for crisis encounter contexts — including mobile-appropriate informed consent procedures, rights acknowledgment documentation for field encounters, grievance procedure integration with crisis documentation, and privacy policies specific to field-based service delivery.]

Phase 3: Application Preparation (Months 7–8)

IHS prepared [CLIENT TYPE]'s CARF application, including the program scope definition, service delivery model description, staffing structure documentation, geographic coverage documentation, and organizational narrative, and reviewed the complete application package for accuracy and completeness before it was submitted to CARF along with the $995 application fee. Particular attention went to the program-type designation and scope language, ensuring the application accurately described [CLIENT TYPE]'s crisis intervention delivery model and did not inadvertently include scope that would require Crisis Stabilization standards.

Phase 4: Survey Readiness Preparation (Months 8–9)

IHS conducted [NUMBER] mock survey sessions with [CLIENT TYPE]'s team, including:

  • Leadership mock interview: [DESCRIPTION — e.g., Two-hour session with the Executive Director and Program Director covering Section 1 organizational standards — strategic planning, financial management, governance, and risk management. IHS provided post-session feedback on response quality, documentation referencing, and areas requiring additional preparation.]
  • Clinical staff mock interviews: [DESCRIPTION — e.g., Separate sessions with mobile crisis team clinicians and peer specialists covering program-specific standards — assessment protocols, safety planning practices, transition and linkage procedures, rights of persons served, and emergency response. IHS assessed staff confidence levels and identified individuals requiring additional preparation.]
  • Documentation readiness audit: [DESCRIPTION — e.g., IHS reviewed a sample of [NUMBER] person-served records against documentation standards — individualized safety plans, encounter documentation, rights acknowledgment, and referral records. Audit found conformance levels of approximately [X]% across sampled records, with targeted remediation in [SPECIFIC AREAS].]
  • QI program review: [DESCRIPTION — e.g., IHS reviewed three full QI committee cycles — meeting records, data dashboards, and improvement action documentation — against CARF expectations for an active improvement cycle. The QI program had been running for [X] months at the time of survey preparation and had [NUMBER] documented improvement cycles with measurable outcomes.]

The final pre-survey documentation audit confirmed [CLIENT TYPE]'s readiness across all standard areas. IHS delivered a written Survey Readiness Report identifying [NUMBER] minor documentation refinements to complete in the week before the survey.

The Survey

The CARF survey lasted [NUMBER] days and was conducted by [NUMBER] CARF surveyor(s). The survey included:

  • [SURVEY COMPONENT 1 — e.g., Leadership interviews with the Executive Director, Program Director, and Board Chair covering governance, strategic planning, financial management, and organizational risk management]
  • [SURVEY COMPONENT 2 — e.g., Clinical staff interviews with [NUMBER] mobile crisis team clinicians and [NUMBER] peer specialists covering program-specific assessment, safety planning, and transition and linkage practices]
  • [SURVEY COMPONENT 3 — e.g., Documentation review of [NUMBER] person-served records, QI program documentation, staff competency files, and organizational policies]
  • [SURVEY COMPONENT 4 — e.g., Operational review of dispatch records, field safety check-in logs, and law enforcement coordination documentation]
  • [SURVEY COMPONENT 5 — e.g., Exit conference with leadership reviewing surveyor findings and identifying any areas flagged for the Quality Improvement Report]

[SURVEY NARRATIVE — e.g., The surveyor noted the quality and specificity of [CLIENT TYPE]'s safety planning documentation as a program strength, citing the individualization of safety plans across all sampled records. The QI program was recognized as functioning and data-driven, with documented improvement cycles. One area for improvement was identified in [SPECIFIC AREA — e.g., financial sustainability documentation — the surveyor recommended strengthening multi-year projections] and was noted in the Quality Improvement Report as an area for continued development rather than a deficiency requiring corrective action.]

Results

[CLIENT TYPE] received [ONE-YEAR / TWO-YEAR / THREE-YEAR] CARF Crisis Intervention Accreditation [TIMEFRAME: e.g., [X] months after engaging IHS / on their first attempt]. The outcomes included:

  • Contract qualification: [RESULT — e.g., CARF accreditation status satisfied the [STATE] behavioral health authority's credentialing requirement for the crisis services contract renewal, securing [CLIENT TYPE]'s position as the designated mobile crisis provider for [REGION] through [YEAR]. The contract represented approximately $X in annual revenue.]
  • 988 network participation: [RESULT — e.g., CARF Crisis Intervention accreditation satisfied the 988 Lifeline network's external accreditation requirement, enabling [CLIENT TYPE] to formally join the [STATE] 988 mobile crisis response network and receive referrals from the 988 crisis line.]
  • Quality infrastructure that outlasts the survey: [RESULT — e.g., The QI program, safety planning documentation system, and staff competency framework built during the engagement are operational systems — not one-time accreditation artifacts. [CLIENT TYPE]'s Program Director reported that the QI dashboard has been used in every monthly leadership meeting since implementation, and response time data identified a dispatch protocol adjustment that reduced average response time by [X] minutes.]
  • Staff competency and confidence: [RESULT — e.g., The structured competency assessment process identified [NUMBER] mobile team clinicians who needed additional training in [SPECIFIC SKILL AREA]. Targeted training was completed before the survey, and post-training competency assessments confirmed skill proficiency. Staff reported increased confidence in their documentation practices and a clearer understanding of the rights and safety planning standards they are implementing in every encounter.]
  • Organizational differentiation: [RESULT — e.g., [CLIENT TYPE] is now one of [NUMBER] CARF-accredited mobile crisis programs in [STATE], a distinction that supported successful responses to [NUMBER] competitive RFPs for crisis services contracts issued in the [TIMEFRAME] following accreditation.]
  • CCBHC compliance support: [RESULT — if applicable: e.g., CARF Crisis Intervention accreditation documentation substantially supported [CLIENT TYPE]'s CCBHC certification renewal survey, with CARF-compliant policies and QI documentation directly satisfying several CCBHC crisis service quality requirements.]

Key Lessons for Crisis Programs Pursuing CARF Accreditation

[CLIENT TYPE]'s experience surfaces four lessons applicable to any mobile crisis team or crisis intervention program beginning the CARF accreditation process.

Individualization Is the Standard, Not the Exception

The single most common survey deficiency in crisis intervention programs is using boilerplate or semi-standardized safety plan documentation rather than genuinely individualized plans. CARF surveyors read person-served records during the survey and compare them. Plans that look identical across different persons served — even when the template is sophisticated — are flagged. Building a documentation culture of individualization before the survey is the most important preparation investment a crisis program can make.

Quality Improvement Must Be a Living System

A QI program that exists on paper but cannot demonstrate an active improvement cycle — data reviewed by leadership, specific program changes made in response, outcomes tracked — will not pass a CARF survey. The QI program needs to have been running and producing documented cycles for several months before the survey. Starting the QI program six weeks before the survey and trying to backfill cycles does not work. Build it early and let it accumulate real cycles.

Field Staff Must Be Survey-Ready, Not Just Clinically Skilled

Mobile crisis team clinicians are often highly skilled at crisis response but unprepared for a CARF surveyor's interview questions. "How do you document informed consent in a field encounter?" and "Walk me through your safety planning documentation process" are questions unlike anything asked in a clinical supervision session. Mock interviews expose gaps in how staff articulate their practice against standards — and those gaps can be closed with preparation. The staff who demonstrate the most difficulty in mock interviews are not necessarily the weakest clinicians; they are often the most experienced staff who have never been asked to translate their practice into compliance language.

Informal Practice Needs to Become Written Policy Before the Survey

Many experienced crisis programs have strong operational practices that have never been written down — dispatch protocols, law enforcement coordination procedures, field check-in systems — because they evolved through shared practice among experienced staff. CARF surveys require written policies. Everything the team does well needs to be documented before the survey so surveyors can see it, not just hear about it in interviews.

Ready to Begin Your CARF Crisis Intervention Accreditation?

IHS guides mobile crisis teams, psychiatric emergency services, urgent care programs, and CCBHC crisis services through every phase of CARF Crisis Intervention Accreditation — from gap assessment through three-year award. Thomas G. Goddard, JD, PhD, former COO and General Counsel of URAC, leads every engagement.

Schedule a no-obligation discovery session. We will assess your program's current readiness and give you a clear picture of what the path to accreditation looks like for your specific program model.

Schedule a Free Discovery Session