Last updated: April 2026

CARF Partial Hospitalization Accreditation: Case Study

Client details have been anonymized. Bracketed fields indicate where client-specific information will be inserted for published versions.

Client Profile

  • Organization type: [ORGANIZATION TYPE — e.g., freestanding behavioral health center / hospital-based day program / multi-site behavioral health organization]
  • Program: [PROGRAM DESCRIPTION — e.g., adult psychiatric partial hospitalization program / dual-diagnosis PHP serving adults with co-occurring mental health and substance use disorders]
  • Location: [STATE/REGION]
  • Accreditation sought: CARF Partial Hospitalization Program — initial three-year accreditation
  • Engagement trigger: [TRIGGER — e.g., payer network contract requirement / state licensing preference / organizational quality initiative / hospital referral partner requirement]

The Situation

[CLIENT NAME / "The organization"] had been operating its partial hospitalization program for [DURATION] and had established clinical processes that worked in practice. The problem was documentation: what the program did and what its records showed were two different things. A key managed care payer had added CARF accreditation as a network participation requirement, effective [DATE], creating a hard deadline for the engagement.

The program had never gone through a CARF survey. Leadership understood they needed to prepare but had limited familiarity with how CARF evaluates behavioral health programs — particularly the organizational standards that apply across all program types regardless of clinical specialty.

What the Gap Analysis Found

IHS conducted a structured gap analysis against CARF's Behavioral Health Standards for the partial hospitalization program designation. The assessment covered all applicable organizational and program-specific standards. The highest-priority findings fell into four clusters:

Outcome measurement — data collected, not used

The program administered a standardized symptom measure at admission and discharge for all persons served. The data existed. What did not exist: any aggregate analysis, trend report, or documented evidence that the data had influenced a program or organizational decision. CARF requires outcome data to drive quality improvement — not simply to be filed. The program's QI committee met regularly but discussed process issues rather than outcome trends. The connection between data collection and improvement cycles had never been formalized.

Strategic plan disconnected from performance data

The organization had a strategic plan, updated annually. It contained well-written goals. None of the goals referenced measurable outcomes tied to actual program performance data. CARF surveyors look for the feedback loop between data collection, analysis, and strategic goal-setting. A plan that reads like aspirational prose without measurable indicators anchored in real performance data will generate a finding.

Discharge planning timing

The program's clinical staff initiated discharge and step-down planning in the final week of a person's PHP stay — standard practice in many programs, but inconsistent with CARF's expectation that discharge planning begins at admission. The clinical rationale for PHP placement and the discharge trajectory should be documented in the initial assessment and treatment plan, not retroactively at the end of care.

Personnel file completeness

An audit of [NUMBER] personnel files identified gaps across [PERCENTAGE] of records: missing or expired license verifications, annual performance reviews not completed within required intervals, and clinical supervision documented only in individual supervisor notes, never consolidated into a formal system-level log.

What IHS Did

QI infrastructure build

IHS worked with the program's QI coordinator to restructure the outcome data analysis process. Rather than filing admission and discharge scores, the team built a quarterly trend report template that aggregated scores by program month, tracked average symptom severity at admission and discharge, and flagged variance from prior quarters for QI committee review. The template was designed to be maintained by existing staff without additional FTE — two hours per quarter of data entry and one structured QI discussion per quarter using a prepared agenda.
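For readers who want to see the shape of that quarterly logic, a minimal sketch follows. The field names, the "lower score = improvement" convention, and the 20% variance threshold are illustrative assumptions for this sketch, not details taken from the engagement's actual template.

```python
# Illustrative sketch of the quarterly trend logic described above.
# Field names, score direction, and the 20% variance threshold are
# assumptions for illustration, not details from the client template.
from statistics import mean

def quarterly_trend(records, prior_avg_change=None, variance_threshold=0.20):
    """Aggregate admission/discharge scores for one quarter and flag
    variance from the prior quarter for QI committee review.

    records: list of dicts with 'admission_score' and 'discharge_score'
             for persons discharged during the quarter.
    prior_avg_change: prior quarter's average score change, if known.
    """
    adm = mean(r["admission_score"] for r in records)
    dis = mean(r["discharge_score"] for r in records)
    avg_change = adm - dis  # assumes lower score = symptom improvement

    # Flag the quarter if the average change moved more than the
    # threshold (proportionally) relative to the prior quarter.
    flagged = (
        prior_avg_change is not None
        and prior_avg_change != 0
        and abs(avg_change - prior_avg_change) / abs(prior_avg_change)
            > variance_threshold
    )
    return {
        "n_discharged": len(records),
        "avg_admission": round(adm, 1),
        "avg_discharge": round(dis, 1),
        "avg_change": round(avg_change, 1),
        "variance_flag": flagged,
    }
```

The point of keeping the logic this simple is the staffing constraint noted above: a report an existing coordinator can produce in a spreadsheet in two hours per quarter, not a tool requiring new FTE.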

Strategic plan revision

IHS revised the strategic plan's goal structure to link each organizational goal to a specific performance measure tracked in the QI system. Goals that had been stated in qualitative terms were rewritten with measurable indicators, baseline values from the prior 12 months of program data, and target thresholds. The plan retained the organization's voice and priorities — the revision was structural, not substantive.

Treatment planning protocol update

IHS drafted an updated initial assessment protocol that required clinical staff to document: (1) the clinical basis for PHP-level placement using ASAM Level 2.5 criteria, (2) the anticipated step-down trajectory and target discharge level of care, and (3) measurable discharge criteria defined collaboratively with the person served. The change was integrated into the existing EHR documentation workflow rather than requiring a parallel paper process.

Personnel file remediation

IHS provided a personnel file audit checklist mapped to CARF standards requirements and worked with HR to close identified gaps: obtaining current license verifications, completing overdue performance reviews, and building a centralized clinical supervision log template that captured supervisor, supervisee, date, format, and content summary for each supervision contact.

Staff survey preparation

CARF surveyors routinely interview frontline staff and, separately, persons served. Staff who cannot accurately describe the program's grievance process, person-served rights, or how treatment plans are developed with the individual present will generate findings regardless of what the written policies say. IHS conducted a preparation session with clinical and administrative staff covering: what surveyors ask, how to answer accurately without over-elaborating, and how to handle questions about processes that are still being improved.

The Survey

The CARF survey took place [TIMEFRAME] after IHS was engaged. [NUMBER] surveyors conducted a [NUMBER]-day on-site review. Surveyors interviewed [NUMBER] staff members and [NUMBER] persons currently enrolled in the PHP. The document review covered [LIST KEY DOCUMENT AREAS].

The survey team identified [NUMBER] Areas for Improvement and [NUMBER] Recommendations. [DESCRIBE FINDINGS AT HIGH LEVEL — e.g., "Two of the three Areas for Improvement were in areas the gap analysis had flagged as elevated risk. The third involved emergency preparedness drill documentation — a gap the pre-survey audit had identified as lower priority and had not fully remediated in time."]

The Outcome

[OUTCOME — e.g., "The organization received Three-Year CARF Accreditation. The payer contract requirement was satisfied prior to the effective date. The organization submitted its Quality Improvement Plan addressing the cited Areas for Improvement within the required 90-day window."]

What This Engagement Demonstrated

The most common pattern we see in PHP accreditation engagements is the gap between what a program does clinically and what its documentation shows. Programs that deliver high-quality care often fail surveys not because the care is deficient, but because the evidence infrastructure — QI reports, personnel files, treatment plan documentation — has not been built to CARF's evidentiary expectations.

The four deficiency clusters in this engagement — outcome data not trended, strategic plan disconnected from data, discharge planning timing, and personnel file gaps — appear consistently across PHP surveys. Knowing where the risk concentrates allows preparation to be targeted rather than exhaustive.

The changes IHS implemented in this engagement were structural and sustainable. The QI reporting system, the revised treatment planning protocol, and the personnel file audit process became part of the program's ongoing operations — not a pre-survey sprint that would erode before the renewal cycle.

About IHS

Integral Healthcare Solutions provides accreditation consulting, compliance services, and program development for healthcare organizations nationwide. Thomas G. Goddard, JD, PhD — former COO and General Counsel of URAC — leads every client engagement. IHS works across 28 accreditation programs including CARF, URAC, NCQA, ACHC, and NABP.

IHS engagements are scoped to each client's organizational size, accreditation history, and complexity. Contact us for a tailored proposal.

Schedule a Free Discovery Session

If your partial hospitalization program is preparing for an initial CARF survey, a renewal, or a post-survey remediation plan, let's talk.