URAC MBC Designation Case Study — Building Clinical Outcomes Infrastructure for a Managed Behavioral Health Organization
Last updated: April 2026
This case study describes how IHS guided a mid-size managed behavioral health organization (MBHO) through the URAC Measurement-Based Care (MBC) Designation process. The organization held URAC Behavioral Health Accreditation and had informal MBC practices in place but lacked the documentation infrastructure, classification framework, and quality improvement integration required for Designation. IHS completed the engagement in five months — from gap assessment to Designation submission. Client identity is protected; operational details are representative of the engagement type.
Client Profile
The client was a regional MBHO administering behavioral health benefits for three commercial health plan clients and one state Medicaid managed care organization. The organization employed approximately 40 licensed clinical staff across outpatient, intensive outpatient (IOP), and care management service lines. It held URAC Behavioral Health Accreditation, renewed 18 months prior to the MBC engagement.
The MBHO had been administering PHQ-9 assessments at intake for several years — a practice originally implemented to satisfy HEDIS measurement requirements for depression screening. GAD-7 assessments were used informally by some clinicians but were not part of any standardized protocol. No other validated instruments were in routine use across service lines.
A payer RFP for network participation — a commercial health plan rolling out a new behavioral health carve-in product — included explicit MBC requirements: encounter-level assessment, documented score classification, and evidence of MBC integration into the quality improvement program. The MBHO's VP of Quality identified the URAC MBC Designation as the mechanism for demonstrating compliance and engaged IHS.
The Challenge
The organization had the instincts of measurement-based care without the infrastructure. Clinicians understood that assessment tools were useful, and the PHQ-9 was familiar. But the program as a whole had four structural gaps that would have made a URAC MBC Designation application unsuccessful.
Gap 1: Intake-Only Assessment Administration
PHQ-9 was administered at intake. That was it. For most patients, no subsequent PHQ-9 appeared in the clinical record — not at 30 days, not at 90 days, not at discharge. Clinicians conducted clinical interviews at every encounter, but symptom severity was not systematically measured after the first session.
URAC's MBC standard requires validated tool administration at each clinical encounter. Intake-only administration satisfies HEDIS depression screening requirements. It does not satisfy MBC standards. The gap was complete — not a documentation gap but an operational one. The workflow did not include encounter-level assessment.
Gap 2: No Symptom Severity Classification Framework
When PHQ-9 scores were recorded, they were recorded as raw numbers. No organizational policy defined what score thresholds constituted response, partial response, non-response, or remission. Clinicians applied informal clinical judgment to interpret scores — some used published cutoffs, some did not use cutoffs at all. The organization could not produce a consistent classification of any patient's trajectory because the classification standard did not exist at the organizational level.
URAC requires classification of symptom severity changes into clinically meaningful categories. Without a defined threshold framework, the organization's clinical records would not support the classification evidence URAC reviewers require.
Gap 3: No Documented Care Plan Linkage
Treatment plans were documented. Assessment scores, where they existed, were documented. But no policy required that treatment plan decisions be explicitly linked to assessment score trends. Clinicians made clinically appropriate decisions based on their overall evaluation of the patient — but those decisions were not documented in a way that traced the connection between a score trajectory and a treatment change.
URAC's MBC standard requires that assessment data inform individualized care plans. The evidence of that linkage must exist in the clinical record. Clinically appropriate care that is not documented in a way that shows the connection between score and decision does not satisfy the standard.
Gap 4: MBC Data Absent from the QI Program
The organization's URAC-compliant quality improvement program tracked process measures: appointment adherence, no-show rates, time-to-first-appointment, and care coordination touchpoints. These measures satisfied URAC Behavioral Health Accreditation QI requirements.
MBC outcome data — response rates, remission rates, non-response identification — did not appear anywhere in the QI program. Assessment scores were clinical records, not QI inputs. The program had no mechanism for aggregating assessment data, no performance measure built around clinical outcomes, and no QI reporting structure for MBC-derived indicators.
The IHS Approach
Phase 1: Gap Assessment and Remediation Roadmap (Weeks 1–4)
IHS began with a structured gap assessment mapping the organization's current clinical operations against each MBC Designation standard. The assessment included review of clinical policies, representative clinical records from three service lines, the existing QI program framework, and EHR configuration documentation.
Findings were organized into two tiers: operational gaps requiring workflow changes and documentation gaps requiring policy development. The four gaps described above were all operational — the organization could not build documentation around practices that did not yet exist at the required level. The remediation roadmap established a sequenced implementation plan: operational changes first, documentation second, so that by the time the mock review occurred, clinical records would reflect compliant operations for at least 60 days.
Phase 2: Instrument Selection and Encounter-Level Protocol Design (Weeks 4–7)
IHS worked with the organization's clinical leadership to expand the instrument set beyond the PHQ-9 and design an encounter-level administration protocol for each service line.
The instrument selection process prioritized three criteria: free availability (no per-use licensing cost), EHR integration status (instruments already present in the organization's EHR template library), and clinical appropriateness for the patient population by service line. The resulting instrument set:
- Outpatient mental health: PHQ-9 (depression), GAD-7 (anxiety), PCL-5 (PTSD for patients with trauma presentations)
- SUD outpatient and IOP: AUDIT-C (alcohol), DAST-10 (drug use), PHQ-9 (co-occurring depression, present in approximately 60% of the SUD caseload)
- Care management: PHQ-9 (depression), GAD-7 (anxiety), WHO-5 (wellbeing, for patients with complex co-occurring medical and behavioral presentations)
IHS provided a clinical assessment protocol specifying: which instrument to administer by service line and diagnosis, administration timing (patient self-completion in the waiting room or via the patient portal before each encounter), scoring procedure, documentation location in the EHR, and the clinical supervisor review trigger for scores indicating clinical deterioration.
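To make the protocol's shape concrete, here is a minimal sketch of the instrument assignments expressed as configuration, in Python. The instrument lists mirror the service-line assignments above; the dictionary keys, function name, and administration string are illustrative placeholders, not the client's actual EHR configuration.

```python
# Sketch of the protocol's instrument assignments as configuration.
# Keys and names are hypothetical; the instrument lists follow the
# service-line assignments described in this case study.

ADMINISTRATION = "patient self-completion before each encounter (waiting room or portal)"

INSTRUMENTS_BY_SERVICE_LINE: dict[str, list[str]] = {
    "outpatient_mh":      ["PHQ-9", "GAD-7", "PCL-5"],      # PCL-5 for trauma presentations
    "sud_outpatient_iop": ["AUDIT-C", "DAST-10", "PHQ-9"],  # PHQ-9 for co-occurring depression
    "care_management":    ["PHQ-9", "GAD-7", "WHO-5"],      # WHO-5 for complex presentations
}

def instruments_for(service_line: str) -> list[str]:
    """Return the validated instruments due at every encounter for a service line."""
    return INSTRUMENTS_BY_SERVICE_LINE[service_line]
```

In the actual engagement this logic lived in the EHR template library rather than in code; the sketch simply makes the protocol's structure explicit.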
Phase 3: Symptom Severity Classification Framework (Weeks 5–8)
IHS developed an organizational symptom severity classification framework defining standardized thresholds for each instrument across four categories: remission, response, partial response, and non-response. Thresholds were grounded in the published psychometric literature for each instrument and translated into clear operational language for clinical documentation purposes.
For the PHQ-9, the framework defined:
- Remission: Score of 4 or below sustained for two consecutive encounters
- Response: Score reduction of 50% or more from baseline maintained for two consecutive encounters
- Partial response: Score reduction of 25–49% from baseline
- Non-response: Score reduction of less than 25% from baseline after 8 weeks of treatment at adequate intensity
Equivalent frameworks were developed for GAD-7, AUDIT-C, DAST-10, PCL-5, and WHO-5. Each framework included a clinical review trigger: a defined score or trajectory that required documented clinician review and a treatment plan reassessment note within a specified number of business days. IHS provided EHR documentation templates embedding the classification language as structured fields rather than free-text entries — making consistent classification a default behavior rather than a discretionary one.
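To show how the PHQ-9 thresholds above reduce to deterministic classification logic, here is a minimal sketch in Python, assuming a simple chronological score list. The function name, category strings, and the `weeks_in_treatment` parameter are illustrative, not the organization's EHR implementation.

```python
# Sketch of the PHQ-9 classification framework as deterministic logic.
# Thresholds follow the framework above; names are hypothetical.

REMISSION_CUTOFF = 4        # PHQ-9 score of 4 or below
RESPONSE_REDUCTION = 0.50   # 50% reduction from baseline
PARTIAL_REDUCTION = 0.25    # 25% reduction from baseline
NONRESPONSE_WEEKS = 8       # weeks of treatment at adequate intensity

def classify_phq9(baseline: int, scores: list[int], weeks_in_treatment: int) -> str:
    """Classify a PHQ-9 trajectory. `scores` is chronological, most recent last."""
    if len(scores) >= 2:
        last_two = scores[-2:]
        # Remission: score of 4 or below sustained for two consecutive encounters.
        if all(s <= REMISSION_CUTOFF for s in last_two):
            return "remission"
        # Response: >= 50% reduction from baseline at two consecutive encounters.
        if all(baseline - s >= RESPONSE_REDUCTION * baseline for s in last_two):
            return "response"
    reduction = (baseline - scores[-1]) / baseline if baseline else 0.0
    if reduction >= PARTIAL_REDUCTION:
        return "partial response"
    # Non-response attaches only after 8 weeks of adequate-intensity treatment.
    if weeks_in_treatment >= NONRESPONSE_WEEKS:
        return "non-response"
    return "not yet classifiable"

# Example: baseline 18, trajectory 18 -> 14 -> 8 -> 7 at week 10
# classify_phq9(18, [18, 14, 8, 7], 10) returns "response"
```

One deliberate reading in the sketch: a 50% drop observed at a single encounter is held at partial response until it is sustained at the next encounter, matching the "two consecutive encounters" requirement above.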
Phase 4: Care Plan Linkage Policy and EHR Template Design (Weeks 7–10)
IHS developed a care plan integration policy requiring explicit documentation of the relationship between assessment score trends and treatment plan decisions at each care plan review. The policy established that care plan reviews must include: current assessment scores, comparison to baseline and prior scores, current classification (remission/response/partial response/non-response), and a treatment plan decision narrative explicitly referencing the score classification.
To make compliance the path of least resistance, IHS advised the organization's EHR team on care plan template revisions that built the linkage documentation into the required fields of the care plan form. Clinicians completing the care plan template would encounter required fields for current score, classification, and treatment decision rationale — not a free-text section where linkage might or might not be documented.
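As a sketch of what required fields buy, the care plan review can be modeled as a structured record whose construction fails without the mandated elements. This is a hypothetical schema in Python; the field names are illustrative, and the validation mirrors the template behavior described above rather than any specific EHR.

```python
from dataclasses import dataclass
from datetime import date
from typing import Literal

Classification = Literal["remission", "response", "partial response", "non-response"]

@dataclass
class CarePlanReview:
    review_date: date
    instrument: str                 # e.g., "PHQ-9"
    current_score: int              # required field, not free text
    baseline_score: int             # comparison to baseline is mandatory
    prior_score: int                # comparison to the prior encounter
    classification: Classification  # structured field, not narrative
    decision_rationale: str         # must reference the classification above

    def __post_init__(self) -> None:
        # Mirror the required-field behavior of the EHR template: a review
        # cannot be saved without a rationale that names the classification.
        if self.classification not in self.decision_rationale:
            raise ValueError("decision rationale must reference the score classification")
```

The design point is the same one the template revision made: the record cannot exist without the linkage, so compliance is structural rather than discretionary.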
Phase 5: MBC Quality Improvement Integration (Weeks 8–12)
IHS worked with the organization's QI director to design three MBC-specific performance measures for integration into the existing URAC-compliant QI program:
- MBC Assessment Compliance Rate: Percentage of clinical encounters with documented validated instrument administration (target: 90% within 90 days of protocol implementation; 95% at 6 months)
- Clinical Response Rate: Percentage of patients reaching response or remission classification within 12 weeks of treatment initiation, tracked by service line and diagnosis
- Non-Response Action Rate: Percentage of non-response classifications with documented treatment plan reassessment within the required timeframe (target: 100%)
IHS provided QI reporting templates and a quarterly MBC outcomes report format incorporating these measures alongside the organization's existing process measure reporting. The QI director adapted the templates to the organization's existing QI committee reporting structure so that MBC outcomes appeared in the same format as established measures — integrated into the program, not presented as a parallel compliance exercise.
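To illustrate that a score becomes a performance measure only once the numerator and denominator are defined, here is a minimal sketch of the three calculations in Python, assuming a hypothetical encounter and patient record shape; all field names are illustrative.

```python
# Sketch of the three MBC measure calculations. Record shapes and field
# names are hypothetical, not the client's actual QI data model.

def assessment_compliance_rate(encounters: list[dict]) -> float:
    """Share of clinical encounters with a documented validated instrument."""
    assessed = sum(1 for e in encounters if e.get("instrument_score") is not None)
    return assessed / len(encounters) if encounters else 0.0

def clinical_response_rate(patients: list[dict]) -> float:
    """Share of eligible patients reaching response or remission within 12 weeks."""
    eligible = [p for p in patients if p["weeks_in_treatment"] >= 12]
    responders = [p for p in eligible if p.get("response_or_remission_by_week_12")]
    return len(responders) / len(eligible) if eligible else 0.0

def non_response_action_rate(non_responses: list[dict]) -> float:
    """Share of non-response classifications with a timely reassessment note."""
    acted = [n for n in non_responses if n.get("reassessment_documented_on_time")]
    return len(acted) / len(non_responses) if non_responses else 0.0
```

The arithmetic is trivial; the QI work is in the definitions, the data source specification, and the reporting cadence that surround it.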
Phase 6: Staff Training Program (Weeks 8–11)
IHS developed a staff training curriculum covering:
- The rationale for MBC and the outcomes evidence base
- Instrument-specific administration and scoring procedures for each tool in the instrument set
- Classification framework application, with case examples
- Documentation requirements under the new care plan template
- Clinical review triggers and the escalation protocol
- QI reporting roles and data submission requirements
The curriculum was delivered in two formats: a 90-minute live training session for all clinical staff (recorded for future onboarding) and a written competency assessment requiring clinicians to correctly classify representative PHQ-9 and GAD-7 score sequences before being cleared to document MBC assessments in the EHR. IHS provided the training framework and competency assessment tool; the organization's clinical supervisor administered and documented training completion.
Phase 7: Mock Review (Weeks 14–16)
Sixty days after protocol implementation, IHS conducted a mock review of the organization's MBC documentation against each Designation standard. The review examined:
- A random sample of 25 clinical records across the three service lines, checked for assessment administration compliance, classification documentation consistency, and care plan linkage evidence
- The QI director's first MBC outcomes report, for completeness and measure calculation accuracy
- Staff training records, for documentation completeness
- The full policy set, for internal consistency and alignment with the classification framework and protocol
The mock review identified two documentation gaps: a subset of IOP clinical records where the care plan template had not been updated to the new version, leaving clinicians completing the legacy template without the required classification fields; and three records where the non-response trigger had been met but no treatment plan reassessment note appeared within the required window.
Both gaps were corrected before the application was submitted. The EHR team completed the template rollout to IOP within one week. The three non-response gaps were remediated with prospective corrective action documentation, and the escalation protocol was reinforced with the IOP clinical supervisors. A follow-up spot check two weeks later confirmed the corrections held.
Outcome
IHS submitted the formal MBC Designation application to URAC in week 20 of the engagement — five months from gap assessment to submission. URAC issued the MBC Designation without a request for additional information. The Designation attached to the organization's existing URAC Behavioral Health Accreditation.
At submission, the organization's operational MBC data — 60 days of encounter-level assessment compliance — showed:
- Assessment compliance rate of 94% across outpatient and care management service lines (91% in IOP, reflecting the later template implementation)
- PHQ-9 clinical response rate of 61% among patients with 8+ weeks of treatment, a figure the organization could not have produced under its prior intake-only measurement, which yielded no comparable data
- 100% non-response action rate — every non-response classification with a documented treatment plan reassessment within the required window, after the two IOP gaps were corrected
The payer that had triggered the engagement — the commercial health plan with MBC requirements in its RFP — accepted the URAC MBC Designation as satisfying its MBC documentation requirement. The organization was awarded network participation in the new behavioral health carve-in product.
What the Organization Built
Beyond the Designation itself, the engagement produced durable clinical infrastructure: an encounter-level assessment protocol across three service lines, a validated symptom severity classification framework embedded in EHR documentation templates, a care plan integration policy requiring documented score-to-decision linkage, three new QI performance measures tracking clinical outcomes rather than process compliance alone, and a staff training curriculum with documented competency assessment for current and future clinical staff.
This infrastructure does not expire with the Designation. It operates as the organization's ongoing clinical outcomes tracking system — producing the data that supports future payer contract negotiations, QI reporting, and accreditation renewals.
Lessons for Organizations Pursuing the URAC MBC Designation
Informal MBC practices are not a head start if they are not documented
This organization had been administering PHQ-9 assessments for years. That history did not accelerate the Designation timeline — because the assessments were intake-only and unclassified, they were not evidence of MBC compliance. The Designation timeline is driven by when compliant operations begin, not when assessment tools were first introduced. Organizations with existing tool use need to evaluate compliance gaps honestly rather than assuming existing practice satisfies the standard.
EHR template design is a compliance intervention
The single most effective compliance tool in this engagement was not a policy — it was a required field in the clinical documentation template. When classification and score linkage appeared as required fields rather than optional narrative sections, documentation compliance followed naturally. When the legacy IOP template remained in place without those fields, compliance gaps appeared. Template design is not a technology decision — it is a clinical quality decision.
QI integration requires deliberate design, not data export
MBC data does not automatically become QI data. The clinical records contained PHQ-9 scores after implementation — but scores in clinical records are not performance measures until someone has defined the measure, established the denominator, specified the data source, and built a reporting cadence. IHS designed the measures before implementation so that the QI reporting framework was operational by the time the application was submitted. Organizations that implement MBC clinically and then try to build QI reporting around it after the fact will find the Designation application delayed by the QI integration lag.
The non-response trigger is where accreditation reviewers look first
The MBC standard's clinical intent is to prevent clinical inertia: the continuation of ineffective treatment in the absence of objective reassessment. URAC reviewers will look specifically for evidence that non-response classifications prompted documented clinical action. Both gaps identified in the mock review converged on this point: three records met the non-response trigger without a timely reassessment note, and the legacy IOP template omitted the classification fields that feed the trigger. Organizations should design the escalation protocol and the documentation requirement for non-response action before any other aspect of the MBC program, because it is the operational test that URAC most carefully evaluates.
How IHS Can Help Your Organization
IHS guides behavioral health organizations through every phase of the URAC MBC Designation process — from gap assessment through Designation submission. Thomas G. Goddard, JD, PhD — former Chief Operating Officer and General Counsel of URAC — leads IHS's URAC consulting practice. The institutional knowledge of how URAC standards are written and applied is the foundation of every IHS engagement.
If your organization holds URAC accreditation and delivers behavioral health services to MH/SUD populations, the MBC Designation is a demonstrable clinical quality signal that payers, employer purchasers, and state behavioral health authorities are increasingly using to evaluate network quality. IHS has the standards expertise, the clinical policy infrastructure, and the URAC process knowledge to get your organization there efficiently.
Ready to Get Started?
Schedule a no-obligation consultation with IHS. We will assess your organization's current MBC infrastructure, identify the gaps between your current operations and URAC's Designation standards, and give you a clear roadmap to submission.