URAC Employer-Based Population Health Accreditation: A Case Study in Closing the Gap Between Program Strength and Documentation Readiness

Last updated: April 2026

The following case study illustrates the IHS engagement model for URAC Employer-Based Population Health Accreditation. Client details have been anonymized. The gap patterns, preparation challenges, and outcomes described reflect real conditions IHS has encountered across multiple employer-based population health engagements.

The Situation

Organization Profile

A regional clinically integrated network (CIN) had developed a population health management program serving several large self-insured employers through direct employer contracting relationships. The network employed approximately 400 primary care and specialist physicians across a multi-county service area and had been operating its employer-sponsored population health program for three years.

The network's clinical leadership had built a substantive program: a dedicated care management team, a chronic disease registry, embedded care coordinators at primary care practice sites, and a behavioral health integration model connecting mental health providers with primary care for high-risk patients. Clinical outcomes data showed measurable improvements in diabetes management and hypertension control across the attributed employer population.

The Problem

Despite strong clinical performance, the network's employer clients — particularly one large regional employer considering renewing a multi-year direct contract — were asking a question the network could not answer with independent evidence: how do we know your population health program meets national quality standards?

The network's Chief Medical Officer had encountered URAC Employer-Based Population Health Accreditation at a conference. The program appeared to be exactly what the organization needed — an independent, nationally recognized validation of population health program quality. The question was whether the organization was ready, and if not, what it would take to get there.

The network engaged IHS.

Phase 1: Gap Analysis — What the Organization Found Out About Itself

IHS conducted a comprehensive standard-by-standard gap analysis against URAC Employer-Based Population Health Accreditation standards. The gap analysis took six weeks and involved structured interviews with the care management team, behavioral health program leadership, benefits administration staff, quality improvement committee members, and IT and data governance personnel.

The findings surprised the organization's leadership. The clinical program was stronger than average. The documentation infrastructure was not.

Gap 1: Social Determinants — Program Without Infrastructure

The network had conducted a pilot social determinants of health screening program at two of its largest primary care sites using a validated screening tool. Care coordinators were informally connecting patients with identified needs to community resources. But the program had no written protocol governing which populations were screened, at what intervals, under what conditions; no documented referral pathways naming specific community organizations and the needs they addressed; and no tracking system linking identified needs to referral outcomes.

URAC's standard evaluates operational social determinants programming — not pilot activity. The network had the clinical work but lacked the documented infrastructure that would demonstrate compliance to a URAC surveyor.

Gap 2: Behavioral Health Integration — Policy Without Connection

The network's behavioral health integration model was clinically real: warm handoffs from primary care to embedded behavioral health providers, shared electronic health record access at integrated sites, and a co-management protocol for patients with concurrent physical and behavioral health conditions. But the written policies governing behavioral health integration had been drafted by administrative staff using general language and did not describe the actual clinical model the network operated.

The disconnect between the written policy and the operational reality created a documentation gap: surveyors reviewing the policy would see a generic integration framework, not the substantive integrated clinical model the network had built. The clinical model would not be visible to URAC without policies that accurately described it.

Gap 3: Engagement — Activity Without Measurement

The network ran three employer-facing engagement programs: a health risk assessment campaign, a chronic disease coaching program, and a preventive care outreach initiative. All three programs were active. None had documented engagement metrics frameworks — no definitions of engagement, no baseline measurements, no outcome targets, no evaluation schedules.

URAC's engagement standards require not just that engagement programs exist, but that organizations measure engagement and evaluate program effectiveness. Active programs without documented measurement frameworks do not satisfy the standard.

Gap 4: Quality Improvement — Committee Structure Without Documentation

The network's Quality Improvement Committee met quarterly. Meeting minutes existed but were brief — typically two to three paragraphs noting agenda items discussed and action items assigned. No minutes documented the specific quality metrics reviewed, the corrective action plans adopted in response to performance gaps, or the re-measurement results confirming that corrective actions achieved improvement.

URAC's quality improvement standards evaluate the substantive operation of the QI process — not simply the existence of a committee. Meeting minutes that do not document the QI cycle in sufficient detail will not satisfy URAC's evaluation framework.

Gap 5: Population Identification — Methodology Without Documentation

The network's care management team used a proprietary algorithm embedded in its population health IT platform to identify high-risk patients for care management outreach. The algorithm incorporated claims data, clinical data, and patient-reported outcomes. It was sophisticated and clinically defensible. It was entirely undocumented in written policy.

URAC requires documented, evidence-based criteria for population identification and risk stratification. An undocumented algorithm — regardless of its clinical sophistication — cannot be evaluated by a surveyor and will not satisfy the standard.

What Was Strong

The gap analysis identified several domains where the network's preparation was already at or near accreditation-ready status:

  • Care coordination policies were substantive and operationally accurate
  • Chronic disease management protocols cited appropriate evidence-based guidelines
  • Staff qualifications documentation was complete and current
  • Administrative governance structure was documented and consistent with actual operations
  • Data governance and privacy policies were comprehensive

The gap profile was concentrated, not distributed. This meant the remediation workload was manageable — but required focused effort on specific domains rather than general policy cleanup.

Phase 2: Remediation — Closing the Gaps

IHS developed a prioritized remediation roadmap sequencing work against the URAC review timeline. The five gap areas required different remediation approaches.

Social Determinants: Building the Documentation Infrastructure

IHS provided a social determinants of health program policy template structured around URAC's evaluation framework. The network's care management leadership adapted the template to reflect the screening tool already in use, expanded the pilot screening protocol to apply network-wide, and worked with community relations staff to document formal referral relationships with ten community organizations serving identified social needs.

A tracking system was established within the population health IT platform to link SDOH screening results to referral activity and outcome documentation. The network operated the documented SDOH program for five months before application submission — sufficient to generate operational evidence for the URAC desktop review.
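The linkage the tracking system provides — identified need → referral → documented outcome — can be sketched in a few lines. This is a minimal illustration only; the record names and fields below are hypothetical, not the network's proprietary platform schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScreeningResult:
    patient_id: str
    need: str          # e.g. "food insecurity", "transportation"
    screened_on: str   # ISO date

@dataclass
class Referral:
    patient_id: str
    need: str
    organization: str
    outcome: Optional[str] = None  # e.g. "service received", "declined"

def link_needs_to_outcomes(screenings, referrals):
    """Join each identified need to its referral outcome; flag needs with no referral."""
    by_key = {(r.patient_id, r.need): r for r in referrals}
    report = []
    for s in screenings:
        r = by_key.get((s.patient_id, s.need))
        report.append({
            "patient_id": s.patient_id,
            "need": s.need,
            "referred_to": r.organization if r else None,
            "outcome": r.outcome if r else "no referral on record",
        })
    return report
```

The point of the sketch is the surveyor-facing property: every screened need resolves to either a documented referral outcome or an explicit "no referral on record" flag, so nothing identified simply disappears.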

Behavioral Health: Rewriting Policy to Reflect Clinical Reality

IHS facilitated a structured policy rewrite with the network's behavioral health program director and primary care medical director. The process documented the actual clinical model: the warm handoff protocol, the shared EHR access framework, the co-management criteria, the care coordination workflow between embedded behavioral health providers and primary care teams, and the parity compliance monitoring process.

The rewritten policy was longer than the original — more detailed, more specific, and more accurate. The clinical model had always been there. The policy now described it.

Engagement: Establishing Measurement Frameworks

IHS provided engagement metrics framework templates for each of the three active employer programs. For each program, the network defined: engagement definition (what counts as engaged), baseline measurement methodology, engagement targets, evaluation schedule, and the process for using evaluation results to modify program design.

The health risk assessment campaign had never tracked completion rates against eligible population denominators. Establishing the measurement framework required IT coordination to pull historical HRA completion data and establish the reporting infrastructure for ongoing tracking. This took eight weeks — the longest single remediation workstream.
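The core of that reporting infrastructure is a simple ratio: unique completions over the eligible population denominator. A minimal sketch, with hypothetical function and variable names (the network's actual reporting logic lives in its population health platform):

```python
def engagement_rate(completed_ids, eligible_ids):
    """HRA engagement = members who completed / eligible members, each counted once."""
    eligible = set(eligible_ids)
    if not eligible:
        raise ValueError("eligible population is empty")
    # Only count completions by members in the denominator
    completed = set(completed_ids) & eligible
    return len(completed) / len(eligible)
```

The design choice worth noting is the denominator discipline: deduplicating members and discarding completions by ineligible members is what makes the rate defensible against the eligible-population denominator URAC's evaluation framework expects.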

Quality Improvement: Restructuring Documentation

IHS provided a QI meeting minutes template aligned with URAC's quality improvement documentation requirements. The template structured minutes to document: specific performance metrics reviewed (with data), gap identification and root cause analysis, corrective action plan adoption (with responsible party and timeline), and re-measurement results at the subsequent meeting.

The network's QI Committee adopted the new documentation format immediately. Two full QI cycles under the new format were completed before application submission, generating the substantive documentation evidence URAC surveyors would review.

Population Identification: Documenting the Algorithm

IHS worked with the network's population health analytics team to produce a written methodology document describing the risk stratification algorithm: the data inputs (claims, clinical, patient-reported), the weighting methodology, the risk tier definitions, the thresholds triggering care management outreach, and the evidence base supporting each algorithmic component.

The documentation process took three weeks and required coordination between clinical leadership (who understood the clinical logic) and IT staff (who understood the technical implementation). The resulting methodology document was the most detailed single document produced during remediation — and one of the most important for URAC's evaluation of the population identification standard.
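The structure such a methodology document captures — weighted domain inputs, tier cutoffs, and an outreach threshold — can be illustrated schematically. The weights, tiers, and threshold below are invented for illustration; the network's actual algorithm is proprietary and far more detailed.

```python
# Hypothetical weights over the three documented input domains
WEIGHTS = {"claims": 0.5, "clinical": 0.3, "patient_reported": 0.2}
# Risk tiers, highest cutoff first
TIERS = [(0.7, "high"), (0.4, "medium"), (0.0, "low")]
OUTREACH_THRESHOLD = 0.7  # scores at/above this trigger care management outreach

def risk_score(inputs):
    """Weighted composite of normalized (0-1) domain scores."""
    return sum(WEIGHTS[k] * inputs[k] for k in WEIGHTS)

def stratify(inputs):
    score = risk_score(inputs)
    tier = next(name for cutoff, name in TIERS if score >= cutoff)
    return {"score": round(score, 3), "tier": tier,
            "outreach": score >= OUTREACH_THRESHOLD}
```

Writing even this skeleton down — which inputs, which weights, which cutoffs, and why — is what converts an algorithm a surveyor cannot evaluate into documented, evidence-based criteria that can be reviewed against the standard.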

Phase 3: Mock Review — Finding What Remediation Missed

Five months after the gap analysis, with all five gap areas remediated and two months before planned application submission, IHS conducted a full mock review simulating URAC's desktop assessment.

The mock review identified two issues the remediation process had not fully resolved.

Issue 1: SDOH Referral Network Documentation

The social determinants referral network documentation named ten community organizations but did not document the nature of the referral relationship — whether organizations had formal MOUs with the network, the specific services available through each organization, or the referral process for each service category. URAC's standard evaluates documented referral pathways, not simply a list of community resources.

The network's community relations team produced one-page relationship summaries for each of the ten organizations, documenting the referral process, services available, and contact protocols. Three organizations without formal agreements were brought under simple referral MOU frameworks in the two months before application submission.

Issue 2: Engagement Evaluation Cycle Not Yet Complete

The engagement metrics frameworks had been established, but only one evaluation cycle had been completed for the health risk assessment campaign. The chronic disease coaching program and preventive care outreach initiative had documented metrics frameworks but no completed evaluation cycles — the measurement infrastructure had been built but had not yet generated evaluation results.

Application submission was delayed by six weeks to allow a complete evaluation cycle for all three programs. The delay was the right call: submitting with incomplete evaluation documentation would have generated an RFI that delayed the accreditation determination longer than the six-week application delay did.

The Outcome

The network submitted its URAC application with all five gap areas remediated and mock review findings resolved. URAC conducted its independent assessment over approximately five months.

URAC issued one RFI during the review — a request for additional documentation on the behavioral health integration model's parity compliance monitoring process. The surveyor's inquiry indicated the policy described the monitoring process at a conceptual level but did not document the specific data sources reviewed, the monitoring frequency, or the escalation protocol for identified parity compliance concerns.

IHS drafted the RFI response within ten days of issuance. The response provided: a detailed description of the data sources used for parity compliance monitoring (claims data by service category, utilization management denial data, cost-sharing comparison data), the quarterly monitoring schedule, the threshold criteria triggering escalation, and the escalation pathway to the network's compliance committee. Supplementary documentation included the most recent quarterly parity monitoring report and the compliance committee minutes documenting the report's review.

URAC accepted the RFI response without further inquiry. The network received URAC Employer-Based Population Health Accreditation approximately three weeks after RFI response submission.

What the Accreditation Delivered

The network's Chief Medical Officer presented the URAC accreditation to the large regional employer client considering contract renewal within two weeks of the award. The employer's benefits committee cited the independent accreditation as a significant factor in the contract renewal decision — specifically the validation that the program addressed social determinants of health and behavioral health integration, two areas the employer's wellness team had identified as organizational priorities.

The network subsequently included URAC accreditation in its proposals to three additional employer prospects. IHS is not in a position to attribute specific contracting outcomes to the accreditation — too many variables affect employer direct contracting decisions. But the accreditation gave the network a credentialing signal it had not previously been able to offer: independent, nationally recognized validation of population health program quality from an organization whose standards were written by people who understand what comprehensive population health management actually requires.

What This Engagement Illustrates

Clinical Program Strength Does Not Equal Documentation Readiness

This organization had built a genuinely strong population health program. Its clinical outcomes data was measurable and positive. Its care management team was experienced and operationally effective. None of that protected it from a gap analysis revealing five significant documentation deficiencies across five standard categories. URAC evaluates programs through documentation — surveyors cannot observe clinical operations directly. The program that exists in practice must also exist on paper, in sufficient specificity to satisfy URAC's interpretive framework for each standard.

The Gap Between Policy and Operations Runs Both Directions

Two of this organization's five gaps ran in opposite directions. The behavioral health integration gap represented operations that were stronger than the written policy suggested — the clinical model was real, but the policy didn't describe it accurately. The engagement gap ran the other way: program activity overstated the underlying substance — programs were running, but the measurement infrastructure behind them did not exist. Both types of disconnect create deficiencies in URAC review. Accurate documentation requires understanding both directions of the gap.

Mock Review Is Where Avoidable Failures Are Caught

The two issues identified in mock review — incomplete SDOH referral network documentation and incomplete engagement evaluation cycles — would both have generated RFIs in the live URAC review. The six-week application delay imposed by the engagement evaluation issue cost six weeks of elapsed time. Responding to that issue via RFI after submission would have taken longer, with more uncertainty. Mock review is the investment that prevents preventable delays at the worst possible moment — after preparation investment is complete and organizational patience is exhausted.

RFI Response Is a Specialized Skill

The single RFI this organization received was on behavioral health parity compliance monitoring — a technical area where the surveyor's concern was precise and the response had to be equally precise. Generic RFI responses that restate policy language without addressing the specific interpretive concern behind the surveyor's inquiry typically generate follow-up inquiries or result in Not Met determinations. Understanding what a URAC surveyor is actually asking — and what documentation will satisfy that specific concern — requires institutional knowledge of how URAC standards are applied. That knowledge is what IHS brings to every RFI response.

Is Your Organization's Program Ready for URAC Evaluation?

Most organizations pursuing URAC Employer-Based Population Health Accreditation have stronger programs than their documentation reflects — and more gaps than they expect. The IHS gap analysis tells you exactly where you stand before you invest in preparation. Schedule a no-obligation consultation with IHS to get a realistic assessment of your program's URAC readiness.

Schedule a Free Discovery Session
