URAC Disease Management Accreditation Case Study — Regional Managed Care Organization
Last updated: April 2026
The following case study describes an IHS engagement with a regional managed care organization pursuing URAC Disease Management Accreditation. Client identity is confidential per IHS standard practice. Program details, findings, and outcomes are accurate.
Background
A regional managed care organization with approximately 180,000 covered lives across commercial, Medicaid managed care, and Medicare Advantage lines of business had operated an internal disease management program for seven years. The program addressed five primary chronic conditions in its population: diabetes, congestive heart failure, chronic obstructive pulmonary disease, asthma, and hypertension.
The organization had pursued URAC Disease Management Accreditation independently eighteen months earlier, submitting an application through AccreditNet without prior consulting support. URAC's desktop review resulted in findings against multiple standards, and the validation review surfaced additional deficiencies. The organization received a provisional determination with a required corrective action plan — effectively a conditional outcome requiring remediation before accreditation could be granted.
Rather than attempt a second self-directed correction cycle, the organization's Chief Medical Officer engaged IHS. The immediate driver was contractual: the organization's two largest commercial health plan clients had each added a URAC Disease Management Accreditation requirement to their upcoming contract renewal terms. The renewal window was fourteen months out.
The Situation at Engagement
IHS conducted a Standard-by-Standard Review against current URAC Disease Management standards within the first three weeks of engagement. The review identified five categories of deficiency — consistent with what IHS sees across organizations that attempt self-directed URAC accreditation:
1. Look-Back Period Failures
The organization had developed policies responsive to URAC standards in the six weeks preceding its initial AccreditNet submission. Those policies were compliant on paper. They were not operationally embedded — no meeting minutes, no documented quality committee activity, no outcome tracking reports, and no member-level communications existed from the period before the policy adoption dates. URAC reviewers correctly identified that the organization could not demonstrate longitudinal adherence to its stated procedures.
This is the most common and most damaging error in self-directed URAC accreditation. Organizations focus on policy documentation and miss that reviewers assess operational history — what the organization actually did, not what its policies say it will do.
2. Non-Individualized Patient Communication
The organization's disease management communications — outreach letters, educational materials, self-management guides — were condition-generic. The same diabetes education packet was sent to a 34-year-old newly diagnosed member and a 72-year-old member with insulin-dependent diabetes and three co-morbidities. URAC standards require communications that are adapted to the member's specific situation, health literacy level, and disease stage. Generic materials across a heterogeneous population do not satisfy this requirement.
3. Risk Stratification Documentation Gaps
The organization had a risk stratification algorithm embedded in its care management platform. The algorithm functioned. What did not exist was documentation of how stratification decisions were made, how the organization responded differently to high-risk versus moderate-risk members, and how risk tier assignments were reviewed and updated over time. URAC reviewers could not assess a process they could not read.
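What "reviewable documentation" means in practice is that each stratification decision carries its own recorded rationale. As a purely illustrative sketch (the thresholds, tier names, and field names below are hypothetical, not the organization's actual algorithm):

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical thresholds -- illustrative only, not the
# organization's actual stratification criteria.
HIGH_RISK_SCORE = 70
MODERATE_RISK_SCORE = 40

@dataclass
class StratificationRecord:
    member_id: str
    score: int
    tier: str
    rationale: str  # reviewable explanation of how the decision was made
    assessed_on: date = field(default_factory=date.today)

def assign_tier(member_id: str, score: int) -> StratificationRecord:
    """Assign a risk tier and record why, so the decision is auditable."""
    if score >= HIGH_RISK_SCORE:
        tier, why = "high", f"score {score} >= high-risk threshold {HIGH_RISK_SCORE}"
    elif score >= MODERATE_RISK_SCORE:
        tier, why = "moderate", f"score {score} >= moderate threshold {MODERATE_RISK_SCORE}"
    else:
        tier, why = "low", f"score {score} below moderate threshold {MODERATE_RISK_SCORE}"
    return StratificationRecord(member_id, score, tier, why)

record = assign_tier("M-001", 82)
print(record.tier, "-", record.rationale)
```

The point is not the algorithm itself, which already functioned, but that every tier assignment leaves a dated record a reviewer can read.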
4. Quality Management Infrastructure Without Operational Evidence
The organization had a Quality Management Committee on paper — a charter, a membership list, and defined meeting frequency. The committee had met twice in the preceding eighteen months, and the meeting minutes from both sessions were sparse. There was no documentation of quality improvement initiatives, no outcome data reviewed at committee level, and no evidence of any improvement activity resulting from committee deliberation.
URAC standards require that quality management infrastructure be operational — not merely established. A committee that exists without functioning is not an operational quality management system.
5. Provider Collaboration — No Documented Process
The organization's DM nurses reported routinely contacting member primary care physicians. No documentation of those contacts existed in a systematic, reviewable format. URAC standards require demonstrated evidence of provider collaboration — communication logs, care coordination records, or equivalent documentation. Verbal practice without documentation does not satisfy accreditation standards.
The IHS Approach
With fourteen months to contract renewal and a previous failed accreditation attempt on record, the engagement required a sequenced remediation and rebuild strategy — not an accelerated sprint.
Phase 1: Corrective Action Plan Response (Months 1–2)
IHS first addressed the outstanding URAC corrective action plan from the provisional determination. Each finding was mapped to its specific standard, the deficiency precisely characterized, and a compliant remediation drafted. IHS submitted the corrective action response on behalf of the organization with supporting documentation. URAC accepted the response and closed the corrective action without additional findings — clearing the prior determination and resetting the accreditation path.
Phase 2: Policy and Procedure Rebuild (Months 2–4)
IHS rewrote the organization's disease management policies and procedures in full — not as template documents, but as operational manuals reflecting how the organization's specific program functioned. Each policy was mapped to the URAC standard it satisfied. Language was reviewed for plain-English readability, internal consistency, and alignment with the organization's actual care management platform workflows.
Member communication materials were restructured. IHS worked with the organization's clinical staff to develop condition-specific, health-literacy-appropriate communication pathways segmented by risk tier and disease stage. A 72-year-old insulin-dependent member with heart failure received different materials, different outreach frequency, and different self-management support than a newly diagnosed member with controlled Type 2 diabetes. The personalization architecture was documented so reviewers could trace from the standard to the policy to the implementation.
Phase 3: Look-Back Period Architecture (Months 2–11, concurrent)
IHS began building the look-back period architecture in month two, concurrent with the policy rebuild. This meant establishing operational infrastructure that would generate reviewable evidence over time, not in the weeks before submission:
- Quality Management Committee meeting schedule reset, with agenda templates, documentation protocols, and outcome data review integrated into every session
- Member communication logs established as a standard workflow step — every DM nurse contact with a member or provider documented in the care management platform with date, content, and outcome fields
- Risk stratification review cycles formalized — monthly reassessment documentation, escalation records, and tier transition logs
- Quality improvement initiative formally opened: a medication adherence improvement project for the diabetic population, with baseline data, monthly tracking, and quarterly committee review
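The contact and tier-transition logs above can be sketched as simple structured records. This is a minimal illustration of the workflow fields described; the record and field names are hypothetical, not the organization's actual care management platform schema:

```python
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical record structures -- field names are illustrative only.
@dataclass
class ContactLog:
    member_id: str
    contact_date: date
    contact_type: str   # e.g. "member_outreach" or "provider_collaboration"
    content: str        # what was discussed or sent
    outcome: str        # result and any follow-up required

@dataclass
class TierTransition:
    member_id: str
    review_date: date
    previous_tier: str
    new_tier: str
    reason: str         # escalation or de-escalation rationale

log = ContactLog("M-001", date(2025, 3, 4), "provider_collaboration",
                 "Sent updated care plan to PCP", "PCP acknowledged receipt")
print(asdict(log))
```

Capturing date, content, and outcome on every contact is what converts routine nurse practice into the systematic, sampleable evidence reviewers expect.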
By month eight, the organization had seven months of documented QMC meeting minutes, quality data, provider communication records, member outreach logs, and improvement project tracking — exactly the operational history URAC reviewers would assess.
Phase 4: Mock Survey (Month 9)
IHS conducted a full mock survey at month nine — reviewing all submitted documentation against current standards and conducting structured interviews with the DM nursing staff, quality manager, and Chief Medical Officer using validation-review-style questioning. The mock survey identified three remaining gaps:
- Consumer engagement measurement — the organization tracked outreach activity but had not documented member engagement outcomes (did members act on the education provided? did self-management behaviors change?)
- One policy section referenced a care management platform workflow that had been updated, creating a documentation discrepancy
- Provider collaboration logs existed but were not organized in a format that would allow a reviewer to quickly pull a random sample — a practical presentation problem, not a substantive gap
All three were corrected before AccreditNet submission. The mock survey finding on consumer engagement measurement led to the addition of a quarterly member self-management behavior survey — a genuine program improvement, not merely a documentation fix.
Phase 5: AccreditNet Submission and URAC Review (Months 10–12)
IHS managed the complete AccreditNet submission. URAC's desktop review resulted in one RFI — a request for additional documentation on the organization's provider collaboration process for members who declined DM program participation. IHS drafted the RFI response within four business days, providing the organization's written procedure for documenting declinations, the notification pathway to the member's PCP, and three de-identified case examples. URAC accepted the response without follow-up.
Phase 6: Validation Review and Determination (Month 13)
The validation review was conducted remotely. URAC reviewed documentation and interviewed the Chief Medical Officer, quality manager, and two DM nurses. IHS had prepared each participant on the standards relevant to their function, the documentation supporting each standard, and the question patterns reviewers typically use. No additional findings were issued. URAC granted Disease Management Accreditation at the three-year term within thirty days of the validation review.
Outcomes
- URAC Disease Management Accreditation granted at the three-year term — thirteen months from IHS engagement, one month before the commercial contract renewal deadline
- Both commercial client contracts renewed with the accreditation requirement satisfied
- No standards findings in the validation review — a clean result following a provisional determination eighteen months earlier
- Quality Management Committee fully operational — monthly meetings with documented agendas, outcome data review, and improvement activity sustained after accreditation
- Medication adherence improvement project — initiated as part of the accreditation preparation, continued as an ongoing QI initiative post-accreditation, with documented improvement in diabetic population PDC (proportion of days covered) metrics over twelve months
- Provider collaboration documentation converted from informal practice to a systematic workflow embedded in the care management platform — a program improvement that extended beyond accreditation compliance
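The PDC metric cited above has a simple definition: the number of distinct days in the measurement period on which the member had medication on hand, divided by the number of days in the period (a PDC of 0.80 or higher is the commonly used adherence threshold). A simplified sketch of the calculation, which omits refinements real implementations include, such as shifting overlapping fills forward and handling multiple drug classes:

```python
from datetime import date, timedelta

def pdc(fills, period_start, period_end):
    """Proportion of days covered: distinct covered days / days in period.

    fills: list of (fill_date, days_supply) tuples.
    Simplified sketch -- overlapping fills simply merge rather than
    carrying unused supply forward.
    """
    covered = set()
    for fill_date, days_supply in fills:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if period_start <= day <= period_end:
                covered.add(day)
    period_days = (period_end - period_start).days + 1
    return len(covered) / period_days

# Two 30-day fills in a 90-day period -> 60 covered days / 90 days
fills = [(date(2025, 1, 1), 30), (date(2025, 2, 1), 30)]
print(round(pdc(fills, date(2025, 1, 1), date(2025, 3, 31)), 2))  # → 0.67
```

Tracking this ratio monthly for the diabetic population is what allowed the adherence project to show documented improvement over twelve months.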
What This Engagement Illustrates
Self-directed URAC accreditation has a predictable failure pattern
The organization's initial failure was not unique. Across IHS's 25+ years of URAC accreditation work, self-directed first-time applicants consistently fail in the same places: look-back period, individualized communications, and quality management operational evidence. These are not obscure standards — they are the central requirements. Organizations fail them because they focus on what they need to document, not on what they need to demonstrate.
Look-back period cannot be rushed
The most important strategic decision in this engagement was made in month two: building operational infrastructure designed to generate look-back period evidence over time, not in the weeks before submission. Organizations that understand this succeed. Organizations that treat accreditation preparation as a documentation exercise at the end of a program maturation cycle fail.
A prior failure is not disqualifying
A provisional determination or outright denial does not permanently bar an organization from URAC accreditation. It does require a structured remediation strategy — one that addresses the specific findings, rebuilds the affected infrastructure, and generates new operational evidence before resubmission. IHS manages this process routinely.
Accreditation preparation improves the program
The consumer engagement survey, the medication adherence improvement project, the formalized provider communication logs, and the operational Quality Management Committee were not mere compliance artifacts — they were genuine program improvements that emerged from the discipline the accreditation process imposed. This is URAC Disease Management Accreditation functioning as intended: not a credential appended to an existing program, but a process that strengthens the program.
Facing a Similar Situation?
Whether your organization is pursuing URAC Disease Management Accreditation for the first time, recovering from a prior findings-heavy review, or working against a contract renewal deadline, IHS provides the specialized expertise to build a compliant, sustainable path to accreditation.
IHS engagements are led by Thomas G. Goddard, JD, PhD — former Chief Operating Officer and General Counsel of URAC. No other consulting firm brings that level of institutional knowledge to disease management accreditation work.
Schedule a Free Discovery Session to discuss your organization's situation, timeline, and where IHS can add the most value.