Case Study — URAC IME Accreditation

Building a URAC-Compliant IME Program from the Ground Up

This composite case study reflects common patterns IHS encounters across URAC IME Accreditation engagements. Identifying details have been generalized. The operational challenges, gap findings, and remediation approach are representative of actual client work.

The Situation

A regional IME scheduling and management company had been operating for seven years, coordinating independent medical examinations primarily for workers' compensation insurers and self-insured employers across five states. The company maintained a panel of approximately 80 examining physicians across a range of specialties — orthopedics, neurology, psychiatry, occupational medicine, and internal medicine.

The company's operations had grown organically. Examiners were selected through referral networks. Turnaround performance was tracked informally via a shared spreadsheet. Conflict-of-interest disclosures were collected on a paper form at panel onboarding but not reviewed systematically. A HIPAA policy existed as a standalone document but had not been updated since 2021 and was not integrated into staff training.

A large regional workers' compensation carrier — the company's largest client, representing roughly 40% of revenue — notified the company that it would be implementing a new vendor qualification standard effective 12 months from the date of the notice. IME vendors would be required to hold URAC IME Accreditation or demonstrate active pursuit of accreditation with a credible timeline to completion. Vendors that could not satisfy the requirement would be removed from the approved panel.

The company engaged IHS within two weeks of receiving the carrier's notice.

The Gap Analysis

IHS began with a structured gap analysis against URAC's IME Standards v1.0 — mapping the company's existing documentation, operational practices, and governance structure against each standard's requirements. The analysis was completed over four weeks and included document review, structured interviews with operations leadership, and a process walkthrough of the company's examiner onboarding, scheduling, reporting, and complaint handling workflows.

What the Analysis Found

Critical Gap

Examiner Credentialing: No Documented Process

The company maintained a list of approved examiners with licensure information on file, but had no documented credentialing procedure, no primary source verification records, and no re-credentialing schedule. Several examiners on the active panel had not had their credentials re-verified in over three years. One examiner's board certification had lapsed; there was no record indicating the lapse had been identified or addressed.

URAC's standard requires a documented credentialing process — not a list of approved providers. The organization must conduct primary source verification of licensure, board status, and sanction history at appointment and at defined renewal intervals. The company's current approach would not satisfy the standard.

Critical Gap

Conflict-of-Interest Management: Policy Without Process

The company's COI disclosure form collected information at onboarding but had no mechanism for ongoing disclosure, no defined review process, and no documentation of how disclosed conflicts had been evaluated or resolved. Three of the most active examiners on the panel had prior employment relationships with the company's largest carrier client — a relationship that created at minimum the appearance of a conflict requiring evaluation and documentation.

URAC requires not just a disclosure form but a documented process for evaluating and resolving disclosed conflicts, with records of disposition. A form that collects information but goes nowhere does not satisfy the standard.

Significant Gap

Timeliness Monitoring: Informal Tracking Without Performance Records

The company tracked report delivery dates in a shared spreadsheet, but the data was inconsistently maintained and had never been used to generate a performance report. There was no defined turnaround standard against which performance was measured, no escalation protocol for at-risk cases, and no documentation of any corrective action taken when delivery dates were missed.

URAC requires active monitoring of turnaround performance against defined timeframes, with documented escalation processes and evidence that performance data is reviewed and acted upon through the QI program.

Significant Gap

Quality Improvement Program: Framework Without Function

The company had a one-page "Quality Policy" document that described a commitment to quality but did not define performance metrics, data collection methods, review cadence, or corrective action procedures. There were no QI committee meeting minutes, no performance reports, and no records of any corrective action having been initiated or resolved.

URAC reviewers look for evidence that a QI program is functioning — not just that a quality policy exists. A document that describes intent without any record of implementation will not satisfy the standard.

Moderate Gap

HIPAA Privacy and Security: Outdated and Unintegrated

The company's HIPAA policies were last updated in 2021 and had not been reviewed following the company's adoption of a new electronic scheduling and document management platform in 2023. The new platform handled PHI differently from the prior system — including new data flows, new access controls, and new breach detection mechanisms — none of which were reflected in the existing privacy or security policies. Staff HIPAA training records were incomplete.

Moderate Gap

Appeals Process: Undocumented and Inaccessible

The company had an informal practice of responding to complaints about examination findings, but no documented appeals policy, no defined timeframes for response, and no mechanism by which claimants or requestors were informed that an appeals process existed. The process had never been communicated in writing to any party.

URAC requires a documented, accessible, and time-defined appeals process. An informal responsiveness practice does not satisfy the standard.

The Remediation Plan

IHS presented the gap analysis findings to the company's leadership team with a prioritized remediation roadmap. Given the 12-month window imposed by the carrier's requirement, the plan was structured to address critical gaps first — because URAC reviewers will not approve an application with unresolved critical deficiencies — and to build documentation in a sequence that supported operational adoption, not just paper compliance.

Phase 1 (Months 1–3)

Examiner Credentialing Infrastructure

IHS designed a documented credentialing procedure that defined the primary source verification requirements at initial appointment and at three-year renewal. The procedure identified which sources constituted acceptable primary source verification for each credential element — licensure (state licensing board), board status (certifying board directly), and sanction history (OIG/SAM.gov).

The company conducted a credentialing audit of its full examiner panel using the new procedure. The lapsed board certification was identified and the examiner was moved to inactive status pending resolution. Two additional examiners were identified as due for re-credentialing; both were re-verified and remained on the active panel.

Credentialing records were established for all active panel members, including documentation of the verification source and date for each credential element.
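
At its core, the renewal tracking described above reduces to a date check against the three-year interval. The sketch below is purely illustrative: the record fields, examiner names, and dates are invented for the example and are not taken from the company's actual system; only the three-year interval comes from the procedure described.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record for one credential element in an examiner's file.
# Field names are illustrative, not URAC's or the company's.
@dataclass
class CredentialElement:
    examiner: str
    element: str        # e.g. "licensure", "board_status", "sanctions"
    source: str         # primary source used (state board, certifying board, OIG/SAM.gov)
    verified_on: date   # date of last primary source verification

RENEWAL_YEARS = 3  # the three-year renewal interval described above

def due_for_reverification(rec: CredentialElement, today: date) -> bool:
    """True once the element is at or past its renewal interval."""
    # Note: .replace() would raise for a Feb 29 verification date; ignored for brevity.
    renewal_due = rec.verified_on.replace(year=rec.verified_on.year + RENEWAL_YEARS)
    return today >= renewal_due

rec = CredentialElement("Examiner A", "board_status", "certifying board", date(2021, 6, 1))
print(due_for_reverification(rec, date(2025, 1, 15)))  # → True (more than 3 years elapsed)
```

Running a check like this across the full panel is what surfaced the lapsed certification and the two overdue re-credentialings.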

Phase 2 (Months 2–4)

Conflict-of-Interest Program Rebuild

IHS developed a COI policy that defined what constituted a conflict at the organizational and examiner level, established an annual disclosure requirement for active panel members, and created a documented review and disposition process. The policy defined categories of conflicts that required removal from specific cases, conflicts that could be managed with documented mitigation, and conflicts that required removal from the panel entirely.

The company issued updated COI disclosure forms to all active panel members and collected disclosures from all 80 examiners within 30 days. Disclosed relationships — including the three examiners with prior employment relationships with the major carrier — were evaluated against the policy and dispositioned in writing. Two of the three were cleared with documented rationale; one was restricted from cases involving that carrier's claims.

Disposition records were filed in each affected examiner's credentialing record.

Phase 3 (Months 3–5)

Timeliness Standards and Monitoring System

IHS worked with the company's operations team to define timeliness standards for each key milestone in the IME workflow: acknowledgment of referral receipt, examination scheduling confirmation, report delivery following examination, and report delivery following records review requests. Standards were set based on carrier contract requirements and URAC's guidance, and documented as formal organizational policy.

The spreadsheet-based tracking system was replaced with a structured log that captured actual performance against each defined timeframe. An escalation protocol was documented that defined the internal notification chain when a case was at risk of missing a deadline.

The company ran three months of structured timeliness data before filing the URAC application — providing evidence of active monitoring and performance measurement that reviewers could evaluate.

Phase 4 (Months 4–6)

Quality Improvement Program Build

IHS designed a QI program structure that defined three core performance metrics — timeliness rate, report completion rate, and complaint rate — with quarterly data collection and a defined review process. A QI committee was established with documented membership, a meeting schedule, and a charter that defined scope, reporting obligations, and corrective action authority.
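
Each of the three metrics is a simple proportion over a quarter's case records. The sketch below uses invented case data and field names purely to show the shape of the calculation; it is not the company's reporting tool.

```python
# Hypothetical quarterly case records; field names and values are invented.
cases = [
    {"on_time": True,  "report_complete": True,  "complaint": False},
    {"on_time": False, "report_complete": True,  "complaint": True},
    {"on_time": True,  "report_complete": False, "complaint": False},
    {"on_time": True,  "report_complete": True,  "complaint": False},
]

def rate(records, key):
    """Proportion of records where the flag is set."""
    return sum(1 for r in records if r[key]) / len(records)

metrics = {
    "timeliness_rate": rate(cases, "on_time"),          # 3/4 = 0.75
    "completion_rate": rate(cases, "report_complete"),  # 3/4 = 0.75
    "complaint_rate":  rate(cases, "complaint"),        # 1/4 = 0.25
}
print(metrics)
```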

The committee held its first meeting in month five. IHS prepared the agenda, facilitated the review of the first timeliness data report, and documented the meeting minutes. Two corrective actions were initiated — both related to a single examiner whose report turnaround times consistently ran to the outer limit of the defined window. Both were documented and tracked to closure before the application was filed.

Phase 5 (Months 5–7)

HIPAA Update and Appeals Policy

IHS reviewed the company's HIPAA privacy and security policies against the current platform's data architecture and updated both documents to reflect the 2023 system migration. A staff training program was developed and delivered; training completion records were documented for all staff with access to PHI.

A formal appeals policy was drafted that defined the process available to claimants and requestors who wished to dispute examination findings or procedures, the timeframes for acknowledgment and resolution, and the communication mechanism through which parties were informed of the process. The policy was incorporated into the company's standard examination notification letters.

Phase 6 (Months 7–9)

Application Assembly and Submission

IHS assembled the full URAC application package — organizing all policy documents, procedure manuals, operational records, QI data, credentialing records, and supporting evidence against the standard-by-standard structure URAC's review process requires. Narrative responses were drafted for each standard, anticipating the documentation questions reviewers typically raise and ensuring the evidence submitted addressed those questions directly.

The application was submitted in month nine — three months ahead of the carrier's 12-month deadline.

The Outcome

URAC issued one RFI following its initial review, requesting additional documentation on the COI evaluation process for the three examiners with prior carrier employment relationships. The RFI asked specifically for evidence that the evaluations had followed the company's documented policy process — not just that a disposition decision had been made.

IHS prepared the RFI response within the required timeframe, submitting the disposition records, the policy section governing the evaluation criteria applied, and a narrative explanation of how the policy had been applied to each specific relationship. No additional information was requested. URAC issued the accreditation decision approximately six weeks after the RFI response was submitted.

1 RFI issued — resolved in a single response cycle
Application submitted in month 9 — three months ahead of the carrier deadline
Three-year accreditation cycle — no conditions or partial approval
80 examiners re-credentialed under the new documented process

The carrier was notified of accreditation status before the deadline. The company retained its panel position. In the months following accreditation, two additional carriers approached the company about panel inclusion — citing its URAC status as the reason for outreach.

The more durable outcome was internal. The company now operates with a credentialing program that catches lapsed credentials before they create liability, a COI process that produces documented records rather than informal dispositions, and a QI program that generates actual performance data rather than a policy that describes intentions. The infrastructure built for accreditation became the operational foundation the company had been missing.

What This Engagement Illustrates

A Policy Is Not a Process

The most common gap IHS finds in IME programs is the distance between documentation and operation. A conflict-of-interest form that collects disclosures but has no review step is not a COI program. A quality policy that describes commitment to excellence but has no metrics, no data, and no meeting records is not a QI program. URAC reviewers are trained to look for operational evidence — not just well-written policies.

The Timeline Is Set by the Operational Record, Not the Policies

URAC expects to see evidence that a program has been functioning — not just that it has been designed. This company needed three months of structured timeliness data and at least one cycle of QI committee activity before the application was credible. Organizations that try to file immediately after drafting policies will not have the operational record reviewers require.

RFI Response Is a Precision Exercise

The single RFI this company received was narrow and specific — it asked about evidence of process application, not about the policy itself. Responding precisely to what was asked, with documentation that answered the specific question, resolved the matter in a single cycle. Organizations that submit broad responses, argue with the finding, or submit documentation that addresses tangential issues typically extend the review timeline by multiple months.

Accreditation Builds the Program You Should Have Had

The most common reaction IHS hears from clients after completing accreditation is that the process revealed operational vulnerabilities they would not have identified in the normal course of business. The examiner with a lapsed board certification, the COI relationships that had never been evaluated, the timeliness gaps obscured by informal tracking — none of these were visible until accreditation preparation required a structured look. The investment in accreditation returned more than a credential.

Interpretive Authority Matters

IHS is led by Thomas G. Goddard, JD, PhD — the former Chief Operating Officer and General Counsel of URAC. In this engagement, that background shaped every stage of preparation: how the gap analysis was framed against the standard's actual intent, how the COI disposition records were structured to satisfy the specific documentation question reviewers ask, and how the single RFI was responded to with precision rather than volume. Consultants who have read the standards and consultants who wrote the standards are not equivalent. The difference shows in outcomes.

Facing a Carrier Requirement or Regulatory Pressure on IME Accreditation?

The pattern in this case study is common: a carrier requirement or regulatory notice creates a fixed deadline, and the organization needs to understand quickly how much work the accreditation actually requires and whether the timeline is achievable. A discovery session with IHS gives you that assessment — based on your program's actual current state, not a generic estimate.

Schedule a Free Discovery Session

Last updated: April 2026