From Physician Collaboration to URAC-Accredited CIN: A Clinical Integration Program Build
How a regional multi-specialty network moved from informal provider collaboration to URAC Clinically Integrated Network Accreditation — with governance infrastructure, clinical protocols, and a functioning quality program built in the process.
This case study is a composite engagement narrative. Client identities are not disclosed. The organizational structure, challenges, and engagement approach described are representative of IHS's work in this program area.
Background: A Network in Name Only
The organization had operated for several years under a physician-hospital organization structure — a PHO with 300 participating physicians across primary care, cardiology, orthopedics, and several other specialties, affiliated with a regional health system. The network had executed payer contracts jointly and held regular leadership meetings. On paper, it looked like a functioning CIN.
In practice, the clinical infrastructure was thin. There were no shared clinical protocols across specialties. Quality metrics were tracked at the health system level, not at the network level, and individual participating practices received no performance feedback. The credentialing process was delegated entirely to the hospital medical staff office with no formal delegation agreement or oversight mechanism in place. The governance documents were outdated and did not reflect how decisions were actually made.
The impetus for pursuing URAC CIN Accreditation came from two directions simultaneously: a large self-insured employer in the region required demonstrated clinical integration standards as a condition of continued preferred network status, and the network's legal counsel flagged antitrust exposure in the existing joint contracting arrangement. The network's executive director contacted IHS.
Phase 1: Baseline Assessment — Naming the Gaps
The engagement began with a structured baseline assessment against URAC's CIN Accreditation standards. Thomas G. Goddard, JD, PhD — the former Chief Operating Officer and General Counsel of URAC — led the assessment directly, working with the network's executive director, physician leadership, and operational staff over a four-week period.
The assessment examined each standards domain and produced a gap analysis the leadership team could act on:
Governance & Structure
Finding: The PHO board existed but had not met formally in fourteen months. No physician committee with defined clinical authority existed. The governance documents referred to committees that had been dissolved years earlier. Decision-making operated informally through the health system's administrative staff rather than through the network's own governance bodies.
Gap severity: Significant. Governance documentation and practice required substantial redesign before any accreditation application was viable.
Clinical Protocols
Finding: No shared clinical protocols had been adopted across the network. Individual practices followed their own specialty society guidelines. There were no mechanisms to monitor whether participating providers adhered to any common standard, and no forum for modifying practice patterns at the network level.
Gap severity: Significant. This was the most fundamental gap — the absence of shared protocols meant the network could not credibly claim to be clinically integrated by any definition.
Quality Measurement & Improvement
Finding: Quality data existed at the health system level but was not organized around the network's participating providers as a distinct population. No quality dashboard, no provider-level feedback, and no structured performance improvement process existed within the network structure.
Gap severity: Moderate. Data infrastructure existed; the gap was in how data was organized, attributed, and acted upon at the network level.
Health Information Technology
Finding: The health system's EHR was shared by employed providers but not by independent practices, which operated on three different platforms. No health information exchange (HIE) connection was active. Care coordination was largely telephone-based.
Gap severity: Moderate. IT infrastructure required a defined roadmap, though full interoperability was not required on day one if other integration evidence was strong.
Credentialing & Network Oversight
Finding: Credentialing was fully delegated to the hospital medical staff office. No delegation agreement existed. The network had no visibility into the credentialing process and could not produce documentation of how participating providers were verified or recredentialed.
Gap severity: Significant. A delegation agreement and oversight mechanism required immediate development.
Care Coordination & Consumer Protection
Finding: Some care coordination programs existed through the health system's care management team. Patient rights and grievance policies existed but were not network-specific. Documentation was adequate as a starting point.
Gap severity: Minor. Existing programs required documentation and network-level adaptation, not construction from scratch.
The assessment's message to network leadership was direct: this was not a documentation project. Significant program infrastructure needed to be built and operationalized before a credible accreditation application could be submitted. The network agreed to proceed on that basis, with a fourteen-month timeline to accreditation.
Phase 2: Infrastructure Design and Build — Nine Months
The bulk of the engagement was infrastructure construction, not paperwork. IHS worked alongside the network's leadership team to design and implement the governance, clinical, and operational programs the accreditation standards required.
Governance Redesign
IHS drafted a revised governance charter establishing a formal PHO Board with defined physician membership requirements, meeting cadence, and decision-making authority. A Clinical Quality Committee was created with physician co-chairs from primary care and specialty tracks, defined jurisdiction over clinical protocol adoption and quality program oversight, and a documented reporting relationship to the Board. All governance documents were updated to reflect actual organizational practice rather than aspirational structure.
Active engagement from the network's physician leaders was essential — governance on paper is not governance in practice, and URAC evaluates both. IHS facilitated the first three Board meetings and the first two Clinical Quality Committee meetings to establish procedural habits and generate the documentary record the accreditation review would require.
Clinical Protocol Framework
IHS worked with the Clinical Quality Committee to identify an initial set of five clinical priority areas where shared protocols were most feasible and most clinically meaningful: diabetes management, hypertension, preventive screening, post-acute care transitions, and high-cost imaging utilization. For each area, the committee reviewed current evidence-based guidelines, adapted them to network context, and formally adopted them through the governance process.
Critically, adoption was not sufficient. IHS designed an adherence monitoring process for each protocol — specifying what data would be collected, how it would be attributed to participating providers, and what the feedback and escalation process would be for providers whose practice patterns were outliers. The monitoring process was operational for six months before the accreditation application was submitted.
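To make that monitoring design concrete, the sketch below shows one way per-provider adherence rates can be computed and outliers flagged for feedback. It is an illustration only: the record shape, the provider names, and the standard-deviation threshold are hypothetical assumptions, not the network's actual tooling.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Each record scores one patient encounter against one protocol.
# The (provider_id, protocol_id, met_protocol) shape is assumed for illustration.
records = [
    ("dr_a", "diabetes_mgmt", True),
    ("dr_a", "diabetes_mgmt", True),
    ("dr_b", "diabetes_mgmt", False),
    ("dr_b", "diabetes_mgmt", True),
    ("dr_c", "diabetes_mgmt", False),
    ("dr_c", "diabetes_mgmt", False),
]

def adherence_by_provider(records, protocol_id):
    """Attribute scored encounters to providers and compute adherence rates."""
    met, total = defaultdict(int), defaultdict(int)
    for provider, protocol, adherent in records:
        if protocol != protocol_id:
            continue
        total[provider] += 1
        met[provider] += int(adherent)
    return {p: met[p] / total[p] for p in total}

def flag_outliers(rates, n_sd=1.0):
    """Flag providers whose rate falls more than n_sd below the network mean."""
    values = list(rates.values())
    floor = mean(values) - n_sd * pstdev(values)
    return sorted(p for p, rate in rates.items() if rate < floor)

rates = adherence_by_provider(records, "diabetes_mgmt")
print(rates)                 # {'dr_a': 1.0, 'dr_b': 0.5, 'dr_c': 0.0}
print(flag_outliers(rates))  # ['dr_c']: candidate for feedback, then escalation
```

In practice the feedback step matters more than the flag itself; the escalation pathway described above is what turns a statistical outlier into a governance conversation.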
Quality Program Construction
IHS designed a network-level quality dashboard drawing on claims data, EHR extracts, and care management records. The dashboard was organized around the network's participating provider panel, not the health system's patient population broadly, and attributed quality metrics to individual practices and to the network as a whole. Provider-level performance reports were designed and distributed quarterly beginning in month five of the engagement.
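Attribution is the crux of that design: deciding which practice "owns" each patient so that metrics roll up to the right panel. The sketch below shows a simple plurality-of-visits attribution rule. The rule, the identifiers, and the visit data are all hypothetical; the case study does not specify the attribution logic the network actually used.

```python
from collections import Counter, defaultdict

# Hypothetical (patient_id, practice_id) visit pairs drawn from claims
# and EHR extracts. A real rule would also weight visit type and recency.
visits = [
    ("pt1", "practice_a"), ("pt1", "practice_a"), ("pt1", "practice_b"),
    ("pt2", "practice_b"), ("pt2", "practice_b"),
    ("pt3", "practice_a"),
]

def attribute_patients(visits):
    """Assign each patient to the practice with the plurality of visits."""
    counts = defaultdict(Counter)
    for patient, practice in visits:
        counts[patient][practice] += 1
    return {patient: c.most_common(1)[0][0] for patient, c in counts.items()}

panel = attribute_patients(visits)
print(panel)  # {'pt1': 'practice_a', 'pt2': 'practice_b', 'pt3': 'practice_a'}
```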
A structured performance improvement process was documented: how underperforming providers would be identified, what support resources would be offered, and what escalation pathway existed for sustained outliers. The process was tested with two pilot providers before the accreditation application was submitted.
Credentialing Infrastructure
IHS drafted a formal delegation agreement between the PHO and the hospital medical staff office, specifying the scope of delegated credentialing functions, the documentation the medical staff office would provide to the network, the network's right to audit delegation compliance, and the recredentialing cycle. An oversight process was established: the Clinical Quality Committee would receive a quarterly credentialing activity report and would review and approve the delegation arrangement annually.
For the independent practices not credentialed through the hospital, IHS developed a supplemental credentialing verification process to ensure the network had documented evidence of provider qualifications across all participants — not only those credentialed by the hospital.
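A quarterly credentialing activity report only has oversight value if someone checks it against defined rules. The sketch below illustrates the kind of exception check such a report enables; the field names, the three-year recredentialing cycle, and the roster rows are hypothetical assumptions, not the actual delegation terms.

```python
from datetime import date, timedelta

# Hypothetical roster rows a delegated credentialing report might supply.
roster = [
    {"npi": "1111111111", "license_expires": date(2026, 9, 30),
     "last_recredentialed": date(2025, 1, 15)},
    {"npi": "2222222222", "license_expires": date(2025, 2, 1),
     "last_recredentialed": date(2022, 6, 1)},
]

RECRED_CYCLE = timedelta(days=3 * 365)  # assumed three-year cycle

def credentialing_exceptions(roster, as_of):
    """Flag lapsed licenses and overdue recredentialing for committee review."""
    exceptions = []
    for row in roster:
        if row["license_expires"] < as_of:
            exceptions.append((row["npi"], "license lapsed"))
        if as_of - row["last_recredentialed"] > RECRED_CYCLE:
            exceptions.append((row["npi"], "recredentialing overdue"))
    return exceptions

print(credentialing_exceptions(roster, as_of=date(2025, 10, 1)))
# [('2222222222', 'license lapsed'), ('2222222222', 'recredentialing overdue')]
```

This is precisely the class of exception the network's oversight process later caught in practice, as described in the outcomes below.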
HIT Roadmap
Full EHR interoperability across all participating practices was not achievable within the accreditation timeline. IHS worked with the network to document the current IT infrastructure accurately, establish an HIE connection for data sharing on the highest-volume patient population, and develop a multi-year HIT roadmap that URAC could evaluate as a credible plan. The roadmap included specific milestones, assigned accountability, and a governance review process — the evidence that technology investment was being driven by clinical integration strategy, not vice versa.
Phase 3: Application, Review, and Accreditation — Five Months
With nine months of infrastructure build behind them — and six months of operational evidence in the governance meeting minutes, protocol adherence data, quality reports, and credentialing records — the network submitted its URAC CIN Accreditation application.
Application and Documentation
IHS managed the application process, preparing the documentation package with standard-by-standard mapping of the network's programs to URAC's requirements. The documentation strategy was conservative: every claim was supported by operational evidence, and no credit was claimed for programs that existed only on paper. The gap assessment from month one informed this approach — the areas where programs had been built from scratch were documented with particular rigor, anticipating that URAC's reviewers would probe them most closely.
URAC Engagement and On-Site Review
URAC's review process involved three conference calls with the network leadership team, a document review period, and an on-site evaluation. IHS prepared the network's executive director, physician committee chairs, and operational staff for each interaction — not by coaching responses, but by ensuring each person understood what their governance role actually required of them and could speak to it authentically.
The on-site review examined the governance meeting records, the clinical protocol adoption and monitoring documentation, the quality dashboard and provider feedback reports, and the credentialing delegation agreement. URAC's reviewers spent significant time with the Clinical Quality Committee co-chairs — the depth of physician engagement in clinical governance was a focus.
RFI Response
URAC issued two requests for information following the initial review. The first concerned the HIT roadmap — reviewers wanted additional specificity about the timeline and accountability for the HIE expansion. The second concerned the care coordination documentation for the post-acute care transitions protocol, where the evidence of protocol implementation was thinner than in other areas.
IHS prepared both RFI responses within ten days of receipt. The HIT response provided a revised roadmap with quarterly milestones and a board resolution establishing governance oversight of the plan. The care transitions response supplemented the existing documentation with additional operational records and a process improvement plan the network had already initiated based on its own quality data.
Both RFIs were resolved without further follow-up. The network received URAC CIN Accreditation fourteen months after the initial engagement began.
Outcomes Beyond the Credential
The accreditation award resolved the immediate business pressures — the employer retained the network in its preferred tier, and the network's legal counsel was satisfied that the clinical integration infrastructure addressed the antitrust exposure identified at the outset. But the more significant outcomes were operational:
A Functioning Governance Structure
The PHO Board and Clinical Quality Committee were meeting regularly and making real decisions about clinical standards and network performance for the first time in the network's history. Physician leaders who had been nominally on governance committees were now actively engaged. The accreditation process had built institutional muscle that continued after the review was complete.
Measurable Protocol Adherence Improvement
By the time of the accreditation award, three of the five initial clinical protocols showed measurable adherence improvement compared to baseline. The diabetes management protocol showed the largest shift — the quarterly provider feedback reports had surfaced practice variation that the committee had not previously known existed, and several providers had modified their workflows in response.
A Credentialing Process the Network Actually Controlled
The delegation agreement and oversight process meant the network for the first time had visibility into how its participants were credentialed and recredentialed. Within three months of the agreement being executed, the oversight process surfaced a provider with a lapsed license that had not been caught by the hospital's recredentialing cycle. The network had a process to act on it; previously, it would not have known.
A Platform for Value-Based Contract Expansion
With documented clinical integration infrastructure in place, the network entered negotiations with two additional commercial payers for value-based arrangements that would not have been available under its prior organizational structure. URAC accreditation was cited in both negotiations as evidence of network maturity.
A Perspective on What Made This Engagement Work
Clinical integration accreditation engagements fail most often not because the documentation is wrong, but because the underlying program is wrong — governance that does not function, protocols that are not monitored, quality programs that produce data but no improvement. Documentation of a non-functioning program does not produce accreditation; it produces a well-organized record of organizational gaps.
The engagement described above worked because the network's leadership was willing to do the underlying work — to redesign governance, to build and operate clinical protocols, to generate and act on quality data — not merely to document what they wished were true. IHS's role was to design the programs, build the infrastructure, and keep the work honest. The accreditation was a recognition of operational reality, not a credential layered on top of it.
That is what URAC's review process is designed to find. Organizations that approach URAC CIN Accreditation as a documentation exercise typically find the review challenging. Organizations that approach it as a program build typically find the review confirms what they already know about their own operations.
— Thomas G. Goddard, JD, PhD, Integral Healthcare Solutions
Is Your Network Ready to Start This Process?
The right first step is an honest conversation about where your network actually is — not where your governance documents say it is. Thomas G. Goddard, JD, PhD, conducts initial discovery sessions personally. No sales team. No junior staff.
Schedule a Free Discovery Session