Technology · 14 min read

Measuring TPA Performance: A Data-Driven Approach to CMS Universe File Compliance


Sevana Health Team

November 20, 2025

For Medicare Advantage and Part D health plans, managing delegated entities—Third-Party Administrators (TPAs) and Pharmacy Benefit Managers (PBMs)—represents both an operational necessity and a compliance challenge. When it comes to CMS universe file submissions, the quality of data provided by your TPAs directly impacts your plan's audit readiness, regulatory compliance, and ultimately, your Stars ratings.

Critical Question:

Yet many health plans struggle to answer a fundamental question: Is my TPA getting better or worse over time? This article explores a systematic, data-driven approach to measuring TPA performance in the context of CMS universe file validation.

The TPA Universe Validation Workflow

Before diving into metrics, it's essential to understand the typical workflow for delegated entity universe file processing:

  1. TPA Submission: The delegated entity (TPA/PBM) prepares and submits monthly universe files (ODAG, CDAG, FA, SNPCC protocols)
  2. Initial Validation: Health plan receives the file and validates it (either manually or using automated validation tools)
  3. Issue Identification: Validation identifies IDS risk errors, compliance warnings, and data quality issues
  4. Remediation Request: Health plan sends validation results back to TPA for correction
  5. TPA Correction: TPA fixes identified issues and resubmits
  6. Re-validation: Health plan validates the corrected file
  7. Repeat: Cycle continues until file meets CMS compliance standards

The number of iterations required—and the types of errors encountered—tells you everything about TPA performance quality.
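
One way to make that cycle measurable is to log a structured record for every submission attempt. The Python sketch below is illustrative only; the field names are an assumption, not a CMS or vendor schema. Every metric discussed in the next section can be computed from records shaped like this.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SubmissionAttempt:
    """One record per file a TPA submits; metrics aggregate over these."""
    tpa_id: str              # delegated entity identifier
    protocol: str            # e.g. "ODAG-OD", "CDAG", "FA", "SNPCC"
    period: str              # reporting month, e.g. "2024-09"
    attempt_number: int      # 1 = initial submission, 2+ = resubmissions
    submitted_on: date
    processing_failed: bool  # failed header/structure checks before rule validation
    ids_errors: int          # IDS risk errors found (0 until validation runs)
    compliance_errors: int   # compliance warnings found (0 until validation runs)
```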

Why Traditional Oversight Fails

Many health plans track TPA performance through:

Common Inadequate Approaches

  • Anecdotal feedback: "They seem to be doing better this month"
  • Point-in-time snapshots: Only looking at final submission success/failure
  • Incomplete metrics: Ignoring processing failures that never reach validation
  • Lack of trend analysis: No month-over-month comparison

The Hidden Problem

These approaches miss critical insights. A TPA might successfully submit a clean file in Month 6, but if it took four resubmissions to get there (versus one resubmission in Month 1), its performance is actually declining, not improving.

Key Performance Indicators for TPA Quality

1. Processing Failure Rate

What it measures: Percentage of submissions that fail header validation or file structure checks before even reaching CMS rule validation.

Why it matters: Processing failures indicate fundamental data preparation issues—wrong column headers, incorrect file formats, or structural problems. These are the "easiest" errors to prevent and suggest inadequate quality control processes.

Target: < 5% processing failure rate
Red flag: Processing failures increasing month-over-month
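
As a concrete illustration, here is a minimal sketch of the calculation, assuming each attempt is logged with a processing_failed flag (an illustrative schema, not a standard format):

```python
def processing_failure_rate(attempts: list[dict]) -> float:
    """Share of submission attempts that failed header/structure checks
    before reaching CMS rule validation."""
    if not attempts:
        return 0.0
    failed = sum(1 for a in attempts if a["processing_failed"])
    return failed / len(attempts)

# September from the Q3 example later in this article: 1 of 3 attempts failed.
september = [
    {"processing_failed": True},
    {"processing_failed": False},
    {"processing_failed": False},
]
print(f"{processing_failure_rate(september):.0%}")  # 33%
```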

2. IDS Risk Error Rate

What it measures: The average number of Invalid Data Submission (IDS) errors per successful validation, across all submissions in the period.

Why it matters: IDS risk errors represent violations that render universe files "unusable" by CMS. These are critical compliance failures that can result in rejected submissions and audit findings. Per 42 CFR § 423.568, plans are responsible for ensuring data accuracy—even when delegated.

Examples of IDS risk errors:

  • Invalid member identifiers (Medicare Beneficiary Identifier format violations)
  • Required field omissions
  • Invalid date ranges or sequences
  • Enrollment status inconsistencies

Target: < 1.0 average IDS errors per submission
Red flag: IDS errors increasing over 3+ months
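
The first example above, MBI format violations, is mechanically checkable. Below is a hedged sketch based on CMS's published MBI character rules (11 characters, letters excluding S, L, O, I, B, Z); it checks format only, not whether the identifier was actually issued, so verify against the current CMS specification before relying on it:

```python
import re

# MBI layout per CMS's published format: 11 characters in the pattern
# C A AN N A AN N A A N N, where C is 1-9, N is 0-9, A is an uppercase
# letter excluding S, L, O, I, B, Z, and AN is either A or N.
_ALPHA = "AC-HJKMNP-RT-Y"  # A-Z minus B, I, L, O, S, Z
MBI_RE = re.compile(
    rf"^[1-9][{_ALPHA}][{_ALPHA}0-9][0-9]"
    rf"[{_ALPHA}][{_ALPHA}0-9][0-9]"
    rf"[{_ALPHA}][{_ALPHA}][0-9][0-9]$"
)

def is_valid_mbi(mbi: str) -> bool:
    """True if the string matches the MBI character pattern."""
    return MBI_RE.fullmatch(mbi.strip().upper()) is not None

assert is_valid_mbi("1EG4-TE5-MK73".replace("-", ""))  # commonly cited sample MBI
assert not is_valid_mbi("1SG4TE5MK73")  # 'S' is an excluded letter
```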

3. Compliance Error Rate

What it measures: CMS compliance warnings per successful validation (timeliness violations, notification requirements, etc.).

Why it matters: While not "IDS risk," compliance errors indicate process gaps that could become audit findings. These warnings often reflect operational inefficiencies that, if left unchecked, evolve into systematic compliance failures.

Examples of compliance errors:

  • Timeliness violations (42 CFR § 422.572, § 422.629)
  • Missing notification documentation
  • Incomplete appeal/grievance data
  • Expedited processing issues

Target: < 5.0 average compliance errors per submission
Red flag: Recurring errors in same categories month-over-month
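
Many of these checks reduce to simple date arithmetic. As an illustration, the sketch below flags expedited organization determinations decided more than 72 hours after receipt, the general standard under 42 CFR § 422.572; it deliberately ignores extensions and other exceptions, so treat it as a starting point rather than a complete rule:

```python
from datetime import datetime

EXPEDITED_LIMIT_HOURS = 72  # general expedited standard; extensions ignored here

def flag_late_expedited(rows: list[dict]) -> list[dict]:
    """Return rows decided more than 72 hours after receipt.
    Each row needs 'received' and 'decided' datetimes (illustrative schema)."""
    limit_seconds = EXPEDITED_LIMIT_HOURS * 3600
    return [
        r for r in rows
        if (r["decided"] - r["received"]).total_seconds() > limit_seconds
    ]

cases = [
    {"case_id": "A1", "received": datetime(2024, 9, 2, 9, 0),
     "decided": datetime(2024, 9, 4, 9, 0)},  # 48h: on time
    {"case_id": "A2", "received": datetime(2024, 9, 2, 9, 0),
     "decided": datetime(2024, 9, 6, 9, 0)},  # 96h: late
]
print([r["case_id"] for r in flag_late_expedited(cases)])  # ['A2']
```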

4. First-Pass Success Rate

What it measures: Percentage of initial submissions (first attempt in a given month) that complete validation with zero IDS risk errors.

Why it matters: Demonstrates TPA quality assurance maturity. High first-pass rates indicate robust internal validation processes.

Target: > 80% first-pass success rate
Red flag: First-pass rate declining over time
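
A sketch of the calculation, assuming one summary record per month's initial submission (keys are illustrative):

```python
def first_pass_success_rate(first_attempts: list[dict]) -> float:
    """Share of initial submissions that validated with zero IDS risk errors."""
    if not first_attempts:
        return 0.0
    clean = sum(
        1 for a in first_attempts
        if not a["processing_failed"] and a["ids_errors"] == 0
    )
    return clean / len(first_attempts)

history = [
    {"processing_failed": False, "ids_errors": 0},   # clean first pass
    {"processing_failed": False, "ids_errors": 12},  # needed rework
    {"processing_failed": True,  "ids_errors": 0},   # never reached validation
]
print(f"{first_pass_success_rate(history):.0%}")  # 33%
```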

5. Resubmission Rate

What it measures: Average number of submission attempts required before achieving clean validation.

Why it matters: Quantifies the operational burden imposed by TPA quality issues. Each resubmission cycle adds days to compliance timelines and increases audit risk windows.

Target: < 1.5 average resubmissions per month
Red flag: Increasing resubmissions despite repeated feedback
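
Computed from a per-month attempt count, this is nearly a one-liner; the values here reuse the total submission counts from the Q3 example later in this article:

```python
def average_attempts_per_month(attempts_by_month: dict[str, int]) -> float:
    """Average submission attempts per reporting month; 1.0 means every
    month's first file was accepted. Illustrative schema."""
    if not attempts_by_month:
        return 0.0
    return sum(attempts_by_month.values()) / len(attempts_by_month)

q3 = {"2024-07": 2, "2024-08": 2, "2024-09": 3}
print(f"{average_attempts_per_month(q3):.2f}")  # 2.33, well above a 1.5 target
```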

6. Issue Recurrence Rate

What it measures: Percentage of errors appearing in consecutive months or reappearing after being previously corrected.

Why it matters: Indicates whether TPAs are implementing sustainable fixes vs. "band-aid" solutions. Recurring issues suggest systemic process gaps.

Target: < 10% recurrence rate
Red flag: Same error codes in 3+ consecutive months
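
Recurrence can be operationalized several ways; one simple approach, sketched below, measures what share of the latest month's error codes appeared in any prior month (the error codes shown are hypothetical):

```python
def recurrence_rate(error_codes_by_month: list[set[str]]) -> float:
    """Share of the latest month's error codes seen in any earlier month.
    One of several reasonable definitions, not a standard formula."""
    if len(error_codes_by_month) < 2 or not error_codes_by_month[-1]:
        return 0.0
    *history, current = error_codes_by_month
    seen = set().union(*history)
    return len(current & seen) / len(current)

q3 = [
    {"MBI_FORMAT", "DATE_SEQ"},
    {"DATE_SEQ"},
    {"DATE_SEQ", "ENROLL_STATUS"},
]
print(f"{recurrence_rate(q3):.0%}")  # 50%: DATE_SEQ is back for a third month
```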

Real-World Example: Analyzing TPA Performance Trends

Let's examine a concrete example from a recent ODAG (Organization Determinations, Appeals, and Grievances) universe validation cycle:

Reporting Period: Q3 2024 (July - September)

Delegated Entity: Regional Benefits TPA

Protocol: ODAG - OD (Organization Determinations)

Month    | Total Submissions | Processing Failures | Successful Validations | IDS Errors | Avg IDS/Sub | Compliance Errors | Avg Comp/Sub
Jul 2024 | 2                 | 0                   | 2                      | 47         | 23.50       | 128               | 64.00
Aug 2024 | 2                 | 0                   | 2                      | 42         | 21.00       | 45                | 22.50
Sep 2024 | 3                 | 1                   | 2                      | 38         | 19.00       | 52                | 26.00

Key Insights

Processing Failures

September showed a concerning development—1 out of 3 submissions failed header validation. This represents a 33% processing failure rate, indicating possible quality control breakdown in the TPA's file preparation process.

IDS Risk Errors

Positive trend (23.5 → 21.0 → 19.0 avg per successful validation), demonstrating the TPA's data quality processes are improving. This 19% reduction over three months indicates effective remediation of systematic issues.

Compliance Errors

Dramatic improvement from July (64 avg errors) to August (22.5 avg errors), a 65% reduction. However, the slight increase to 26 errors in September suggests continued monitoring is needed to ensure the improvements are sustainable.

Overall Trend

IMPROVING (19% reduction in IDS errors, 59% reduction in compliance errors), but with caution flags around the new processing failures that emerged in September.
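
For transparency, the cited percentages fall straight out of the table above; a few lines of Python reproduce them:

```python
# Average errors per successful validation, copied from the Q3 table above.
ids_avg  = {"Jul": 23.50, "Aug": 21.00, "Sep": 19.00}
comp_avg = {"Jul": 64.00, "Aug": 22.50, "Sep": 26.00}

def pct_change(start: float, end: float) -> float:
    return (end - start) / start * 100

print(f"IDS errors, Jul -> Sep:        {pct_change(ids_avg['Jul'], ids_avg['Sep']):+.0f}%")    # -19%
print(f"Compliance errors, Jul -> Sep: {pct_change(comp_avg['Jul'], comp_avg['Sep']):+.0f}%")  # -59%
print(f"Compliance errors, Jul -> Aug: {pct_change(comp_avg['Jul'], comp_avg['Aug']):+.0f}%")  # -65%
```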

Recommended Actions

  1. Immediate: Conduct root cause analysis on September's header validation failure
  2. 30 days: Implement pre-submission validation checklist for file format compliance
  3. Ongoing: Monitor compliance error recurrence—investigate why July issues partially returned in September
  4. Quarterly: Performance review meeting with TPA to discuss quality improvement roadmap

The Business Case for TPA Performance Measurement

Beyond regulatory compliance, systematic TPA performance tracking delivers tangible business value:

Risk Mitigation

  • Audit preparedness: Demonstrate oversight and corrective action for CMS audits
  • Early warning system: Identify declining performance before it becomes an audit finding
  • Liability management: Document delegated entity accountability

Operational Efficiency

  • Resource optimization: Reduce internal rework from poor TPA submissions
  • Faster cycle times: Fewer resubmissions = faster file acceptance
  • Reduced escalations: Data-driven conversations with underperforming TPAs

Strategic Planning

  • Contract renewal decisions: Objective performance data for delegation agreements
  • Delegation strategy: Identify which functions are suitable for delegation
  • Competitive benchmarking: Compare TPA performance across your network

Technology Enablers: CMS Universe Scrubbers

Manual TPA performance tracking is impractical given the volume and complexity of CMS universe files. Modern CMS universe validators (also called "universe scrubbers") provide:

Key Capabilities

  • Automated validation: Field-level validation against 42 CFR requirements
  • Performance dashboards: Month-over-month trending for all delegated entities
  • Issue categorization: Automatic classification of IDS risk vs. compliance errors
  • Drill-down capability: From summary metrics to specific error instances
  • Comparative reporting: TPA performance benchmarking
  • Audit trail: Complete submission history for regulatory documentation

When evaluating universe validation tools, prioritize solutions that specifically support delegated entity performance tracking as a core feature—not just basic file validation.

Best Practices for Continuous Improvement

1. Establish Clear Performance Standards

Document acceptable error rate thresholds in your delegation agreements: maximum allowable IDS error rate, processing failure tolerance, and required improvement trajectory for underperforming TPAs.
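
Encoding those thresholds also makes automated monitoring straightforward. The values below simply mirror the targets suggested earlier in this article; the authoritative numbers belong in the delegation agreement itself, not in code:

```python
# Illustrative thresholds mirroring this article's suggested targets.
TPA_THRESHOLDS = {
    "processing_failure_rate": 0.05,  # < 5% of attempts
    "avg_ids_errors":          1.0,   # per successful validation
    "avg_compliance_errors":   5.0,   # per successful validation
    "first_pass_success_rate": 0.80,  # > 80% (higher is better)
    "avg_resubmissions":       1.5,   # attempts per month
    "issue_recurrence_rate":   0.10,  # < 10% of error codes
}
HIGHER_IS_BETTER = {"first_pass_success_rate"}

def breached(metrics: dict[str, float]) -> list[str]:
    """Names of thresholds a TPA's monthly metrics violate."""
    out = []
    for name, limit in TPA_THRESHOLDS.items():
        if name not in metrics:
            continue
        ok = (metrics[name] >= limit if name in HIGHER_IS_BETTER
              else metrics[name] <= limit)
        if not ok:
            out.append(name)
    return out

print(breached({"avg_resubmissions": 2.33, "first_pass_success_rate": 0.33}))
# ['first_pass_success_rate', 'avg_resubmissions']
```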

2. Create Feedback Loops

  • Real-time: Share validation results immediately upon file processing
  • Monthly: Provide TPA performance scorecards
  • Quarterly: Joint performance review meetings with action plans

3. Incentivize Quality

Consider performance-based provisions in delegation contracts: service level agreements tied to error rates, quality improvement bonus structures, and escalation procedures for sustained underperformance.

4. Build Internal Capability

Ensure your compliance team has: access to historical performance data, training on CMS validation rules, tools to generate ad-hoc performance reports, and authority to escalate performance concerns.

5. Document Everything

Maintain comprehensive records of all TPA submissions, validation results with error details, corrective action requests and TPA responses, performance trend analysis, and delegation oversight activities. This documentation is your defense in CMS audits and your evidence for contract enforcement.

Conclusion

Measuring TPA performance isn't about blame—it's about continuous improvement, risk management, and ensuring your Medicare Advantage or Part D plan maintains the highest standards of compliance and operational excellence.

By implementing systematic tracking of processing failures, IDS risk errors, compliance warnings, and resubmission rates, you transform TPA oversight from a subjective exercise into a data-driven program that protects your plan, serves your members, and satisfies CMS requirements.

The question isn't whether to measure TPA performance—it's whether you can afford not to.

Related Topics

  • CMS Universe File Validation Requirements
  • ODAG Universe File Compliance (42 CFR § 422.568, § 423.568)
  • Medicare Advantage Audit Preparation
  • Part D TPA Delegation Best Practices
  • IDS Risk Error Prevention Strategies
  • CMS Stars Rating Quality Improvement

Transform Your TPA Oversight Program

Sevana Health's CMS Universe Scrubber provides automated validation, delegated entity performance tracking, and comprehensive reporting to ensure your TPAs meet the highest standards of data quality and compliance.

Ready to Simplify Your Compliance?

See how Sevana Health can help you avoid violations and streamline your processes.