Data Warehouse Success Story: Transforming Operations - Softqube

Data Warehouse Architecture

Data warehouse architecture refers to the design and structure of a system that collects, stores, and manages large volumes of structured and unstructured data from various sources.

Analytics Capabilities At a Glance

QualMetrix Applications - Payor and Risk-Bearing Provider Reports

Executive Summary
  • Cover Page
  • Savings Opportunities
  • KPI Trends
  • "What-if" Savings
  • General Key Performance Indicators ("KPIs")
  • Integration with EMR (Provider-only)
  • Medical Services KPIs
Pharmacy
  • Pharmacy Services KPIs
  • Member Prescription Profile
  • Specialty and Prescriber Report
  • Member Medication Adherence Profile
Quality
  • Complications Report
  • Core Disease KPI
Performance
  • Member Costs Utilization and Risk Report
  • Primary Care Providers Performance Report
  • Member Medical Profile
Prevention
  • Gap in Care Report
  • Member List by Prevention Category
  • HEDIS / USPHT Reports (plan, provider, member)
  • Member Prevention - Personal Health Score Card
Cost & Utilization
  • Specialty Profile
  • Choosing Wisely
  • ER Super Users
  • Radiology Service Report
  • Outlier Procedures
  • Geography Variance
  • E&M Level of Service Report
  • Primary Care
  • Procedure Report
  • PCP/Specialty/Ancillary by State/County
  • Readmission by Facility, PCP, Member
  • Hospitals Service Line Profile (Payor-only)
  • Avoidable ER
  • Lab and Radiology POS Referral by State/County
  • ER Resulting in Hospital Visits

Key Highlights

  • Data-driven decisions and interventions toward proactively managing costs and controlling utilization
  • Continuous improvement of compliance with prevention and quality measures by enabling providers with timely impact analysis
  • A robust set of metrics and KPIs, supported by high-quality, rich data, to measure performance across a multitude of dimensions such as community, individual, and disease

Capabilities Landscape – Future Plans

Future Applications and New Product Development

Predictive Modeling

  • Predictive Modeling functionality is currently being incorporated into the QualMetrix Platform
  • The UCSD risk scoring model, CDPS (Medicaid only), and the CMS CC paradigm are already implemented
  • Risk scoring models predict potential future costs based on risk assessments
  • Reports are intended to show the likelihood of key events, including hospital admission/readmission and ER visits
  • Likelihood depends on both physician behavior and member behavior
    • Physician - past performance is indicative of future behavior
    • Member - members who use the ER for avoidable visits
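To make the additive structure of these risk models concrete, here is a minimal sketch in the spirit of CMS-HCC/CDPS scoring; every weight below is a hypothetical placeholder, not a published coefficient:

```python
# Minimal sketch of an additive risk score in the style of the CMS-HCC and
# CDPS models: a demographic factor plus a weight per condition category.
# All weights are hypothetical placeholders, not published coefficients.

DEMOGRAPHIC_WEIGHTS = {("F", "65-69"): 0.32, ("M", "65-69"): 0.30}  # hypothetical
CONDITION_WEIGHTS = {
    "diabetes_no_complication": 0.10,  # hypothetical
    "chf": 0.33,
    "copd": 0.28,
}

def risk_score(sex, age_band, conditions):
    """Additive risk score: demographic factor plus condition-category weights."""
    score = DEMOGRAPHIC_WEIGHTS.get((sex, age_band), 0.0)
    score += sum(CONDITION_WEIGHTS.get(c, 0.0) for c in conditions)
    return score

# A member's predicted cost is the score times the average cost per 1.0 of risk.
print(risk_score("F", "65-69", {"diabetes_no_complication", "chf"}))  # 0.75
```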

Additional Applications in Development

  • Near Term Product Evolution
    • Efficiency of Care - Hospital Performance Reports
    • Episode of Care - Specialty Performance Reports
    • Risk Severity (CDPS/ACG)
  • New Product Pipeline
    • Payor - Medical Home Primary Care Reporting
    • Employer - Savings and Benefits Optimization
    • ACO (Payor and Provider) - Population Risk Assessment and Financial Performance Monitor

Our Framework-Driven Approach to Rich Analytics

  • Web integration layer offers branding, security, application bundling, and user reporting enhancements
  • Payor-specific, member-centric data repository optimized for performance and powerful analytics
  • ETL-driven data mapping process that allows refreshing of data at required time intervals
  • Supported feeds include DW data, medical claims (837/835), drug claims, membership, group, provider, lab, care management vendor data, third-party risk data, etc.

Key Highlights

  • Logical separation of physical, logical, and business application layers, conducive to add-ons and rapid business requirement changes
  • Metadata-driven processes eliminating or minimizing manual touch points (map once – use many times); see the sketch after this list
  • Extensible core data model to drive a standards-based approach to data ingestion and dissemination
  • Extensible DQ framework to accommodate additional client-specific business rules
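As a hedged illustration of the "map once – use many times" principle, the sketch below treats the field mapping as metadata that one generic routine applies to every feed; the feed layout and target column names are hypothetical:

```python
# Minimal sketch of a metadata-driven mapping: the mapping spec is data,
# so onboarding a new feed means adding metadata rows, not new code.
# Source field and target column names here are hypothetical.

MEMBER_MAPPING = {            # source field -> target column
    "MBR_ID": "member_id",
    "DOB": "birth_date",
    "GNDR": "sex",
}

def apply_mapping(record, mapping):
    """Project a raw source record onto the core model using mapping metadata."""
    return {target: record.get(source) for source, target in mapping.items()}

# The same routine serves every client feed that supplies its own mapping rows.
raw = {"MBR_ID": "A1001", "DOB": "1957-03-14", "GNDR": "F", "IGNORED": "x"}
print(apply_mapping(raw, MEMBER_MAPPING))
# {'member_id': 'A1001', 'birth_date': '1957-03-14', 'sex': 'F'}
```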
Unique QualMetrix Approach

[Architecture diagram: Predictive Modeling and Visualization Applications sit atop a Portal Integration Environment (payor-facing and provider-facing tools), Business Rules and Logic, a Member-Centric Data Repository, and a Data Mapping Process fed by Customer Claims, Customer Enrollment, Provider Data, Rx Data, Behavioral Health, and EMR Enrichment sources.]

Data Architecture Framework

Key building blocks enabling our current & future product/service offerings

Key Highlights

  • The framework depicts the key building blocks enabling QMX's current and future analytics capabilities
  • Multitenant platform designed for scalability, optimal SaaS operations, and HIPAA security, adhering to Service Oriented Architecture (SOA) guiding principles
  • Loosely coupled modular architecture allows faster system/data integration and swap-outs (plug and play)
  • SOA provides higher transparency, protocol independence, ease of maintenance (smaller IT staff footprint), and more creative resource options
  • Metadata-driven batch integration framework that is scalable, recoverable, interoperable, auditable, and platform agnostic
[Diagram labels: Metadata Management, Semantic Reconciliation]

Batch Integration - Key Architectural Layers


A Top Level View of Data/Process Touch Points


Key Data Flows at a Glance


Metadata Driven Job Management

Building Auditability through granular job control

SOURCES

ID (PK)

  • NAME
  • FEED_TYPE
  • CREATED_DATE
  • REVISION_DATE
  • VERSION

CDC_JOBS

JOB_ID (PK)

  • SOURCE_ID (FK)
  • JOB_TYPE
  • JOB_START_TIME
  • JOB_END_TIME
  • JOB_INTERVAL
  • STATUS
  • DESCRIPTION

CDC_JOBS_EVENT_LOG

JOB_ID (FK)

EVENT_NAME (FK)

STAGE_NAME (FK)

  • EVENT_TIME
  • STATUS
  • LOG_MSG
  • ERROR_MSG
  • ERROR_CD
  • OBJECT_NAME

CDC_JOBS_STAGES

JOB_ID (FK)

STAGE_NAME

  • STAGE_START_TIME
  • STAGE_END_TIME
  • STAGE_INTERVAL
  • STATUS
  • DESCRIPTION
  • PARAMETERS

Here’s how it works

  • Ingested data from sources is staged "as-is" in the QMX RAW schema
  • Jobs are initiated by the batch job manager and can be kicked off in scheduled, event-driven, or on-demand modes
  • For incremental loads, deltas are processed and staged in the QMX STG schema
  • Parameterized jobs then load data from the STG schema tables into the target QMX Core tables
  • Batch job management tracks metrics such as job start time, job end time, job stage, and even events within a stage, by customer name and subject area/file type (a minimal sketch follows the CDC_JOBS_EVENTS table below)
  • Staged data is then subjected to predefined DQ rules; these rules can be extended or customized to fit any customer's needs

CDC_JOBS_EVENTS

JOB_ID (FK)

EVENT_NAME (FK)

STAGE_NAME (FK)

  • EVENT_START_TIME
  • EVENT_END_TIME
  • EVENT_INTERVAL
  • ROWS_PROCESSED
  • STATUS
  • DESCRIPTION
  • ERROR_MSG
  • TRACE_FILE
  • INSTANCE_ID
  • SESSION_ID
  • SERIAL_NO
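As a hedged illustration of this flow, the sketch below records job- and stage-level control rows against simplified versions of the tables above (SQLite stands in for the warehouse; the stage functions, job type, and column subset are assumptions):

```python
import sqlite3
from datetime import datetime, timezone

# Minimal sketch of metadata-driven job control: every stage of a batch run
# writes a row to CDC_JOBS_STAGES so the run is fully auditable.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE CDC_JOBS (JOB_ID INTEGER PRIMARY KEY, SOURCE_ID INT, JOB_TYPE TEXT,
                       JOB_START_TIME TEXT, JOB_END_TIME TEXT, STATUS TEXT);
CREATE TABLE CDC_JOBS_STAGES (JOB_ID INT, STAGE_NAME TEXT, STAGE_START_TIME TEXT,
                              STAGE_END_TIME TEXT, STATUS TEXT);
""")

def now():
    return datetime.now(timezone.utc).isoformat()

def run_job(source_id, stages):
    """Run named stages in order, recording start/end/status for each."""
    cur = conn.execute(
        "INSERT INTO CDC_JOBS (SOURCE_ID, JOB_TYPE, JOB_START_TIME, STATUS) "
        "VALUES (?, 'INCREMENTAL', ?, 'RUNNING')", (source_id, now()))
    job_id = cur.lastrowid
    for stage_name, stage_fn in stages.items():
        conn.execute("INSERT INTO CDC_JOBS_STAGES VALUES (?, ?, ?, NULL, 'RUNNING')",
                     (job_id, stage_name, now()))
        stage_fn()  # e.g. stage deltas into QMX STG, then load QMX Core
        conn.execute("UPDATE CDC_JOBS_STAGES SET STAGE_END_TIME=?, STATUS='OK' "
                     "WHERE JOB_ID=? AND STAGE_NAME=?", (now(), job_id, stage_name))
    conn.execute("UPDATE CDC_JOBS SET JOB_END_TIME=?, STATUS='OK' WHERE JOB_ID=?",
                 (now(), job_id))
    return job_id

run_job(source_id=1, stages={"RAW_TO_STG": lambda: None, "STG_TO_CORE": lambda: None})
```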

Robust error handling and restartability


Here’s how it works

  • The granular auditing capability of the batch control process allows for a robust error/exception-handling capability
  • When a process encounters a fatal error, the event is recorded in the CDC_JOBS_EVENT_LOG table
  • The event record traps the exact job step where the error occurred, the actual error message, and the exact code segment in which the error occurred
  • Once the necessary fix is applied, the process can be executed from the code segment where the error was detected (as opposed to restarting from the beginning); see the sketch after this list
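A minimal sketch of this restart-from-failure behavior, reusing `conn` and the CDC_JOBS_STAGES table from the previous sketch; the stage list and resume logic are assumptions for illustration, not QualMetrix code:

```python
# Minimal sketch of restartability: a failed stage is marked in the control
# tables (here via CDC_JOBS_STAGES.STATUS; a fuller version would also write
# CDC_JOBS_EVENT_LOG), and a re-run resumes at the first stage not marked 'OK'.

STAGES = ["RAW_TO_STG", "APPLY_DQ_RULES", "STG_TO_CORE"]  # assumed stage order

def resume_job(job_id, stage_fns):
    """Re-run only the stages of `job_id` that have not completed successfully."""
    done = {row[0] for row in conn.execute(
        "SELECT STAGE_NAME FROM CDC_JOBS_STAGES WHERE JOB_ID=? AND STATUS='OK'",
        (job_id,))}
    for stage_name in STAGES:
        if stage_name in done:
            continue  # this stage succeeded before the failure; skip it
        try:
            stage_fns[stage_name]()
            status = "OK"
        except Exception:
            status = "FAILED"
        conn.execute("INSERT INTO CDC_JOBS_STAGES VALUES (?, ?, NULL, NULL, ?)",
                     (job_id, stage_name, status))
        if status == "FAILED":
            break  # stop here; after the fix, call resume_job(job_id, ...) again
```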

Identifying & Exposing DQ Issues Early

Here’s how it works

  • While standard data-integrity rules are pre-configured for DQ, the framework allows for appending new rules or customizing existing rules to address client-specific DQ requirements
  • Once business rules are created and parameters configured, the dynamic logic populates the Business Rule Detail table. The data in this table helps surface any gaps between the expected (ideal) state and the actual state. Rules fall into four categories:
    • Data Integrity
    • General Assessment
    • Critical Fields
    • Data Validations
  • DQ data can be easily exposed to client SMEs via canned reports or Spotfire dashboards (a minimal sketch of the rule mechanics follows)
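To make the rule/parameter/detail mechanics concrete, here is a hedged sketch; the rule, its parameters, and the percentages are hypothetical, and only the ideal-versus-actual pattern follows the framework described here:

```python
# Minimal sketch of a parameterized DQ rule populating a Business Rule
# Detail record with expected (ideal) vs. actual results. The rule id,
# parameter id, and values are hypothetical.

from dataclasses import dataclass

@dataclass
class BusinessRuleDetail:
    business_rule_id: int
    parameter_id: int
    ideal_result: float
    actual_result: float

def evaluate_rule(rule_id, param_id, query_result, ideal):
    """Record the gap between the ideal state and what the data actually shows."""
    return BusinessRuleDetail(rule_id, param_id, ideal, query_result)

# e.g. a Critical Fields rule: 100% of claims should carry a member natural key.
detail = evaluate_rule(rule_id=42, param_id=7, query_result=98.6, ideal=100.0)
print(f"gap: {detail.ideal_result - detail.actual_result:.1f}%")  # gap: 1.4%
```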

Pre-defined DQ Rules

  • Data Integrity: This rule checks for data integrity issues between master data (e.g., membership, provider) and transactional data (e.g., claims)
  • Critical Fields: This rule checks for completeness of data, especially for natural keys and other critical business attributes that are needed for downstream analytics
  • General Assessment: This rule provides a general health snapshot of the data. For example:
    • Total number of rows in a table
    • Total number of columns
    • Total number of distinct values in specific columns (e.g., total number of claims, total members with claims, total number of PCPs, etc.)
  • Data Validations: Compares data against an established standard value set or thresholds.
    • Expected vs. actual data types (e.g., to calculate average daily dosage, we'd need strength, days supply count, and dispensed count to be numeric data types in the inbound RX file; see the sketch after this list)
    • Validations against established thresholds (e.g., eligibility, member counts, etc.)
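For instance, the numeric-type validation mentioned above might look like the following sketch; the field names in the inbound RX file are assumptions:

```python
# Minimal sketch of a Data Validations rule: confirm that the RX-file fields
# needed to compute average daily dosage are numeric. Field names are assumed.

NUMERIC_RX_FIELDS = ["STRENGTH", "DAYS_SUPPLY_COUNT", "DISPENSED_COUNT"]

def is_numeric(value):
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

def validate_rx_row(row):
    """Return the fields in this row that fail the numeric validation."""
    return [f for f in NUMERIC_RX_FIELDS if not is_numeric(row.get(f))]

bad = validate_rx_row({"STRENGTH": "500", "DAYS_SUPPLY_COUNT": "30",
                       "DISPENSED_COUNT": "sixty"})
print(bad)  # ['DISPENSED_COUNT']
```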

PARAMETERS

ParameterID (PK)

  • Code
  • Name
  • SourceInformation1
  • Type
  • QueryText

BUSINESS RULE DETAIL

BusinessRuleDetailID (PK)

  • BusinessRuleID (FK)
  • ParameterID (FK)
  • Ideal_Result
  • Actual_Result
  • TotalCounts
  • MinValue
  • MaxValue

BUSINESS RULE

BusinessRuleID (PK)

  • Code
  • Name
  • LastRunDate
  • JobID

QM(x) BI Layers


QM(x) BI Layers – Tibco SF Architecture

[Diagram: TIBCO Data Virtualization architecture]

  • Consumers: Business Intelligence, Self-Service Analytics, Data Science, 360° View, Transactional Applications
  • TIBCO Data Virtualization: Access Any Data Source → Combine & Transform → Optimize & Secure → Deliver Data
  • Data Sources: Packaged Apps, Excel & Flat Files, Data Lakes, XML Docs, Web Services, Cloud Data, Master & Reference Data, RDBMS, Data Warehouse, Big Data, IoT/Streaming Data

QM(x) BI Layers – Tibco SF features

Distributed In-memory Processing

Scale high-speed analytics to thousands of users and millions of rows of data on laptops running 32- or 64-bit operating systems and on clustered servers.

Free Dimensional Ad Hoc Queries

Replace hundreds of reports with on-the-fly aggregations, hierarchies, and custom metrics.

Instant Interactive Visualizations

Auto-configured based on the underlying data itself, minimizing pre-processing effort.

Guided Analytics™

Unlock business expertise and reduce IT workloads with user-created point-and-click workflows or event-driven broadcast distribution of analyses.

Adaptable & Extensible

Adapts to the end user's type of analysis, with Microsoft Visual Studio C#-based Analytic Developer Kits (SDK) for extension.

Real-time, event-driven

Process monitoring and improvement

Integrated Statistics, Modeling & Data Mining

Delivers powerful applications to the enterprise and enables collaboration between statistical experts and business professionals.

QM(x) BI Layers – Tibco SF Scalability

External Scalability
Internal Scalability

Conclusion

Our data warehouse case study exemplifies how our solution empowered data-driven decision-making. By centralizing and analyzing diverse data sources, our client achieved improved insights and operational efficiency. Experience the potential of advanced data management – explore our case study and unlock a new dimension of business optimization.