Here’s how it works:
- Data ingested from source systems is staged "as-is" in the QMX RAW schema
- Jobs are initiated by the batch job manager and can be kicked off in scheduled, event-driven, or on-demand mode (see the trigger-mode sketch after this list)
- For incremental loads, deltas are processed and staged in the QMX STG schema
- Parameterized jobs then load data from the STG schema tables into the target QMX Core tables (a load sketch also follows this list)
- Batch job management tracks metrics such as job start time, job end time, job stage, and even individual events within each stage, broken down by customer name and subject area/file type (see the cdc_jobs_events table below)
- Staged data is then subjected to predefined data quality (DQ) rules; these rules can be extended or customized to fit specific customer needs (a rule-registry sketch follows this list as well)
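
Below is a minimal Python sketch of the three kick-off modes, assuming a hypothetical `JobManager` with a single `run_job()` entry point; the class, method, and job names are illustrative stand-ins, not the actual QMX batch job manager:

```python
import sched
import time
from enum import Enum


class TriggerMode(Enum):
    SCHEDULED = "scheduled"
    EVENT_DRIVEN = "event_driven"
    ON_DEMAND = "on_demand"


class JobManager:
    """Hypothetical stand-in for the QMX batch job manager."""

    def __init__(self) -> None:
        self.scheduler = sched.scheduler(time.time, time.sleep)

    def run_job(self, job_id: str, mode: TriggerMode) -> None:
        # A real implementation would launch the batch job and record
        # start/end times in the job-management tables.
        print(f"running {job_id} ({mode.value})")

    def schedule(self, job_id: str, delay_seconds: float) -> None:
        # Scheduled mode: fire after a fixed delay (cron-like in practice).
        self.scheduler.enter(delay_seconds, 1, self.run_job,
                             argument=(job_id, TriggerMode.SCHEDULED))

    def on_event(self, job_id: str) -> None:
        # Event-driven mode: e.g. triggered when a source file lands.
        self.run_job(job_id, TriggerMode.EVENT_DRIVEN)

    def on_demand(self, job_id: str) -> None:
        # On-demand mode: kicked off manually by an operator.
        self.run_job(job_id, TriggerMode.ON_DEMAND)


mgr = JobManager()
mgr.on_demand("qmx_core_load")      # operator kick-off
mgr.on_event("qmx_stg_delta")       # e.g. file-arrival trigger
mgr.schedule("qmx_raw_ingest", 0.1)
mgr.scheduler.run()                 # blocks until scheduled jobs fire
```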
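The next sketch shows what one parameterized STG-to-Core load step could look like, using SQLite and made-up `stg_orders`/`core_orders` tables as stand-ins; the actual QMX schemas, job framework, and merge logic are not described here, so treat this as illustration only:

```python
import sqlite3

# In-memory stand-ins for the STG and Core schemas (hypothetical tables).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders  (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE core_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5);
""")


def load_stg_to_core(stg_table: str, core_table: str) -> int:
    """Load staged delta rows into the target Core table.

    Table names are job parameters; identifiers cannot be bound as SQL
    placeholders, so a real job would validate them against a whitelist
    of known tables before interpolating them into the statement.
    """
    cur = conn.execute(
        f"INSERT OR REPLACE INTO {core_table} SELECT * FROM {stg_table}"
    )
    conn.commit()
    return cur.rowcount


print(load_stg_to_core("stg_orders", "core_orders"))  # -> 2 rows loaded
```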
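One common way to keep DQ rules extensible is a rule registry that customer-specific rules can be added to. The sketch below assumes rules are simple row-level predicates; the decorator, rule names, and sample data are all hypothetical, not the actual QMX rule engine:

```python
from typing import Callable, Dict, List, Tuple

Rule = Callable[[dict], bool]
RULES: Dict[str, Rule] = {}


def dq_rule(name: str):
    """Register a named rule; customer rules register the same way."""
    def register(fn: Rule) -> Rule:
        RULES[name] = fn
        return fn
    return register


@dq_rule("order_id_not_null")
def order_id_not_null(row: dict) -> bool:
    return row.get("order_id") is not None


@dq_rule("amount_non_negative")
def amount_non_negative(row: dict) -> bool:
    return row.get("amount", 0) >= 0


def run_dq(rows: List[dict]) -> List[Tuple[dict, str]]:
    """Return (row, failed_rule_name) for every violation found."""
    return [(row, name)
            for row in rows
            for name, rule in RULES.items()
            if not rule(row)]


bad = run_dq([{"order_id": 1, "amount": -5.0},
              {"order_id": None, "amount": 3.0}])
print(bad)  # each failing row paired with the rule it broke
```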

Job stages and events are tracked in the cdc_jobs_events table, which carries the following columns:

cdc_jobs_events
- JOBID (FK)
- EVENT_NAME (FK)
- STAGE_NAME (FK)
- EVENT_START_TIME
- EVENT_END_TIME
- EVENT_INTERVAL
- ROWS_P
- STATUS
- DESCRIPTION
- ERROR_MSG
- TRACE_FILE
- INSTANCE_ID
- SESSION_ID
- SERIAL_NO
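
To illustrate how one event row could be written against these columns, here is a hedged SQLite sketch; the data types, the `record_event` helper, and the sample values are assumptions layered on the column list above, not the documented QMX schema:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cdc_jobs_events (
        JOBID            INTEGER,  -- FK
        EVENT_NAME       TEXT,     -- FK
        STAGE_NAME       TEXT,     -- FK
        EVENT_START_TIME TEXT,
        EVENT_END_TIME   TEXT,
        EVENT_INTERVAL   REAL,     -- derived: end minus start, in seconds
        ROWS_P           INTEGER,  -- row count, named as in the source
        STATUS           TEXT,
        DESCRIPTION      TEXT,
        ERROR_MSG        TEXT,
        TRACE_FILE       TEXT,
        INSTANCE_ID      TEXT,
        SESSION_ID       TEXT,
        SERIAL_NO        INTEGER
    )
""")


def record_event(job_id: int, stage: str, event: str,
                 start: datetime, end: datetime, rows: int, status: str,
                 description: str = "", error_msg: str = "",
                 trace_file: str = "", instance_id: str = "",
                 session_id: str = "", serial_no: int = 0) -> None:
    """Insert one event row; EVENT_INTERVAL is derived from the timestamps."""
    conn.execute(
        "INSERT INTO cdc_jobs_events VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?)",
        (job_id, event, stage, start.isoformat(), end.isoformat(),
         (end - start).total_seconds(), rows, status, description,
         error_msg, trace_file, instance_id, session_id, serial_no),
    )
    conn.commit()


start = datetime.now(timezone.utc)
end = datetime.now(timezone.utc)
record_event(42, "STG_LOAD", "DELTA_MERGE", start, end,
             rows=1250, status="SUCCESS")
```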