- Company Name: Queen Square Recruitment
- Job Title: Mainframe Data Architect
- Job Description:
Role Summary: Lead the migration of critical business data from Db2 on z/OS to Amazon Aurora PostgreSQL, designing and implementing change‑data‑capture (CDC) pipelines, ensuring zero‑surprise cutovers, and architecting event‑driven systems.
Expectations: Deliver a clean, validated migration within a 6‑month contract, meet governance controls, and support ongoing operational observability.
Key Responsibilities:
- Design and build CDC pipelines (IBM CDC/Precisely) with subscription, bookmark, and replay/backfill logic.
- Translate Db2 schemas to Aurora PostgreSQL, handling logical/physical modelling, referential integrity, and indexing/partitioning.
- Develop integration pipelines (Db2 → Aurora via Kafka/S3), enforcing reliability, ordering, and merge/UPSERT logic.
- Perform data encoding transformations (EBCDIC → UTF-8, packed decimals) and create robust validation suites; a brief illustrative sketch of this kind of conversion follows this list.
- Use migration tooling (AWS Glue, Athena, Redshift) for schema conversion and analytics.
- Implement Infrastructure‑as‑Code (Terraform) and CI/CD pipelines (GitLab).
- Plan and execute dual‑run cutovers, reconciliation, rollback, and governance.
- Build observability dashboards (CloudWatch, Grafana) for lag, throughput, errors, and cost.
- Apply Domain‑Driven Design and event‑driven architecture principles for CDC as event streams.
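
To illustrate the encoding work referenced above, here is a minimal Python sketch of decoding an EBCDIC text field and unpacking a COMP-3 (packed decimal) field. The cp037 code page, field contents, and decimal scale are assumptions made for the example, not details of this engagement.

```python
# Illustrative sketch only: decoding an EBCDIC text field and an IBM packed-decimal
# (COMP-3) field as part of a Db2 -> Aurora migration. The cp037 code page, sample
# bytes, and scale are assumptions for the example.
from decimal import Decimal

def decode_ebcdic(raw: bytes) -> str:
    """Convert an EBCDIC (code page 037) byte string to Python text."""
    return raw.decode("cp037").rstrip()

def unpack_comp3(raw: bytes, scale: int = 2) -> Decimal:
    """Unpack a packed-decimal (COMP-3) field: two BCD digits per byte,
    with the low nibble of the final byte holding the sign (0xD = negative)."""
    digits, sign = [], ""
    for i, byte in enumerate(raw):
        high, low = byte >> 4, byte & 0x0F
        digits.append(str(high))
        if i < len(raw) - 1:
            digits.append(str(low))
        elif low == 0x0D:
            sign = "-"
    return Decimal(sign + "".join(digits)) / (Decimal(10) ** scale)

# Example fields: "Hello" in EBCDIC, and -12345.67 packed with two decimal places.
print(decode_ebcdic(bytes([0xC8, 0x85, 0x93, 0x93, 0x96])))   # Hello
print(unpack_comp3(bytes([0x12, 0x34, 0x56, 0x7D])))          # -12345.67
```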
Required Skills:
- Expertise in CDC tools (IBM, Precisely) and subscription management.
- Deep knowledge of Db2, z/OS cataloging, batch windows, and performance tuning.
- Proficiency in relational modelling (PostgreSQL/Aurora), normalization, denormalization, and partitioning.
- Experience with Kafka integration, Python/SQL troubleshooting, and UPSERT/MERGE logic (see the illustrative sketch after this list).
- Strong data quality mindset: validation tests, reconciliation, and schema validation.
- Familiarity with Domain‑Driven Design, CQRS, and event‑driven architecture.
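
As an illustration of the UPSERT/MERGE skill noted above, the sketch below applies a single CDC change record to Aurora PostgreSQL idempotently, skipping stale or replayed events. The table, columns, cdc_sequence ordering column, and connection string are hypothetical placeholders, not part of this role's actual design.

```python
# Illustrative sketch only: idempotently applying one CDC change record to Aurora
# PostgreSQL with INSERT ... ON CONFLICT (UPSERT). Table, columns, the cdc_sequence
# ordering column, and the connection string are hypothetical placeholders.
import psycopg2

UPSERT_SQL = """
INSERT INTO customer (customer_id, name, balance, cdc_sequence)
VALUES (%(customer_id)s, %(name)s, %(balance)s, %(cdc_sequence)s)
ON CONFLICT (customer_id) DO UPDATE
SET name = EXCLUDED.name,
    balance = EXCLUDED.balance,
    cdc_sequence = EXCLUDED.cdc_sequence
WHERE customer.cdc_sequence < EXCLUDED.cdc_sequence  -- ignore stale / replayed changes
"""

def apply_change(conn, change: dict) -> None:
    """Apply a single CDC change; replay-safe because older sequence numbers are skipped."""
    with conn.cursor() as cur:
        cur.execute(UPSERT_SQL, change)
    conn.commit()

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=target host=aurora.example.internal user=migrator")
    apply_change(conn, {"customer_id": 42, "name": "ACME LTD",
                        "balance": 1250.00, "cdc_sequence": 1001})
```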
Desirable: IBM zDIH patterns, zIIP tuning, COBOL copybook/VSAM ingestion.
Required Education & Certifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. Relevant certifications in mainframe technologies, AWS, or database design are a plus.