**Company:** Zego
**Job Title:** Head of Data Engineering
**Job Description:**
**Role Summary:** Lead a high‑performing team of four Data Engineers to design, build, and maintain a scalable, resilient data platform that supports product, analytics, and business needs while upholding data privacy and governance.
**Expectations:** Deliver a robust data architecture aligned with business strategy, foster a culture of mentorship and continuous improvement, and champion modern data engineering practices across the organization.
**Key Responsibilities:**
- Lead, mentor, and grow the Data Engineering team, setting high standards for quality and ownership.
- Define and execute the data engineering roadmap in partnership with Engineering, Analytics, Product, and Pricing leaders.
- Own end‑to‑end data platform architecture: ingestion, transformation, warehousing, governance, observability, and real‑time processing.
- Oversee the design and evolution of ETL/ELT pipelines, data models, and data lake or lakehouse structures.
- Drive adoption of CI/CD, infrastructure‑as‑code, testing, and observability tooling.
- Identify, evaluate, and implement technology upgrades or re‑architectures to improve performance, cost, and maintainability.
- Collaborate with cross‑functional stakeholders to translate business requirements into technical solutions.
- Manage data privacy, security, and compliance across all data workflows.
**Required Skills:**
- 2+ years leading a Data Engineering team (Technical Lead or Manager).
- 5+ years as a Data Engineer with proven experience building scalable platforms in product‑oriented or high‑growth environments.
- Deep proficiency in Python and SQL, plus hands‑on experience with Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark.
- Hands‑on experience with AWS services, Docker, Terraform, and IaC best practices.
- Strong understanding of data governance, observability, and data quality frameworks.
- Excellent stakeholder communication: able to translate complex technical concepts into business‑focused language.
- Proven mentorship and coaching ability, fostering a culture of continuous learning.
- Problem‑solving mindset that balances technical depth with business priorities.
**Nice to have:**
- Experience with Data Mesh or Data Lakehouse architectures.
- Familiarity with Kubernetes, Kafka, Kinesis, and real‑time streaming pipelines.
- Exposure to ML engineering pipelines or MLOps frameworks.
**Required Education & Certifications:**
- Bachelor’s or higher degree in Computer Science, Software Engineering, Data Engineering, or a related technical field.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Snowflake Professional, dbt Certified Engineer) are advantageous but not mandatory.