- Company Name: HARDIS TECH SERVICES
- Job Title: Data Integration & Engineering Expert (M/F)
- Job Description:
Data Integration & Engineering Expert
Role Summary: Design, develop, and industrialize data architectures (ETL/ELT, data warehouses, lakehouses) for clients across diverse industries, leveraging cloud platforms (Snowflake, AWS, Azure) and integrating AI capabilities.
Expectations: 7+ years of professional experience in data integration and modeling, excluding internships. Proficiency in ETL/ELT tools, cloud architecture, and automation.
Key Responsibilities
- Architect and implement data pipelines, workflows, and APIs for data integration, transformation, and delivery.
- Optimize and industrialize data processing pipelines (Talend, dbt, Snowflake) and ensure CI/CD, monitoring, and data quality.
- Deploy and automate data systems, including MLOps practices for AI model integration (NLP, computer vision, optimization).
- Provide technical direction on innovative solutions, including AI/ML use cases, and contribute to technical roadmaps.
- Collaborate cross-functionally on projects across sectors (finance, logistics, energy, retail) with autonomy and end-to-end ownership.
Required Skills
- ETL/ELT tools: Talend, Informatica, Azure Data Factory, dbt.
- Cloud platforms: Snowflake, AWS, Azure.
- Databases: PostgreSQL, Oracle, BigQuery.
- Development: SQL, Python, Java; CI/CD; automated testing; data quality frameworks.
- AI/ML: Applied experience with machine learning models and APIs (NLP, computer vision, generative AI).
- Data architecture: Data warehouse, lakehouse, data mesh, data catalog.
- Languages: Fluent written and spoken French and English required.
Required Education & Certifications
- Bachelor’s or Master’s in Computer Science/Data Engineering.
- Mandatory: 7+ years of verified work experience in data integration/modeling.
- Preferred (not mandatory): Certifications in Talend, Snowflake, GCP, dbt, Power BI, or AI/ML.