Yoh, A Day & Zimmermann Company

Senior Data Engineer – Airflow, dbt Core, Kubernetes/OpenShift (CP) - NJ - No Corp-to-Corp - W2 only - Local candidates only

Hybrid

Jersey City, United States

$80/hour

Senior

Freelance

02-02-2026


Skills

Python, SQL, Data Engineering, Apache Airflow, CI/CD, Kubernetes, OpenShift, Monitoring, Training, Git, Accounting, Analytics

Job Specifications

Please contact:

Renu Goel
857-207-2676
renu.goel@yoh.com

W2 only; no Corp-to-Corp
Financial services

Three days onsite per week. Local to NJ/NYC candidates only.

Senior Data Engineer – Airflow, dbt Core, Kubernetes/OpenShift

Location:
Hybrid (three days onsite per week; local to NJ/NYC)

Job Summary:
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift).
This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.

Key Responsibilities:
Data Pipeline & Orchestration
Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines
Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads
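For illustration only (not part of the formal requirements), the sketch below shows the kind of retry, SLA, dependency, and alerting configuration this role works with in Airflow. The DAG name, task callables, and alert address are hypothetical, and the schedule keyword assumes Airflow 2.4+.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Hypothetical extract step; a real pipeline would pull from a source system."""
    ...


def load(**context):
    """Hypothetical load step; a real pipeline would write to the warehouse."""
    ...


default_args = {
    "owner": "data-engineering",
    "retries": 3,                          # automatic retries on transient failures
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=2),             # surfaces SLA misses for monitoring
    "email_on_failure": True,
    "email": ["data-alerts@example.com"],  # placeholder alert address
}

with DAG(
    dag_id="finance_daily_load",           # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                     # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    max_active_runs=1,                     # limit concurrent runs of this pipeline
    default_args=default_args,
    tags=["finance"],
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task              # explicit dependency management
```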
dbt Core & Data Modeling
Lead dbt Core implementation, including project structure, environments, and CI/CD integration
Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
Optimize dbt query performance for large-scale datasets and downstream reporting needs
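As a sketch of how a dbt Core project can be wired into CI/CD or an orchestration step, the snippet below uses dbt's programmatic invocation API (available in dbt Core 1.5+). The project directory, selector, and state path are hypothetical.

```python
# Minimal sketch of invoking dbt Core from Python, e.g. inside a CI job or an
# Airflow task. Requires dbt-core >= 1.5; paths and selectors are placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# "build" runs models, tests, snapshots, and seeds; state-based selection keeps
# CI runs incremental by only building modified models and their children.
res: dbtRunnerResult = dbt.invoke(
    [
        "build",
        "--project-dir", "analytics",        # hypothetical dbt project path
        "--select", "state:modified+",
        "--state", "target/prod-manifest",   # hypothetical previous-state artifacts
    ]
)

if not res.success:
    raise RuntimeError(f"dbt build failed: {res.exception}")
```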
Cloud, Kubernetes & OpenShift
Deploy and manage data workloads on Kubernetes / OpenShift platforms
Design strategies for workload distribution, horizontal scaling, and resource optimization
Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
Troubleshoot container-level performance issues and resource contention
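Purely as an illustration, one common pattern when Airflow tasks run on Kubernetes/OpenShift under the KubernetesExecutor is to set per-task CPU/memory requests and limits through a pod override, as sketched below; the DAG name, task, and resource values are placeholders to be tuned per workload.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from kubernetes.client import models as k8s


def transform(**context):
    """Hypothetical memory-heavy transformation step."""
    ...


with DAG(
    dag_id="k8s_resource_tuning_example",    # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Per-task CPU/memory requests and limits under the KubernetesExecutor.
    # "base" is the container name Airflow uses for the task container.
    heavy_transform = PythonOperator(
        task_id="transform_large_partition",
        python_callable=transform,
        executor_config={
            "pod_override": k8s.V1Pod(
                spec=k8s.V1PodSpec(
                    containers=[
                        k8s.V1Container(
                            name="base",
                            resources=k8s.V1ResourceRequirements(
                                requests={"cpu": "1", "memory": "2Gi"},
                                limits={"cpu": "2", "memory": "4Gi"},
                            ),
                        )
                    ]
                )
            )
        },
    )
```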
Performance & Reliability
Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
Identify bottlenecks in query execution, orchestration, and infrastructure
Implement observability solutions (logs, metrics, alerts) for proactive issue detection
Ensure high availability, fault tolerance, and resiliency of data pipelines
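Again as a sketch only, task-level alerting in Airflow can be wired through failure callbacks; the example below just logs the failure, whereas a real deployment would typically push the alert to Slack, PagerDuty, email, or similar. Names are hypothetical.

```python
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def notify_failure(context):
    """Failure callback: records the failing task. A real implementation would
    forward this to a paging or chat system rather than only logging it."""
    ti = context["task_instance"]
    log.error(
        "Task %s in DAG %s failed on try %s",
        ti.task_id, ti.dag_id, ti.try_number,
    )


def flaky_step(**context):
    """Hypothetical step that may fail and trigger the callback."""
    ...


with DAG(
    dag_id="observability_example",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
    default_args={"on_failure_callback": notify_failure},
) as dag:
    PythonOperator(task_id="flaky_step", python_callable=flaky_step)
```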
Collaboration & Governance
Work closely with data architects, platform engineers, and business stakeholders
Support financial reporting, accounting, and regulatory data use cases
Enforce data engineering standards, security best practices, and governance policies

Required Skills & Qualifications:
Experience
10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
Proven experience designing and supporting enterprise-scale data platforms in production environments
Must-Have Technical Skills
Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
Expert-level dbt Core (data modeling, testing, macros, implementation)
Strong proficiency in Python for data engineering and automation
Deep understanding of Kubernetes and/or OpenShift in production environments
Extensive experience with distributed workload management and performance optimization
Strong SQL skills for complex transformations and analytics
Cloud & Platform Experience
Experience running data platforms on cloud environments
Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
Preferred Qualifications
Experience supporting financial services or accounting platforms
Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
Experience with data warehouses (Oracle)

Estimated Min Rate: $56.00
Estimated Max Rate: $80.00

What’s In It for You?
We welcome you to join one of the largest and most legendary global staffing companies to meet your career aspirations. Yoh's network of client companies has been employing professionals like you for over 65 years in the U.S., UK, and Canada. Join Yoh's extensive talent community to gain access to our vast network of opportunities, including exclusive roles like this one. Benefit eligibility is in accordance with applicable laws and client requirements. Benefits include:

Medical, Prescription, Dental & Vision Benefits (for employees working 20+ hours per week)
Health Savings Account (HSA) (for employees working 20+ hours per week)
Life & Disability Insurance (for employees working 20+ hours per week)
MetLife Voluntary Benefits
Employee Assistance Program (EAP)
401K Retirement Savings Plan
Direct Deposit & weekly ePayroll
Referral Bonus Programs
Certification and training opportunities

Note: Any pay ranges displayed are estimates. Actual pay is determined by an applicant's experience, technical expertise, and other qualifications as listed in the job description. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.

About the Company

At Yoh, we focus on helping you precisely navigate and fulfill your talent demands. Are you securing the right talent pipelines? Seeking the truth about your talent needs and processes? Start leveraging our deep industry expertise today. Yoh covers your diverse talent and specialized resource needs in the areas of IT, Fintech, Cloud Computing & Migration, Cybersecurity, Product Engineering, Healthcare, Life Sciences, and Interactive, Media & Entertainment. You can be confident that we have the right subject-matter experts...