Hays

Cloud Engineer

Hybrid

Glasgow, United Kingdom

£350/day

Senior

Freelance

05-02-2026


Skills

Python · Data Engineering · Encryption · CI/CD · Monitoring · AWS CloudFormation · AWS · PySpark

Job Specifications

Description

CONTRACTOR MUST BE ELIGIBLE FOR BPSS

Role Title: AWS Engineer (Contract)

Location: Glasgow

Rate: £350/day via umbrella

Duration: until 31/12/2026

Days on site: 2-3

Role Description:

We are seeking a highly skilled Senior AWS Data Engineer with strong hands-on experience building scalable, secure, and automated data platforms on AWS. The ideal candidate will have deep expertise in AWS CloudFormation, data ingestion and transformation services, Python-based ETL development, and orchestration workflows. This role will focus on designing, implementing, and optimising end-to-end data pipelines, ensuring data quality, reliability, and governance across cloud-native environments.

Key Responsibilities

Data Engineering & Pipeline Development

• Design, develop, and maintain large-scale data pipelines using AWS services such as Glue, Lambda, Step Functions, EMR, DynamoDB, S3, Athena, and other ETL/ELT components.

• Build automated ingestion, transformation, and enrichment workflows for structured and unstructured datasets.

• Implement reusable data engineering frameworks and modular components using Python, PySpark, and AWS-native tooling.

Cloud Infrastructure for Data Platforms

• Develop and manage AWS CloudFormation templates for provisioning secure, scalable data engineering infrastructure.

• Optimize data storage strategies (S3 layouts, partitioning, compression, lifecycle rules).

• Configure and maintain compute services for data workloads (Lambda, ECS, EC2, EMR).

Automation & Orchestration

• Build and enhance orchestration flows using AWS Step Functions, EventBridge, and Glue Workflows.

• Implement CI/CD practices for data pipelines and infrastructure automation.

Security, Governance & Best Practices

• Apply strong authentication/authorization mechanisms using IAM, KMS, access policies, and data access controls.

• Ensure compliance with enterprise security standards, encryption requirements, and governance frameworks.

• Implement data quality checks, schema validation, lineage tracking, and metadata management.

Collaboration & Troubleshooting

• Work with data architects, platform engineers, analysts, and cross-functional stakeholders to deliver high-quality datasets.

• Troubleshoot pipeline issues, optimize performance, and improve reliability and observability across the data platform.

• Drive continuous improvement in automation, monitoring, and operational efficiency.

Required Skills & Experience

• 8+ years of hands-on experience as a Data Engineer with strong AWS expertise.

• Expert-level proficiency in AWS CloudFormation (mandatory).

• Strong experience with AWS data and compute services:

– Glue, Lambda, Step Functions, EMR

– S3, DynamoDB, Athena

– ECS/EC2 for data workloads where relevant

• Solid experience building ETL/ELT pipelines using Python (and ideally PySpark).

• Strong knowledge of IAM, KMS, encryption, and AWS security fundamentals.

• Ability to design and implement authentication/authorization patterns (OAuth2, API security, IAM roles & policies).

• Strong understanding of distributed systems, data modelling, modern data architectures, and cloud-native design.

• Experience deploying pipelines using CI/CD practices and automated workflows.

About the Company

We are leaders in specialist recruitment and workforce solutions, offering advisory services such as learning and skill development, career transitions and employer brand positioning. As the Leadership Partner to our customers, we invest in lifelong partnerships that empower people and businesses to succeed. We help you achieve your career goals and deliver your business needs by combining meaningful innovation with our global scale and insights. Last year we helped over 280,000 people find their next career.