McCabe & Barton

www.mccabebarton.com

8 Jobs

20 Employees

About the Company

McCabe & Barton are part of the Holley Holland Group.
We have built our reputation over 20 years on successfully delivering cross-functional results across Technology & Business, Change and Transformation, and we are widely regarded as an industry leader in recruitment research and execution.

By developing long-term relationships with our clients, acting as an extension of their business, and identifying and engaging suitable candidates on their behalf, we have established ourselves as a trusted partner.

We are a member of APSCO and recently completed a successful audit.

Listed Jobs

Company Name
McCabe & Barton
Job Title
Data Engineering Manager
Job Description
Job Title: Data Engineering Manager

Role Summary:
Lead the design, improvement, and maintenance of enterprise-grade data pipelines and architectures, steering a small team of data engineers within a fast-paced, Agile environment.

Expectations:
- Supervise and mentor 3–5 data engineers, fostering collaboration and continuous improvement.
- Champion data quality, reliability, and performance across all pipeline stages.
- Ensure alignment with cloud infrastructure best practices and regulatory requirements.

Key Responsibilities:
- Design, implement, and optimize end-to-end data pipelines that integrate Snowflake, Azure Data Factory, and Azure DevOps (see the sketch following this description).
- Conduct code reviews, enforce coding standards, and promote reusable, modular architecture.
- Collaborate with data scientists, analysts, and product stakeholders to translate business requirements into scalable data solutions.
- Monitor pipeline performance, troubleshoot issues, and implement automated alerting and remediation.
- Lead continuous improvement initiatives for data tooling, documentation, and process efficiency.
- Manage deployment pipelines, versioning, and release cycles in a DevOps-driven workflow.
- Keep up to date with industry trends, emerging technologies, and best practices in data engineering.

Required Skills:
- Advanced SQL programming and experience with Snowflake.
- Proficiency in the Azure ecosystem (Azure Data Factory, Azure DevOps, Azure Storage).
- Strong programming background (Python, plus familiarity with Java or C++).
- Proven experience managing small data engineering teams in Agile environments.
- Excellent problem-solving, communication, and stakeholder management skills.
- Demonstrated ability to design scalable, fault-tolerant data pipelines.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, or a related field.
- Professional certifications such as Microsoft Certified: Azure Data Engineer Associate or Snowflake SnowPro Core are highly desirable.
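For context on the stack named above, here is a minimal, hypothetical sketch in Python of one pipeline stage: copying staged files into a Snowflake table with the snowflake-connector-python package. All account, warehouse, schema, stage, and table identifiers are placeholders, and this illustrates the general technique only; per the description, the real pipeline would be orchestrated through Azure Data Factory and Azure DevOps.

```python
# Minimal sketch of one pipeline stage: load staged files into Snowflake.
# All identifiers (account, warehouse, database, stage, table) are hypothetical.
import os

import snowflake.connector  # pip install snowflake-connector-python


def load_staged_orders() -> int:
    """Copy newly staged files into a raw table and return rows loaded."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="RAW",        # hypothetical database
        schema="SALES",        # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # COPY INTO skips files Snowflake has already loaded, which keeps
        # reruns of this pipeline stage safe.
        cur.execute(
            "COPY INTO RAW.SALES.ORDERS FROM @SALES_STAGE "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        return cur.rowcount or 0
    finally:
        conn.close()


if __name__ == "__main__":
    print(f"rows loaded: {load_staged_orders()}")
```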
London, United Kingdom
Remote
25-11-2025
Company Name
McCabe & Barton
Job Title
Head of Development
Job Description
Job Title: Head of Development

Role Summary:
Interim leader responsible for directing a team of 10–20 engineers to deliver robust automation, AI/ML, and software development initiatives, driving organizational efficiency and aligning technical roadmaps with business goals.

Expectations:
Deliver a high-velocity, scalable technology pipeline over a 6-month engagement, ensure measurable ROI on automation and AI projects, and establish lasting governance for coding standards, DevOps, and SRE practices.

Key Responsibilities:
- Lead, mentor, and scale a distributed development team, setting culture, standards, and backlog priorities.
- Own the end-to-end automation strategy, identifying and executing efficiency opportunities via GitHub Actions, Terraform, Ansible, and RPA tools.
- Design, implement, and integrate AI/ML solutions, managing MLOps workflows and large-language-model deployments.
- Define and enforce coding standards, architecture decisions, and technical debt reduction plans.
- Develop and communicate technical roadmaps, SLO/SLA metrics, and ROI analyses to C-suite stakeholders (see the worked example following this description).
- Implement CI/CD, containerization, observability, and incident-management best practices across cloud-native microservices.
- Facilitate cross-functional workshops, change-management initiatives, and vendor/tool evaluations.

Required Skills:
- Proven track record leading engineering teams of 10–20+ members.
- Deep expertise in automation (GitHub Actions, Terraform, Ansible, Airflow, Prefect, RPA).
- Full-stack development (Python, JavaScript/TypeScript, Java, Go) with modern frameworks (React, Node.js, Django, FastAPI).
- Microservices, API design (REST/GraphQL), Docker, Kubernetes, and cloud-native patterns.
- DevOps & SRE: CI/CD pipelines, infrastructure monitoring, log aggregation, SLOs/SLAs, incident response.
- AI/ML: MLOps tools, LLM integration, proof-of-concept development.
- Strategic business acumen: technical roadmaps, cost-benefit analysis, ROI presentation.
- Strong stakeholder management, communication, and organizational change skills.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Certifications: Certified Kubernetes Administrator (CKA) or similar; CI/CD and deployment tooling certifications; cloud provider certifications (AWS, GCP, Azure) preferred.
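As a concrete illustration of the SLO/SLA work listed under Key Responsibilities, the short Python sketch below computes the downtime "error budget" implied by an availability target. The 99.9% target and 30-day window are illustrative assumptions, not figures from this role.

```python
# Worked example: downtime error budget implied by an availability SLO.
# The 99.9% target and 30-day window are illustrative assumptions only.

def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime in the window for a given SLO."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1.0 - slo)


if __name__ == "__main__":
    # A 99.9% SLO over 30 days leaves roughly 43.2 minutes of downtime budget.
    print(f"{error_budget_minutes(0.999):.1f} minutes/month")
```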
London, United Kingdom
On site
25-11-2025
Company Name
McCabe & Barton
Job Title
Project Manager
Job Description
**Job Title** Project Manager

**Role Summary**
Lead the delivery of cloud-based and outsourced IT solutions for an investment management client. Combine project management and business analysis functions to define requirements, coordinate multiple vendors, ensure successful implementation, and drive adoption of third-party systems while exploring AI integration to enhance efficiency and decision-making.

**Expectations**
- Deliver projects on scope, time, cost, and quality targets.
- Maintain strong vendor performance and contractual compliance.
- Champion AI-enabled process improvements to reduce operational risk.
- Provide clear, data-driven status reporting to stakeholders.

**Key Responsibilities**
- Develop detailed project plans, schedules, budgets, and risk registers.
- Conduct business requirements gathering, process mapping, and stakeholder workshops.
- Evaluate, select, and manage third-party vendors and service providers.
- Negotiate contracts, manage scopes, and ensure adherence to procurement standards.
- Oversee system implementation, integration (APIs, connectors), testing, and go-live activities.
- Facilitate user training, change management, and adoption drives.
- Monitor post-implementation performance against KPIs and implement corrective actions.
- Ensure IT governance compliance and regulatory alignment.
- Identify opportunities to embed Generative AI/Agentic AI tools in workflows and lead pilot initiatives.

**Required Skills**
- Proven experience delivering SaaS or cloud-based business systems in the investment/financial services sector.
- Strong business analysis capability: requirements elicitation, process modeling, stakeholder engagement.
- Vendor and contract management proficiency with procurement and evaluation experience.
- Solid understanding of cloud delivery models (SaaS, PaaS, APIs) and related integrations.
- Excellent communication, documentation, and influencing skills.
- Highly organized, detail-oriented, and able to manage multiple concurrent priorities.
- Results-driven with a track record of meeting or exceeding project objectives.

**Required Education & Certifications**
- Project Management certification (PRINCE2, AgilePM, PMP) **or** Business Analysis qualification (BCS, IIBA).
- Degree or equivalent in IT, Computer Science, Business, or a related discipline.
London, United Kingdom
On site
19-12-2025
Company Name
McCabe & Barton
Job Title
Data Platform Engineer
Job Description
Job Title: Data Platform Engineer

Role Summary:
Design, develop, and maintain scalable, secure, cloud-based data platforms on Microsoft Azure and Databricks. Deliver performant data pipelines, data lake architecture, and analytics environments that support business insights and decision-making.

Expectations:
- 5+ years of professional experience with Azure data services (Data Factory, ADLS, Synapse, Azure SQL).
- Proven expertise in Databricks, Delta Lake, and cluster management.
- Strong coding skills in Python and SQL; experience with PySpark or Scala.
- Hands-on experience with IaC (Terraform, ARM templates) and CI/CD pipelines.
- Knowledge of data modelling, governance, and security best practices.
- A collaborative, mentoring approach when working with data scientists, analysts, and stakeholders.

Key Responsibilities:
- Build and optimize data pipelines using Azure Data Factory and Databricks.
- Develop ETL/ELT processes to transform raw data into analytics-ready formats.
- Architect and deploy data lake solutions in Azure Data Lake Storage, implementing medallion layers and Delta Lake (see the sketch following this description).
- Implement governance, security controls, and backup/disaster-recovery strategies.
- Use Terraform or equivalent IaC tools for reproducible infrastructure deployments.
- Configure Databricks clusters, write jobs in PySpark/Scala, and manage CI/CD for Databricks artifacts.
- Monitor performance with Azure Monitor, Log Analytics, and Databricks monitoring tools; tune for SLA compliance.
- Collaborate across teams to define requirements, document designs, and share knowledge.

Required Skills:
- Microsoft Azure data services (Data Factory, ADLS, Synapse, Azure SQL Database).
- Databricks, Delta Lake, and cluster/instance configuration.
- Python, SQL, and PySpark/Scala development.
- Git/GitHub, CI/CD pipelines, and familiarity with Azure DevOps or equivalent.
- Data modelling, governance, and security concepts.
- Performance tuning, monitoring, and backup/recovery practices.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
- Microsoft Certified: Azure Data Engineer Associate or a Databricks certification preferred.
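To make the medallion-layer responsibility concrete, here is a minimal PySpark sketch of a bronze-to-silver promotion on Delta Lake. The storage paths, column names, and cleaning rules are hypothetical; Delta Lake support is assumed (it ships with Databricks runtimes), and this is a generic illustration of the pattern rather than the client's actual platform.

```python
# Minimal bronze -> silver promotion in a medallion architecture.
# Paths and column names are hypothetical; Delta Lake support is assumed
# (built into Databricks runtimes).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

BRONZE = "abfss://lake@account.dfs.core.windows.net/bronze/trades"  # raw, as-landed
SILVER = "abfss://lake@account.dfs.core.windows.net/silver/trades"  # cleaned, conformed

bronze = spark.read.format("delta").load(BRONZE)

# Typical silver-layer work: enforce types, deduplicate, filter bad records.
silver = (
    bronze
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .withColumn("quantity", F.col("quantity").cast("long"))
    .dropDuplicates(["trade_id"])
    .filter(F.col("quantity") > 0)
)

silver.write.format("delta").mode("overwrite").save(SILVER)
```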
London, United Kingdom
On site
Mid level
24-12-2025