Iris Software Inc.

www.irissoftware.com

59 Jobs

3,511 Employees

About the Company

Iris Software has been a trusted software engineering partner to several Fortune 500 companies for over three decades. We help clients realize the full potential of technology-enabled transformation by bringing together a unique blend of domain knowledge, best-of-breed technologies, and experience executing essential and critical application development engagements.

Our highly-experienced talent and rightsized teams help develop core, complex, and mission-critical applications and solutions for leading enterprises across Capital Markets/Front office operations, Banking, Investment Management, Brokerage, Risk and Compliance, Insurance, Healthcare/Life Sciences, and Supply Chains/Logistics.

Disclaimer: It has come to our attention that unauthorized individuals or entities are misleading the public by using our company's name, logo, and other identifying information for fraudulent purposes through www.irissoftware.org and www.irissoftware.info. Through these websites, the entity engages in deceptive practices such as making false promises and conducting unauthorized transactions in our name. The general public is hereby alerted to this deceptive entity to prevent potential harm or misinformation, and is advised not to engage in any communication or transaction with it. If you are contacted through the said websites, please report it here: https://www.irissoftware.com/careers#disclaimer

Listed Jobs

Company Name
Iris Software Inc.
Job Title
Principal Python Engineer
Job Description
Job Title: Principal Python Engineer

Role Summary:
Lead the design, development, and optimization of Directed Acyclic Graph (DAG)-based data orchestration systems. Own the end-to-end lifecycle of custom DAG engines/servers, pushing performance, latency, and resource efficiency to production-grade levels.

Expectations:
- Deliver scalable, high-throughput DAG pipelines for enterprise data workflows.
- Innovate scheduling strategies, reduce system latency, and maximize resource utilization.
- Mentor and collaborate with cross-functional teams to maintain architectural excellence.

Key Responsibilities:
- Architect and build production-ready custom DAG engines that go beyond off-the-shelf solutions (Airflow, Luigi, etc.).
- Optimize scheduling, latency, and resource management across large Python codebases.
- Profile, tune, and debug complex distributed systems; conduct root-cause analysis.
- Define and enforce coding standards, performance benchmarks, and deployment best practices.
- Work closely with data, analytics, and infrastructure teams to integrate DAG workflows into the broader data platform.

Required Skills:
- 8+ years of Python engineering experience, focused on backend and system architecture.
- Deep expertise in DAG structures, workflow scheduling, and high-performance system design.
- Proven track record designing and deploying custom DAG engines/servers.
- Strong analytical mindset; proficiency in profiling, tuning, and debugging large Python applications.
- Familiarity with cloud data platforms, containerization (Docker/Kubernetes), and CI/CD pipelines.
- Excellent communication, stakeholder management, and technical leadership abilities.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Azure Data Engineer, or equivalent) are a plus but not mandatory.
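To make the role concrete: the core of any custom DAG engine is executing tasks in dependency order. A minimal sketch of that idea in plain Python (Kahn's topological sort; illustrative only, not Iris Software's engine) might look like this:

```python
from collections import deque

def run_dag(tasks, deps):
    """Execute tasks in dependency order using Kahn's topological sort.

    tasks: dict mapping task name -> zero-argument callable.
    deps:  dict mapping task name -> list of upstream task names.
    Returns the list of task names in the order they ran.
    Raises ValueError if the graph contains a cycle.
    """
    indegree = {name: 0 for name in tasks}
    downstream = {name: [] for name in tasks}
    for task, upstreams in deps.items():
        for up in upstreams:
            indegree[task] += 1
            downstream[up].append(task)

    ready = deque(name for name, d in indegree.items() if d == 0)
    order = []
    while ready:
        name = ready.popleft()
        tasks[name]()  # run the task body itself
        order.append(name)
        for down in downstream[name]:
            indegree[down] -= 1
            if indegree[down] == 0:
                ready.append(down)

    if len(order) != len(tasks):
        raise ValueError("cycle detected in DAG")
    return order
```

A production engine would add concurrency, retries, and persistence on top of this ordering core; off-the-shelf tools like Airflow wrap the same idea in a scheduler and metadata store.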
Toronto, Canada
Hybrid
Senior
27-11-2025
Company Name
Iris Software Inc.
Job Title
Sr. Ab Initio Developer
Job Description
Job Title: Sr. Ab Initio Developer

Role Summary:
Senior engineer responsible for designing, developing, and maintaining high-performance ETL solutions on the Ab Initio platform. Collaborates with data architects, analysts, and cross-functional teams to integrate diverse data sources, ensure data quality, and provide production support and performance optimization. Provides technical leadership and mentorship to junior developers.

Expectations:
- 6+ years of experience in ETL development, with at least 3 years on Ab Initio.
- Proven track record in designing scalable data flows, data profiling, and cleansing.
- Ability to mentor and guide junior team members.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation practices.

Key Responsibilities:
- Design, develop, and implement Ab Initio ETL data flows.
- Collaborate with data architects and business analysts to translate requirements into technical solutions.
- Perform data profiling, cleansing, transformation, and lineage tracking.
- Optimize workflows and tune performance for large-volume processing.
- Maintain technical documentation and metadata repositories.
- Troubleshoot and resolve production data issues.
- Provide production support and incident response.
- Mentor junior developers and offer technical direction.
- Stay current with Ab Initio releases and industry best practices.
- Integrate Ab Initio solutions into existing data architectures and ensure seamless operation.

Required Skills:
- Expertise in Ab Initio Designer, Workbench, and Automation.
- Deep knowledge of ETL best practices and data integration patterns.
- Strong SQL and data modeling skills.
- Experience with performance tuning and workflow optimization.
- Familiarity with data quality, profiling, and cleansing techniques.
- Ability to create and maintain technical documentation and metadata.
- Excellent troubleshooting and analytical abilities.
- Strong communication and mentorship abilities.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Ab Initio Advanced Developer or equivalent certification preferred.
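Ab Initio itself is a proprietary graphical platform, but the data-profiling step the role describes is a general ETL concept. As a language-neutral sketch of what profiling computes (per-field null and distinct counts; illustrative only), in plain Python:

```python
def profile_records(records):
    """Tiny data-profiling pass: per-field null and distinct counts.

    records: list of dicts sharing the same keys (one dict per row).
    Returns {field: {"nulls": int, "distinct": int}}.
    Treats None and "" as nulls, a common profiling convention.
    """
    profile = {}
    for row in records:
        for field, value in row.items():
            stats = profile.setdefault(field, {"nulls": 0, "values": set()})
            if value is None or value == "":
                stats["nulls"] += 1
            else:
                stats["values"].add(value)
    # Collapse the value sets into distinct counts for the report
    return {f: {"nulls": s["nulls"], "distinct": len(s["values"])}
            for f, s in profile.items()}
```

In Ab Initio the equivalent statistics come from the platform's built-in profiling components rather than hand-written code, but the outputs drive the same cleansing decisions.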
Mississauga, Canada
Hybrid
01-12-2025
Company Name
Iris Software Inc.
Job Title
Data Engineer (Python and AWS)
Job Description
Job Title: Data Engineer (Python and AWS)

Role Summary:
Design, build, and maintain scalable data pipelines and services using Python on AWS. Integrate data sources, develop microservices, and implement CI/CD workflows for automated deployment and monitoring.

Expectations:
Deliver reliable, secure, and high-performance data solutions that support business analytics and data science initiatives. Collaborate with cross-functional teams to translate data requirements into technical designs and automate deployment pipelines.

Key Responsibilities:
- Develop and maintain Python-based ETL pipelines and data processing services on AWS (EC2, Lambda, S3, RDS).
- Design and expose RESTful APIs and microservices for data access and transformation.
- Implement and manage CI/CD pipelines using Jenkins, GitHub Actions, or AWS CodePipeline.
- Containerize applications with Docker and orchestrate with Kubernetes (preferred).
- Manage and query SQL and NoSQL databases; ensure efficient data storage and retrieval.
- Monitor services with CloudWatch, enforce IAM roles, and apply cloud security best practices.
- Troubleshoot performance issues and optimize data workflows.

Required Skills:
- Proficient in Python programming.
- Hands-on experience with AWS services: EC2, Lambda, S3, RDS, IAM, CloudWatch.
- Strong understanding of RESTful API design and microservices architecture.
- Experience with CI/CD tools (Jenkins, GitHub Actions, AWS CodePipeline).
- Knowledge of Docker; experience with container orchestration (Kubernetes a plus).
- Familiarity with relational and NoSQL databases.
- Awareness of cloud security best practices.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- AWS certifications (Solutions Architect or Developer) are preferred.
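A typical building block for this role is a Lambda function triggered by S3 object uploads. A minimal handler sketch (the event shape is the standard S3 put-notification format; the boto3 fetch and downstream load are left as comments so the sketch stays self-contained):

```python
import json

def handler(event, context=None):
    """Minimal AWS Lambda-style handler for S3 put events.

    Walks the standard S3 event-notification structure and collects
    the bucket/key of each uploaded object. A real pipeline would
    fetch each object with boto3 (s3.get_object), transform it, and
    load the result into RDS or a warehouse.
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # boto3 fetch + transform + load would happen here
        processed.append(f"s3://{bucket}/{key}")
    # Lambda proxy-style response: status code plus a JSON body
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

Deployment, IAM permissions (read on the source bucket, write on the target), and CloudWatch logging would be wired up through the CI/CD pipeline rather than in the handler itself.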
Toronto, Canada
Hybrid
01-12-2025
Company Name
Iris Software Inc.
Job Title
AWS Python Lead Developer
Job Description
Job Title: AWS Python Lead Developer (Cloud Architect)

Role Summary:
Lead the end-to-end design and architecture of cloud-native banking platforms using AWS services. Build and deploy scalable Python applications, data pipelines, and ML model deployment frameworks that support high-availability, regulatory-compliant financial systems.

Expectations:
- Deliver mission-critical solutions that meet PCI, encryption, IAM, and audit standards.
- Mentor and guide engineering teams, conduct design reviews, and shape the long-term architecture roadmap.
- Define and enforce cloud governance, CI/CD, DevOps, and infrastructure best practices.
- Drive modernization initiatives while ensuring alignment with enterprise security and compliance policies.

Key Responsibilities:
- Design, develop, and deploy AWS-based banking solutions (SageMaker, Lambda, API Gateway, ECS/EKS, RDS/Redshift).
- Build and optimize Python-based data pipelines and ML model production pipelines.
- Collaborate with data science and engineering teams to productionize ML/AI models on SageMaker.
- Establish cloud governance, security controls, identity & access management, and audit logging.
- Lead architectural reviews, create best-practice guidelines, and maintain technical documentation.
- Mentor junior engineers, provide architecture guidance, and promote knowledge sharing.
- Continuously evaluate emerging AWS services and technologies for potential adoption.

Required Skills:
Technical:
- Deep expertise in AWS services: SageMaker, Lambda, API Gateway, ECS/EKS, RDS, Redshift, CloudFormation/Terraform.
- Advanced Python development, including async, data processing, and ML libraries (scikit-learn, PyTorch, TensorFlow).
- CI/CD pipelines (Jenkins, CodePipeline, GitHub Actions) and infrastructure as code.
- Cloud security best practices: IAM, KMS, CloudTrail, GuardDuty, Security Hub.
- Performance tuning and cost optimization in AWS.
Soft/Leadership:
- Strong communication, mentorship, and stakeholder management.
- Proven ability to lead architecture design reviews and strategic initiatives.
- Experience in regulatory-compliant financial environments.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- AWS Certified Solutions Architect – Professional or equivalent.
- AWS Certified Developer – Associate or equivalent familiarity.
- Additional certifications in AWS DevOps, Data Analytics, or ML are a plus.
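The "advanced Python, including async" skill above refers to fanning out I/O-bound pipeline steps concurrently. A toy sketch with asyncio (the delays are synthetic stand-ins for network I/O such as reading warehouse partitions; names are illustrative):

```python
import asyncio

async def fetch_partition(name, delay):
    """Stand-in for an I/O-bound step (e.g., reading one Redshift partition)."""
    await asyncio.sleep(delay)  # simulates network latency
    return f"{name}:done"

async def run_pipeline(partitions):
    """Fan out the I/O-bound fetches concurrently; gather preserves order."""
    tasks = [fetch_partition(name, 0.01) for name in partitions]
    return await asyncio.gather(*tasks)

# Usage: asyncio.run(run_pipeline(["p1", "p2", "p3"]))
```

Because the fetches overlap, total wall time approaches the slowest single step rather than the sum of all steps, which is the core latency win async buys in data pipelines.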
Toronto, Canada
Hybrid
Senior
02-12-2025