VySystems Singapore

vysystems.com.sg

6 Jobs

7 Employees

About the Company

Vy Systems Pte Ltd, the first company in the vy.ventures family, was incorporated on 3rd May 2002 and has since provided services across many countries. Our company policies and protocols are built on a distinctive DNA that has evolved over two decades and balances IQ with EQ: it integrates emotional intelligence, analytical competence, intellectual capability, plain old common sense, and presence of mind to solve problems and make critical business decisions. We foster a people-centric culture that welcomes passionate disagreement and objective debate on the way to consensual solutions. These principles translate into stellar service and enable us to deliver on our commitments to all stakeholders. The company's values are built on transparency, trust, reliability, responsiveness, and conducting business in a soulful manner. Our expertise spans web and mobile app development, digital media, AR/VR solutions, executive and leadership hiring, staffing and recruiting, online training services, e-commerce development, and AI & ML solutions.

Listed Jobs

Company Name
VySystems Singapore
Job Title
Scala Developer
Job Description
**Job Title:** Scala Developer

**Role Summary:** Develop and optimize big data solutions using Hadoop, Spark, and Scala, with domain expertise in Liquidity Reporting and Capital Markets. Requires strong programming and communication skills.

**Expectations:**
- Expertise in Hadoop, Spark, Scala, Java, and NoSQL databases.
- Strong understanding of financial domains.
- Proven problem-solving and collaboration abilities.

**Key Responsibilities:**
- Design and optimize Hadoop-based data processing solutions.
- Build and maintain applications using Apache Spark and Scala.
- Develop data pipelines with Hadoop ecosystem tools.
- Manage NoSQL databases (Cassandra) for high-volume data.
- Create backend components in Java.
- Support liquidity reporting and capital markets use cases.
- Analyze performance bottlenecks and implement optimizations.
- Present technical solutions to stakeholders.

**Required Skills:** Hadoop development, Apache Spark, Scala programming, Java development, Cassandra, Liquidity Reporting, Capital Markets, communication/presentation.

**Required Education & Certifications:** Not specified.
Toronto, Canada
Hybrid
22-01-2026
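The core of the role above is grouped aggregation over large data sets. As a hedged sketch only: Spark's `Dataset.groupBy("bucket").sum("amount")` follows the same shape as the dependency-free Java below (Java also appears in the posting's required skills); the bucket names and figures are illustrative, not from the posting.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Minimal sketch of the aggregation a liquidity-reporting job performs:
// summing cash flows per maturity bucket. Plain collections stand in for
// Spark so the example runs without a cluster; bucket labels are hypothetical.
public class LiquidityRollup {
    static Map<String, Double> totalByBucket(List<Map.Entry<String, Double>> flows) {
        Map<String, Double> totals = new TreeMap<>(); // sorted, deterministic output
        for (Map.Entry<String, Double> f : flows) {
            totals.merge(f.getKey(), f.getValue(), Double::sum); // accumulate per bucket
        }
        return totals;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Double>> flows = List.of(
                Map.entry("0-30d", 120.0),
                Map.entry("0-30d", 80.0),
                Map.entry("31-90d", 50.0));
        System.out.println(totalByBucket(flows)); // prints {0-30d=200.0, 31-90d=50.0}
    }
}
```

In Spark the same logic would be distributed across executors; the per-key merge semantics are identical.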
Company Name
VySystems Singapore
Job Title
Akamai Business System Analyst + CASA
Job Description
**Job Title:** Business System Analyst – API Security and Governance (CASA)

**Role Summary:** Serve as the primary liaison between cybersecurity, development, architecture, and business teams to establish, maintain, and continuously improve API security and governance capabilities. Translate regulatory and business needs into actionable API security requirements, ensure compliance with enterprise standards, and drive adoption of best practices across the API lifecycle.

**Expectations:**
- Certified API Security Analyst (CASA) certification – optional but advantageous.
- Strong knowledge of API security protocols (OAuth 2.0, OpenID Connect, JWT, mTLS) and regulatory frameworks (PCI DSS, GDPR, Open Banking).
- Proven experience leading API governance initiatives, vulnerability assessments, and penetration testing remediation.
- Excellent communication skills for facilitating workshops, training, and cross-functional collaboration.
- Ability to produce clear documentation, traceability matrices, and compliance reports.

**Key Responsibilities:**
1. Gather, document, and translate business, risk, and regulatory requirements into API security and governance specifications.
2. Assess current and target-state API lifecycle processes; recommend enhancements to strengthen security posture.
3. Define, validate, and enforce controls for authentication, authorization, encryption, rate limiting, and threat detection on APIs.
4. Align all security practices with corporate InfoSec standards and industry best practices.
5. Conduct and support vulnerability assessments, penetration tests, and remediation of API security findings.
6. Design, implement, and manage governance processes covering API design, onboarding, publishing, versioning, monitoring, and decommissioning.
7. Ensure compliance with the company API governance framework and external regulatory obligations; prepare audit-ready documentation.
8. Maintain accurate metadata and end-to-end traceability across the API catalog.
9. Act as a liaison among cybersecurity, development, architecture, risk, and business stakeholders.
10. Lead workshops, training sessions, and adoption initiatives for API security best practices.
11. Oversee requirement traceability from development through testing to deployment.
12. Manage integration of APIs with monitoring and logging platforms (e.g., Akamai, Splunk, Apigee, MuleSoft).
13. Generate governance scorecards, compliance reports, and metrics for leadership and audit teams.
14. Identify, assess, and manage risks, dependencies, and change requests throughout the API lifecycle.

**Required Skills:**
- API security architecture and best practices
- OAuth 2.0, OpenID Connect, JWT, mTLS, encryption, rate limiting, threat detection
- Regulatory compliance (PCI DSS, GDPR, Open Banking)
- Vulnerability assessment, penetration testing, remediation
- API lifecycle management and governance frameworks
- Documentation, traceability, and metadata management
- Strong stakeholder communication and workshop facilitation
- Experience with API monitoring/logging platforms (Akamai, Splunk, Apigee, MuleSoft)
- Project management and risk assessment

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Security, or a related field, or equivalent experience.
- Certified API Security Analyst (CASA) preferred; other API or security certifications (e.g., CISSP, CISA, CISM, OCP) are advantageous.
Toronto, Canada
Hybrid
29-01-2026
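One narrow control from the role above (authentication on APIs) can be sketched with only the JDK: checking that a bearer token is at least structurally a JWT, i.e. three base64url segments per RFC 7519's compact serialization. This is a hedged illustration with hypothetical names; real validation (signature, expiry, audience, issuer) belongs to a vetted JWT library, not hand-rolled code.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Structural check only: a compact-serialized JWT is
// base64url(header) "." base64url(payload) "." base64url(signature).
// This does NOT validate the signature or claims.
public class JwtShapeCheck {
    static boolean looksLikeJwt(String token) {
        String[] parts = token.split("\\.", -1);
        if (parts.length != 3) return false;
        try {
            Base64.getUrlDecoder().decode(parts[0]); // header must decode
            Base64.getUrlDecoder().decode(parts[1]); // payload must decode
            return true;
        } catch (IllegalArgumentException e) {
            return false; // not valid base64url
        }
    }

    public static void main(String[] args) {
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
        String token = enc.encodeToString("{\"alg\":\"HS256\"}".getBytes(StandardCharsets.UTF_8))
                + "." + enc.encodeToString("{\"sub\":\"demo\"}".getBytes(StandardCharsets.UTF_8))
                + "." + enc.encodeToString("sig".getBytes(StandardCharsets.UTF_8));
        System.out.println(looksLikeJwt(token));       // true
        System.out.println(looksLikeJwt("not-a-jwt")); // false
    }
}
```

In a gateway such as Akamai or Apigee this class of check is configured as a policy rather than coded by hand; the sketch only shows what the policy enforces.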
Company Name
VySystems Singapore
Job Title
Java Kafka Developer with Azure Data Factory
Job Description
**Job Title:** Java Kafka Developer – Azure Data Factory

**Role Summary:** Design and develop Java-based Kafka microservices, build and maintain ETL pipelines on Azure, and orchestrate data workloads using Azure Data Factory, Databricks, and open-source tools. Collaborate with data engineering, DBAs, and DevOps to deliver high-performance, scalable data solutions in a cloud-native environment.

**Expectations:**
- Deliver production-ready Java Kafka services and data pipelines within defined timelines.
- Ensure code quality, performance, and maintainability using automated testing and CI/CD.
- Integrate data from SQL, NoSQL, and streaming sources into Azure data platforms.
- Maintain documentation, participate in design reviews, and provide technical guidance.

**Key Responsibilities:**
- Build and maintain Kafka producers/consumers and microservices in Java.
- Design, develop, and schedule ETL workflows in Azure Data Factory and Databricks.
- Use SQL and Python (pandas, PySpark, ibis) for data transformation and validation.
- Orchestrate data pipelines with Airflow, NiFi, or other tooling.
- Develop and manage NoSQL schemas (Cassandra, MongoDB) and integrate them into pipelines.
- Containerize services using Docker and deploy to Kubernetes clusters on Azure.
- Configure and maintain CI/CD pipelines using GitHub Actions / Azure DevOps.
- Monitor pipeline performance, troubleshoot failures, and implement optimizations.
- Work with cross-functional teams to define data requirements and data quality standards.

**Required Skills:**
- Strong Java programming and Kafka expertise (topics, partitions, consumer groups).
- Experience with Azure PaaS services: Data Factory, Databricks, Data Lake Storage, Azure SQL.
- Proficiency in SQL and Python for ETL scripting (pandas, PySpark, ibis).
- Knowledge of NoSQL databases: Cassandra, MongoDB.
- Familiarity with ETL orchestration tools (Airflow, NiFi, Griffin, Hamilton).
- Containerization (Docker) and orchestration (Kubernetes, AKS).
- Source control with Git, CI/CD best practices, and automated testing.
- Ability to troubleshoot performance and scalability issues in cloud pipelines.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Professional certifications are a plus: Azure Data Engineer Associate, Microsoft Certified: Azure Developer Associate, Confluent Certified Developer for Apache Kafka, or a relevant Spark/Databricks certification.
Toronto, Canada
Hybrid
05-02-2026
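The Kafka skill named above — topics, partitions, consumer groups — rests on one mechanism: a keyed record is routed to a partition by hashing its key, so records with the same key stay on the same partition and keep their order. A hedged sketch of that routing (Kafka's actual default partitioner hashes the serialized key with murmur2; a plain byte-array hash stands in here to stay dependency-free):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of key-based partition assignment. Same key -> same hash ->
// same partition, which is what gives Kafka per-key ordering guarantees.
// Arrays.hashCode is a stand-in for Kafka's murmur2 hash.
public class KeyedPartitioning {
    static int partitionFor(String key, int numPartitions) {
        int hash = Arrays.hashCode(key.getBytes(StandardCharsets.UTF_8));
        return (hash & Integer.MAX_VALUE) % numPartitions; // mask sign bit, then wrap
    }

    public static void main(String[] args) {
        int p1 = partitionFor("order-42", 12);
        int p2 = partitionFor("order-42", 12);
        System.out.println(p1 == p2);           // true: same key, same partition
        System.out.println(p1 >= 0 && p1 < 12); // true: always within the topic
    }
}
```

A practical consequence for the pipelines described: changing a topic's partition count remaps keys, so downstream consumers can no longer assume historical per-key locality.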
Company Name
VySystems Singapore
Job Title
Dev (C/C++ & Linux) + Kafka
Job Description
**Job Title:** Software Developer – C/C++ & Linux, Kafka

**Role Summary:** Design, develop, and maintain high-performance messaging and streaming applications using IBM MQ, Apache Kafka, and C/C++ on Linux. Collaborate with cross-functional teams to optimize legacy systems, troubleshoot incidents, and apply rigorous object-oriented and functional programming practices.

**Expectations:**
- Deliver clean, maintainable code that meets performance and reliability standards.
- Participate actively in incident response, root cause analysis, and post-mortem reviews.
- Continually refactor and improve codebases using SOLID principles, design patterns, and functional paradigms.
- Engage in peer code reviews, automated testing, and CI/CD pipelines.

**Key Responsibilities:**
- Develop and maintain distributed messaging solutions with IBM MQ and Kafka.
- Write production-grade C/C++ and Java/Scala modules, integrating them with vendor products.
- Analyze production incidents, identify legacy system bottlenecks, and implement improvement plans.
- Apply object-oriented design, SOLID principles, and functional programming techniques (immutability, higher-order functions, lambdas).
- Create and maintain technical documentation for new features and system changes.
- Collaborate with DevOps to manage Linux/Unix servers, scripting, and deployment automation.
- Support optional fraud/financial analytics workflows and Elasticsearch integration when required.

**Required Skills:**
- Advanced proficiency in C/C++ (systems or embedded development).
- Solid experience in Java and/or Scala application development.
- Deep knowledge of IBM MQ and Apache Kafka architectures and APIs.
- Expertise in the Linux/Unix command line, scripting, and server administration.
- Strong incident analysis, debugging, and production problem-solving skills.
- In-depth understanding of SOLID, design patterns, and functional programming concepts.
- Familiarity with Git, CI/CD pipelines, containerization, and automated testing frameworks.

**Required Education & Certifications:**
- Bachelor's (or higher) degree in Computer Science, Software Engineering, or a related technical field.
- Optional certifications: Apache Kafka Engineer, Red Hat Certified Engineer (RHCE), Microsoft Certified: Azure Developer Associate, or equivalent.

**Nice-to-have:**
- Experience in fraud detection or financial analytics applications.
- Knowledge of Elasticsearch and query optimization.
Toronto, Canada
On site
09-02-2026
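The functional techniques this last role names (immutability, higher-order functions, lambdas) can be shown in a few lines. As an illustrative sketch in Java, also listed among the role's languages: a message-transform pipeline built by composing functions, leaving the input untouched; the class and stage names are hypothetical.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Higher-order functions and immutability in miniature: compose() takes a
// list of string transforms and returns a single transform; nothing is
// mutated, each stage wraps the previous one.
public class MessagePipeline {
    static UnaryOperator<String> compose(List<UnaryOperator<String>> stages) {
        UnaryOperator<String> pipeline = UnaryOperator.identity();
        for (UnaryOperator<String> stage : stages) {
            UnaryOperator<String> prev = pipeline;          // capture current chain
            pipeline = s -> stage.apply(prev.apply(s));     // extend it, immutably
        }
        return pipeline;
    }

    public static void main(String[] args) {
        List<UnaryOperator<String>> stages = List.of(
                s -> s.trim(),          // normalize whitespace
                s -> s.toUpperCase());  // canonicalize payload
        System.out.println(compose(stages).apply("  ack  ")); // prints ACK
    }
}
```

The same composition style applies to per-message transforms in a Kafka consumer loop, where keeping each stage pure makes incident analysis and replay far simpler.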