TekValue IT Solutions

www.tekvalueit.com

7 Jobs

51 Employees

About the Company

TekValue IT Solutions is a fast-growing company offering IT solutions and services (product development, staffing, consulting, and training), overseas education consulting, a job portal, a resume database, immigration services, and taxation. We deliver these services worldwide, with a primary focus on the USA, Canada, the UK, and India. We also provide third-party, end-to-end US staffing services from India, covering education, marketing, support, and placement, and we offer recruitment services for contract, contract-to-hire, and full-time roles in the United States, Canada, and India.

Listed Jobs

Company Name
TekValue IT Solutions
Job Title
Observability Engineer with Platform Exp
Job Description
**Job Title:** Observability Engineer

**Role Summary:** Design, implement, and maintain observability solutions across distributed microservices. Build and evolve telemetry pipelines using OpenTelemetry, integrate with modern monitoring platforms, and customize instrumentation to provide actionable insights for operations and development teams.

**Expectations:**
- Deploy and manage observability tools (Splunk, Datadog, Dynatrace, New Relic, LogicMonitor).
- Craft end-to-end monitoring architecture for both infrastructure and applications.
- Deliver custom metrics and dashboards that support continuous delivery and incident response.
- Collaborate actively with development, DevOps, and security teams to embed observability into the software development lifecycle.

**Key Responsibilities:**
- Design and implement observability pipelines using OpenTelemetry SDKs and collectors.
- Configure and extend existing monitoring platforms to ingest application, infrastructure, and custom metrics.
- Develop and maintain custom metric exporters or agents using coding best practices.
- Integrate security and vulnerability tools with observability streams through APIs or code.
- Create and maintain dashboards, alerts, and reporting for production systems.
- Perform root-cause analysis during incidents and recommend metric enhancements.
- Keep the observability strategy current with the evolving microservice architecture and tooling ecosystem.

**Required Skills:**
- Proficient in OpenTelemetry (SDK, collector, exporter) and other tracing/metrics protocols.
- Experience with at least one of the following platforms: Splunk, Datadog, Dynatrace, New Relic, LogicMonitor.
- Strong programming skills in a modern language (Java, Go, Python, or Node.js).
- Knowledge of microservices architectures and cloud/cluster environments.
- Ability to design custom metrics, log parsing, and alerting logic.
- Familiarity with CI/CD pipelines and container orchestration (Kubernetes, Docker).
- Understanding of security observability and integration with vulnerability scanners.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- Certifications such as Certified Kubernetes Administrator (CKA), Splunk Certified Enterprise Power User, or Datadog Certified Observability Professional are a plus.
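To illustrate the "custom metric exporters" responsibility above, here is a minimal stdlib-only sketch of the pattern: counters keyed by name and labels, flushed as JSON lines. In a real deployment this role would use the OpenTelemetry SDK and push to a collector via OTLP; the class and series-key scheme here are hypothetical stand-ins.

```python
import json
import threading
import time
from collections import defaultdict

class CounterExporter:
    """Toy in-process metric exporter: accumulates labeled counters and
    flushes them as JSON lines (a stand-in for an OTLP push)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._counters = defaultdict(int)

    def add(self, name, value=1, **labels):
        # Labels are folded into the series key, as a real exporter would do.
        key = (name, tuple(sorted(labels.items())))
        with self._lock:
            self._counters[key] += value

    def flush(self):
        # Serialize a snapshot; a real exporter would POST this to a collector.
        with self._lock:
            snapshot = [
                {"name": name, "labels": dict(labels), "value": v,
                 "ts": time.time()}
                for (name, labels), v in self._counters.items()
            ]
        return [json.dumps(point) for point in snapshot]

exporter = CounterExporter()
exporter.add("http.requests", route="/login", status="200")
exporter.add("http.requests", route="/login", status="200")
exporter.add("http.requests", route="/login", status="500")
print(exporter.flush())  # two series: one with value 2, one with value 1
```

The lock makes `add` safe to call from request-handler threads, which is the usual context for this kind of in-process aggregation.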
Dallas, United States
On site
24-11-2025
Company Name
TekValue IT Solutions
Job Title
Senior Logic Apps Developer
Job Description
Job Title: Senior Logic Apps Developer

Role Summary: Design, build, and maintain Azure-based integration workflows, focusing on Logic Apps, Power Automate, and Azure Functions to automate business processes and connect cloud and on-prem systems.

Expectations: Deliver scalable, secure, and cost-efficient solutions that meet governance and compliance standards; collaborate closely with architects and analysts; and mentor junior team members.

Key Responsibilities:
- Architect and develop Azure Logic Apps, Power Automate flows, and Azure Functions.
- Integrate cloud and on-prem services (Dynamics 365, SharePoint, SQL Server, ServiceNow, SAP, Salesforce, custom APIs).
- Create and maintain API connections, custom connectors, and webhooks.
- Translate business requirements into efficient, reusable workflows.
- Troubleshoot existing integrations and optimize their performance, reliability, and cost.
- Implement error handling, logging, and monitoring using Application Insights and Azure Monitor.
- Apply DevOps practices: CI/CD pipelines with Azure DevOps or GitHub Actions, Git-based source control.
- Ensure security, governance, and compliance across all integrations.
- Mentor and provide technical leadership to junior developers.

Required Skills:
- 5+ years of enterprise integration experience (Azure Logic Apps, Power Automate, Azure Functions).
- Deep understanding of Azure services (Functions, API Management, Event Grid, Service Bus, Blob Storage, Key Vault).
- Proficiency with REST APIs, JSON, XML, HTTP integrations, OAuth2, and Managed Identities.
- Experience scripting in PowerShell and C#/.NET.
- Familiarity with Azure DevOps, Git, and CI/CD pipeline configuration.
- Strong analytical, problem-solving, and communication abilities.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Preferred certifications: Azure Developer Associate (AZ-204), Azure Integration Developer Associate (AZ-303/307).
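As a sketch of the kind of workflow this role builds, here is a minimal Logic Apps workflow definition: an HTTP request trigger that forwards its body to a downstream API and returns 200. The endpoint URI is a placeholder; real workflows would add error handling, managed-identity auth, and monitoring as described above.

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
      "When_a_request_is_received": {
        "type": "Request",
        "kind": "Http",
        "inputs": { "method": "POST" }
      }
    },
    "actions": {
      "Forward_to_api": {
        "type": "Http",
        "inputs": {
          "method": "POST",
          "uri": "https://example.invalid/orders",
          "body": "@triggerBody()"
        },
        "runAfter": {}
      },
      "Response": {
        "type": "Response",
        "inputs": { "statusCode": 200 },
        "runAfter": { "Forward_to_api": [ "Succeeded" ] }
      }
    },
    "outputs": {}
  }
}
```

The `runAfter` entries encode the action dependency graph, which is where sequencing, retries, and failure branches are expressed.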
United States
Remote
Senior
25-11-2025
Company Name
TekValue IT Solutions
Job Title
On-premise Data Engineer (Python, SQL, Databases)
Job Description
**Job Title:** On-premise Data Engineer (Python, SQL, Databases)

**Role Summary:** Design, develop, and maintain high-performance on-premise data pipelines and services. Leverage advanced Python and FastAPI microservices to implement business logic, integrate with a variety of relational and NoSQL databases, and support real-time, user-interactive applications. Ensure the scalability, reliability, and optimal performance of data solutions.

**Expectations:**
- 5+ years of professional data engineering experience.
- Proven ability to work autonomously and in cross-functional teams.
- Strong problem-solving skills for complex data integration and performance challenges.
- Commitment to code quality, testing, and documentation standards.

**Key Responsibilities:**
- Design and build scalable ETL/ELT processes using Python, SQL procedures, and NoSQL utilities.
- Develop and maintain FastAPI-based microservices for data access and business logic.
- Integrate relational (Oracle, SQL Server, PostgreSQL, DB2) and NoSQL (Elasticsearch, MongoDB) stores.
- Implement real-time data flows between UI layers and databases using appropriate protocols (REST/JSON, gRPC, WebSockets, etc.).
- Optimize query performance and storage architecture for large-scale workloads.
- Write unit, integration, and performance tests; employ CI/CD pipelines for deployment.
- Collaborate with senior developers, managers, and directors throughout the development lifecycle.

**Required Skills:**
- Advanced Python programming (including async and FastAPI).
- Expert SQL development and tuning across Oracle, SQL Server, PostgreSQL, and DB2.
- Proficiency with NoSQL databases: Elasticsearch, MongoDB.
- Experience building microservices and RESTful APIs.
- Familiarity with IDEs/tools such as PyCharm, VS Code, Git, Docker, and testing frameworks (pytest).
- Understanding of data serialization formats (JSON, Avro, Protobuf) and communication protocols.
- Strong debugging, performance profiling, and database monitoring abilities.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent practical experience).
- Relevant certifications (e.g., AWS Certified Data Analytics, Oracle Database Certified Professional) are a plus but not mandatory.
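To make the ETL/ELT responsibility above concrete, here is a small self-contained sketch using Python's built-in sqlite3: extract raw rows, transform them (drop NULLs, normalize dollars to cents), and load them in one transaction. The table and column names are illustrative; a production pipeline would target the Oracle/SQL Server/PostgreSQL stores listed in the posting.

```python
import sqlite3

def run_etl(conn):
    """Tiny extract-transform-load pass: copy raw order rows into a
    cleaned table, normalizing currency to cents and dropping NULLs."""
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean (id INTEGER, amount_cents INTEGER)"
    )
    # Extract raw rows, then transform: skip NULL amounts, dollars -> cents.
    rows = cur.execute("SELECT id, amount_usd FROM orders_raw").fetchall()
    clean = [(oid, round(amt * 100)) for oid, amt in rows if amt is not None]
    # Load: bulk insert, committed as a single transaction.
    cur.executemany("INSERT INTO orders_clean VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_raw (id INTEGER, amount_usd REAL)")
conn.executemany("INSERT INTO orders_raw VALUES (?, ?)",
                 [(1, 19.99), (2, None), (3, 5.00)])
print(run_etl(conn))  # 2 rows survive the NULL filter
```

Keeping the load inside one transaction means a mid-run failure leaves `orders_clean` untouched, which is the usual idempotency requirement for batch pipelines.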
Houston, United States
Hybrid
Mid level
15-12-2025
Company Name
TekValue IT Solutions
Job Title
Lead Backend Engineer (Java/Spark, ETL, AWS)
Job Description
**Job Title:** Lead Backend Engineer (Java/Spark, ETL, AWS)

**Role Summary:** Architect and lead the development of scalable, secure data ingestion and processing pipelines using Java, Apache Spark, and AWS serverless services. Drive technical standards, code quality, and performance across distributed ETL workloads without direct people management.

**Expectations:**
- Deliver high-performance, fault-tolerant data pipelines on time.
- Mentor peers on best practices and emerging technologies.
- Champion CI/CD automation, testing, and deployment pipelines.
- Ensure compliance with security and data governance standards.

**Key Responsibilities:**
1. Design, implement, and maintain large-scale ETL workflows in Apache Spark.
2. Build and optimize AWS serverless components (Step Functions, Glue, Lambda, S3) for data ingestion and transfer.
3. Tune batch processing and distributed computation for performance and cost efficiency.
4. Lead architectural discussions, define coding standards, and review technical designs.
5. Integrate automated testing, CI/CD pipelines, and monitoring into production deployments.
6. Collaborate with data science, data engineering, and DevOps teams to meet business requirements.
7. Continuously improve system reliability, security, and scalability.

**Required Skills:**
- 7+ years of backend development with strong proficiency in Java.
- 4+ years of experience building distributed ETL/data processing pipelines with Apache Spark.
- Deep hands-on knowledge of AWS serverless services: Step Functions, Glue, Lambda, S3.
- Proven expertise in designing secure, large-scale file ingestion and data transfer pipelines.
- Strong understanding of batch processing, distributed computation, and data performance tuning.
- Experience in CI/CD environments with automated testing and deployment.
- Demonstrated ability to lead technical direction, set standards, and maintain quality.
- Excellent communication, problem-solving, and collaboration skills.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Software Engineering, or a related field (preferred).
- AWS Certified Developer – Associate or an equivalent certification (preferred).
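The batch-processing and distributed-computation skills above follow the map/combine/merge shape that Spark's `reduceByKey` applies over partitions. As a language-neutral sketch (the actual role would use Spark's Java APIs), here is the same pattern in plain Python: per-batch local aggregation followed by a merge step.

```python
from collections import defaultdict
from itertools import islice

def batched(records, size):
    """Yield fixed-size batches, mimicking partitioned batch processing."""
    it = iter(records)
    while batch := list(islice(it, size)):
        yield batch

def aggregate(records, batch_size=2):
    """Sum values per key across batches: a local combine per batch,
    then a merge, the shape of reduceByKey over partitions."""
    partials = []
    for batch in batched(records, batch_size):
        local = defaultdict(int)          # per-partition combine
        for key, value in batch:
            local[key] += value
        partials.append(local)
    totals = defaultdict(int)             # merge step (the "shuffle")
    for local in partials:
        for key, value in local.items():
            totals[key] += value
    return dict(totals)

print(aggregate([("a", 1), ("b", 2), ("a", 3), ("b", 4)]))
# {'a': 4, 'b': 6}
```

Combining within each batch before merging is what keeps the cross-partition traffic small, which is the core of the performance tuning this role calls for.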
McLean, United States
On site
Senior
15-12-2025