IQuest Solutions Corporation

Data Scientist

On site

Raleigh, United States

Mid level

Freelance

20-02-2026


Skills

Communication, Python, Bash, SQL, MySQL, PostgreSQL, Incident Response, GitHub, GitLab, CI/CD, Docker, Monitoring, Test, Training, Machine Learning, PyTorch, Scikit-Learn, TensorFlow, Regression, Programming, Databases, Azure, AWS, Pandas, GCP, Snowflake, Data Science, Databricks, PySpark, GitHub Actions

Job Specifications

Technology – Data Scientist

Must Have: Python & PySpark experience

Must Have: AWS, GCP, or other machine learning certifications

Must Have: XGBoost, Time Series, PyTorch, or TensorFlow experience

Employment Type: W2

Key Responsibilities

Build ML Models: Design and implement predictive and prescriptive models for regression, classification, and optimization problems. Apply advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
Train and Tune Models: Develop and tune machine learning models using Python, PySpark, TensorFlow, and PyTorch.
Collaboration & Communication: Work closely with stakeholders to understand business challenges and translate them into data science solutions. Collaborate with cross-functional teams to ensure successful integration of models into business processes.
Monitoring & Visualization: Rapidly prototype and test hypotheses to validate model approaches. Build automated workflows for model monitoring and performance evaluation. Create dashboards using tools like Databricks and Palantir to visualize key model metrics such as model drift and Shapley values.
Productionize ML: Build repeatable paths from experimentation to deployment (batch, streaming, and low-latency endpoints), including feature engineering, training, and evaluation.
Own ML Platform: Stand up and operate core platform components—model registry, feature store, experiment tracking, artifact stores, and standardized CI/CD for ML.
Pipeline Engineering: Author robust data/ML pipelines (orchestrated with Step Functions / Airflow / Argo) that train, validate, and release models on schedules or events.
Observability & Quality: Implement end-to-end monitoring, data validation, model/drift checks, and alerting tied to SLAs/SLOs.
Governance & Risk: Enforce model/version lineage, reproducibility, approvals, rollback plans, auditability, and cost controls aligned to enterprise policies.
Partner & Mentor: Collaborate with on-shore/off-shore teams; coach data scientists on packaging, testing, and performance; contribute to standards and reviews.
Hands-on Delivery: Prototype new patterns; troubleshoot production issues across data, model, and infrastructure layers.
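
To give a flavor of the model-building and tuning work described above, here is a minimal illustrative sketch of training and cross-validating a gradient-boosted regression model with scikit-learn (used here as a stand-in for XGBoost/LightGBM; the synthetic data and hyperparameter grid are illustrative assumptions, not part of the role's actual stack):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic regression data standing in for real business features
X, y = make_regression(n_samples=1000, n_features=10, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Small, purely illustrative hyperparameter grid with 3-fold cross-validation
grid = GridSearchCV(
    GradientBoostingRegressor(random_state=42),
    param_grid={"n_estimators": [100, 200], "max_depth": [3, 5]},
    cv=3,
    scoring="neg_mean_absolute_error",
)
grid.fit(X_train, y_train)

# Evaluate the best model on the held-out split
mae = mean_absolute_error(y_test, grid.predict(X_test))
print(f"best params: {grid.best_params_}, test MAE: {mae:.3f}")
```

The same train/tune/evaluate loop carries over directly to XGBoost or LightGBM estimators, which expose a scikit-learn-compatible interface.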

Required Qualifications

Education: Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.
Programming: 5+ years’ experience with Python (pandas, PySpark, scikit-learn; familiarity with PyTorch/TensorFlow helpful), Bash, and Docker.
ML Experimentation: Proven ability to design and implement predictive and prescriptive models for regression, classification, and optimization problems, applying advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
ML Tooling: 5+ years’ experience with SageMaker (training, processing, pipelines, model registry, endpoints) or equivalents (Kubeflow, MLflow/Feast, Vertex, Databricks ML).
Pipelines & Orchestration: 5+ years’ experience with Databricks Asset Bundles (DABs), Airflow, or Step Functions; event-driven designs with EventBridge/SQS/Kinesis.
Cloud Foundations: 3+ years’ experience with AWS/Azure/GCP on various services like ECR/ECS, Lambda, API Gateway, S3, Glue/Athena/EMR, RDS/Aurora (PostgreSQL/MySQL), DynamoDB, CloudWatch, IAM, VPC, WAF.
Snowflake Foundations: Warehouses, databases, schemas, stages, Snowflake SQL, RBAC, UDFs, Snowpark.
CI/CD: 3+ years hands-on experience with CodeBuild/CodePipeline or GitHub Actions/GitLab; blue/green, canary, and shadow deployments for models and services.
Feature Pipelines: Proven experience with batch/stream pipelines, schema management, partitioning, performance tuning; parquet/iceberg best practices.
Testing & Monitoring: Unit/integration tests for data and models, contract tests for features, reproducible training; data drift/performance monitoring.
Operational Mindset: Incident response for model services, SLOs, dashboards, runbooks; strong debugging across data, model, and infra layers.
Soft Skills: Clear communication, collaborative mindset, and a bias to automate & document.
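
As one concrete example of the drift monitoring called out above, a minimal sketch of the population stability index (PSI), a common data-drift check; the thresholds and simulated data here are illustrative assumptions only:

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare two samples' binned distributions.

    A PSI above roughly 0.2 is often read as significant drift,
    though the threshold is a convention, not a hard rule.
    """
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip to avoid division by zero and log(0) in empty bins
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, 5000)        # training-time feature distribution
shifted = rng.normal(0.5, 1, 5000)       # simulated production drift

print(population_stability_index(baseline, baseline))  # near zero
print(population_stability_index(baseline, shifted))   # noticeably larger
```

In a production workflow a check like this would run per feature on each scoring batch, with alerts wired to the SLO dashboards mentioned in the responsibilities.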

Additional Qualifications

Experience in retail/manufacturing is preferred.

About the Company

At iQuest, we are focused on bringing together the best people and the latest technologies to create and deliver the best value for our customers. iQuest is backed by immersive experience and thrives on disruptive innovation capabilities, delivering value to diverse sectors across the digital landscape.