Roles & Responsibilities
Responsibilities
Build & Automate ML Pipelines: Design, implement, and maintain CI/CD pipelines for machine learning models, covering automated data ingestion, model training, testing, versioning, and deployment (a minimal pipeline-step sketch follows this list).
Operationalize Models: Collaborate closely with data scientists to containerize, optimize, and deploy their models to production, focusing on reproducibility, scalability, and performance.
Infrastructure Management: Build and manage the underlying AWS cloud infrastructure that powers our MLOps platform, leveraging Infrastructure-as-Code (IaC) tools to ensure consistency and cost optimization.
Monitoring & Observability: Implement comprehensive monitoring, alerting, and logging solutions to track model performance, data integrity, and pipeline health in real time. Proactively address issues such as model and data drift (a minimal drift-check sketch also follows this list).
Tooling & Frameworks: Develop and maintain reusable tools and frameworks that accelerate the ML development process and empower data science teams.
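To give a concrete feel for the pipeline automation described above, here is a minimal, illustrative sketch of the kind of training-and-registration step such a pipeline might run. It assumes an MLflow tracking server; the tracking URI and the model name `churn-classifier` are hypothetical placeholders, not a statement of this team's actual stack.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical tracking server; a real pipeline would read this from config.
mlflow.set_tracking_uri("http://mlflow.internal:5000")
mlflow.set_experiment("churn-classifier")

# Stand-in for the automated data-ingestion step.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

    # Log a held-out metric so CI can gate promotion on model quality.
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("test_accuracy", acc)

    # Registering the model gives every pipeline run a new version,
    # which is what makes automated promotion and rollback possible.
    mlflow.sklearn.log_model(
        model, artifact_path="model", registered_model_name="churn-classifier"
    )
```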
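And for the drift monitoring mentioned above, a two-sample Kolmogorov–Smirnov test is one common way to flag data drift between training data and live traffic. This is only a sketch under simplifying assumptions: the alerting threshold is hypothetical, and production systems typically tune it per feature.

```python
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(train_col: np.ndarray, live_col: np.ndarray,
                 alpha: float = 0.01) -> bool:
    """Flag drift in one numeric feature via a two-sample KS test.

    alpha is a hypothetical alerting threshold; real systems tune it
    per feature and correct for testing many features at once.
    """
    statistic, p_value = ks_2samp(train_col, live_col)
    return p_value < alpha  # small p-value => distributions likely differ

# Toy usage: live data drawn from a shifted distribution should alert.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=5_000)
live = rng.normal(0.5, 1.0, size=1_000)  # simulated drifted traffic
print(detect_drift(train, live))  # True
```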
Required Qualifications
Experience: 5+ years of overall experience, including 2+ years in MLOps, machine learning engineering, or a related DevOps role focused on ML workflows.
Cloud Expertise: Extensive hands-on experience designing and implementing MLOps solutions on AWS, with proficiency in core services such as SageMaker, S3, ECS, EKS, Lambda, SQS, SNS, and IAM.
Coding & Automation: Strong coding proficiency in Python and extensive experience with automation tools, including Terraform for IaC and GitHub Actions for CI/CD.
MLOps & DevOps: A solid understanding of MLOps and DevOps principles, with hands-on experience in MLOps frameworks such as SageMaker Pipelines, Model Registry, Weights & Biases, MLflow, or Kubeflow, and orchestration tools such as Airflow or Argo Workflows (a minimal DAG sketch follows this list).
Containerization: Expertise in developing and deploying containerized applications with Docker and orchestrating them on ECS and EKS.
Model Lifecycle: Experience with model testing, validation, and performance monitoring. A good understanding of ML frameworks such as PyTorch or TensorFlow is required to collaborate effectively with data scientists.
Communication: Excellent communication and documentation skills, with a proven ability to collaborate with cross-functional teams (data scientists, data engineers, and architects).
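For the orchestration tools named above, a minimal Airflow DAG gives a feel for how retraining steps chain together. This is a sketch assuming Airflow 2.x (2.4+ for the `schedule` argument); the DAG name, schedule, and task bodies are placeholders for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real tasks would call out to the
# ingestion, training, and deployment systems described above.
def ingest_data():
    print("pulling fresh training data")

def train_model():
    print("launching a training job")

def deploy_model():
    print("promoting the new model version")

with DAG(
    dag_id="weekly_model_retraining",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@weekly",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_data", python_callable=ingest_data)
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    deploy = PythonOperator(task_id="deploy_model", python_callable=deploy_model)

    # Linear dependency chain: ingest -> train -> deploy.
    ingest >> train >> deploy
```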