Roles & Responsibilities
Responsibilities:
Enhance the existing data pipeline using industry-standard tools and mechanisms.
Redesign and implement the data pipeline using AWS managed services.
Implement pipeline monitoring that alerts on any pipeline issues.
Investigate and resolve data pipeline issues.
Document the data pipeline solution and its operating procedures.
Implement unit and integration tests for data pipeline components.
Requirements:
At least 5 years of hands-on data engineering experience, designing and building ETL pipelines.
Experience in project execution and demonstrated technical expertise.
Proficient in Python, SQL, and shell scripting (Linux).
Experience building pipelines with Airflow or a similar orchestration tool.
Experience with AWS services such as S3, Glue, EMR, Lambda, and Step Functions.
Familiarity with AWS Batch, Athena, file formats such as Parquet, NoSQL concepts, and tools such as AWS OpenSearch.
Skills:
Machine Learning
Airflow
Big Data
Pipelines
Unit Testing
Hadoop
ETL
Data Engineering
EMR
SQL
Python
Integration Testing
Java
S3
Databases
Linux
Data Engineer • Islandwide, SG