Roles & Responsibilities
Description :
POSITION OVERVIEW : Associate Data Engineer
POSITION GENERAL DUTIES AND TASKS :
Key Skills : ETL / ELT, Databricks (PySpark, SQL, Delta Lake), Snowflake, Redshift, BigQuery, Synapse, AWS, Azure, GCP
Role and Responsibilities :
Design, develop, and maintain data pipelines on Databricks using PySpark, SQL, and Delta Lake.
Build scalable ETL / ELT processes for ingestion, transformation, and delivery across structured and unstructured datasets.
Collaborate with data scientists and analysts to enable machine learning and AI workflows on Databricks.
Optimize data lakehouse performance using Delta Lake, caching, partitioning, and indexing strategies.
Integrate Databricks with enterprise systems, cloud data platforms (AWS, Azure, GCP), and BI tools.
Implement data governance, quality, and security standards within the Databricks environment.
Monitor, troubleshoot, and improve Databricks jobs, clusters, and workflows.
Support CI / CD automation, DevOps practices, and Infrastructure-as-Code (IaC) for Databricks deployments.
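To make the pipeline duties above concrete, here is a minimal sketch of the extract-transform-load flow the role centres on. It uses only the Python standard library so it runs anywhere; on the job this logic would live in PySpark on Databricks, writing partitioned Delta tables. All data, field names, and quality rules below are invented for illustration.

```python
import csv
import io

# Hypothetical raw feed; in production this would arrive from cloud storage.
RAW_CSV = """order_id,amount,country
1,10.50,SG
2,,SG
3,7.25,MY
"""

def extract(raw: str) -> list[dict]:
    # Ingest: parse the raw feed into records.
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    # Cleanse and type-cast; a simple data-quality rule drops rows
    # with a missing amount (order 2 above).
    out = []
    for r in rows:
        if not r["amount"]:
            continue
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "country": r["country"]})
    return out

def load(rows: list[dict]) -> dict:
    # Deliver: aggregate per partition key (country), standing in for
    # writing a Delta table partitioned by that column.
    totals: dict = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

print(load(transform(extract(RAW_CSV))))  # → {'SG': 10.5, 'MY': 7.25}
```

The three-stage split mirrors how Databricks jobs are usually organised, with each stage independently testable and monitorable.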
Requirements / Qualifications :
Bachelor’s degree in Computer Science, Data Engineering, or related field.
3–5 years of experience in data engineering, ETL / ELT development, or big data platforms.
Hands-on expertise with Databricks, PySpark, and Delta Lake.
Strong proficiency in SQL and working with large datasets in cloud data warehouses (Snowflake, Redshift, BigQuery, Synapse).
Experience with cloud platforms (AWS, Azure, GCP) and their native data services.
Knowledge of data governance, data quality, and security principles.
Familiarity with CI / CD, Git, and Infrastructure-as-Code (Terraform, ARM, CloudFormation).
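As a hedged illustration of the SQL proficiency asked for above, the snippet below runs a typical warehouse-style aggregation. SQLite (via Python's built-in sqlite3 module) stands in for Snowflake, Redshift, BigQuery, or Synapse so the example is self-contained; the table and column names are invented.

```python
import sqlite3

# In-memory database standing in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('APAC', 100.0), ('APAC', 50.0), ('EMEA', 75.0);
""")

# Typical analytical query: totals per region, largest first.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""").fetchall()

print(rows)  # → [('APAC', 150.0), ('EMEA', 75.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to the listed warehouses, though each adds its own dialect features on top of standard SQL.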
Skills :
Machine Learning
Git
PySpark
Azure
Big Data
Pipelines
ARM
Hadoop
ETL
Data Quality
Data Governance
Data Engineering
SQL
GCP
Business Process
Analysis Services
Data Engineer • D01 Cecil, Marina, People’s Park, Raffles Place, SG