Senior Data Engineer

OPENSOURCE PTE. LTD., Singapore
Job description

Roles & Responsibilities

1. Role Overview

The Senior Data Engineer is responsible for designing, building, and maintaining large-scale, secure, and high-performance data pipelines supporting critical Financial Services workloads.

The role focuses on data modernization, regulatory data aggregation, and AI / ML enablement across domains such as Core Banking, Payments, Risk, Treasury, and Regulatory Reporting.

2. Key Responsibilities

  • Design, implement, and optimize ETL / ELT data pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse (see the batch pipeline sketch after this list).
  • Build and operationalize real-time streaming pipelines leveraging Kafka / Confluent / Azure Event Hubs for risk and liquidity data.
  • Integrate and transform data across Core Banking, Trade, Payments, Treasury, CRM, and Compliance systems.
  • Implement data quality, validation, and lineage controls using tools such as Great Expectations / Deequ / dbt tests.
  • Develop and maintain data models and schemas (3NF, Dimensional, Data Vault 2.0).
  • Collaborate with Security and Governance teams to implement data security, masking, encryption, and tokenization in compliance with MAS TRM / PDPA / PCI-DSS.
  • Participate in data platform modernization projects (Teradata / DB2 → Snowflake / Databricks / Synapse).
  • Collaborate with Data Scientists and AI Engineers to deploy ML feature stores and model-serving pipelines.
  • Support regulatory reporting (MAS 610 / 649) and Basel III / IV data flows.
  • Maintain CI / CD pipelines for data infrastructure using Azure DevOps / Terraform / GitHub Actions.
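
A minimal sketch of the kind of batch pipeline described in the first bullet above, assuming a PySpark / Databricks-style environment; the lake paths, column names, and quality rule are illustrative assumptions rather than details taken from this role.

    # Minimal sketch, assuming a PySpark environment; paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("payments_daily_load").getOrCreate()

    # Extract: raw payment transactions landed in the data lake (hypothetical path).
    raw = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/payments/")

    # Transform: cast amounts, derive the settlement date, and de-duplicate.
    cleaned = (
        raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .withColumn("settlement_date", F.to_date("settlement_ts"))
           .dropDuplicates(["transaction_id"])
    )

    # Data quality gate: fail the run if mandatory fields are null.
    bad_rows = cleaned.filter(
        F.col("transaction_id").isNull() | F.col("amount").isNull()
    ).count()
    if bad_rows > 0:
        raise ValueError(f"{bad_rows} rows failed mandatory-field checks")

    # Load: write to a curated zone partitioned by settlement date (hypothetical target).
    (cleaned.write.mode("overwrite")
            .partitionBy("settlement_date")
            .parquet("abfss://curated@datalake.dfs.core.windows.net/payments/"))

In practice the inline null check would be replaced by the Great Expectations, Deequ, or dbt tests named above.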

3. Required Technical Skills

  • Languages: Python, PySpark, SQL, Scala
  • Data Platforms: Azure Data Lake, Synapse, Databricks, Snowflake
  • Orchestration: Apache Airflow, Azure Data Factory, dbt (see the DAG sketch below)
  • Streaming: Kafka, Confluent, Event Hubs
  • Governance: Apache Atlas, Azure Purview, Collibra
  • Security: Encryption, RBAC, Tokenization, Audit Logging
  • CI / CD & IaC: Terraform, Azure DevOps, GitHub Actions
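
To show how the orchestration tools above fit together, here is a minimal Apache Airflow DAG sketch chaining an ingestion step and a dbt run, assuming a recent Airflow 2.x release; the DAG id, schedule, and shell commands are hypothetical.

    # Minimal sketch, assuming Airflow 2.4+; DAG id, schedule, and commands are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_risk_data_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",  # daily at 02:00
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="ingest_core_banking_extracts",
            bash_command="python ingest_core_banking.py",  # hypothetical ingestion script
        )
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --select risk",  # hypothetical dbt model selection
        )
        ingest >> transform  # ingest first, then run transformations

In a production setup the BashOperator tasks would typically give way to provider operators (for example the Azure Data Factory or Databricks operators), but the dependency wiring is the same.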

4. Experience and Qualifications

  • 6 – 10 years of experience in data engineering, with at least 3 years in BFSI (banking, insurance, or capital markets).
  • Proven experience building real-time and batch data pipelines on Azure or AWS.
  • Exposure to regulatory data models (MAS 610, Basel III, IFRS 9 / 17, BCBS 239).
  • Familiarity with DevOps and MLOps integration.
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • Certifications preferred: Microsoft Azure Data Engineer Associate, Databricks Data Engineer Professional, Snowflake SnowPro Core.

5. Key Attributes

  • Strong analytical and problem-solving mindset.
  • Ability to work across multi-disciplinary and geographically distributed teams.
  • Excellent written and verbal communication skills.
  • High accountability and ownership for quality and delivery.