Job Summary
Role: Confluent Consulting Engineer
Start: ASAP
Duration: 12 Months
Location: Singapore
We are looking for an experienced Confluent Consulting Engineer to design, develop, and maintain real-time data streaming solutions using Apache Kafka and Confluent technologies. The ideal candidate will have a solid background in distributed systems, event-driven architectures, and cloud-native deployments. You will work closely with cross-functional teams to deliver scalable and high-performance streaming solutions.
Requirements
- Minimum 5 years of hands-on experience with Apache Kafka (open-source or distributions like Confluent, Cloudera, AWS MSK).
- Strong proficiency in Java, Python, or Scala.
- Deep understanding of event-driven architecture and streaming data patterns.
- Experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
Preferred / Desired Skills
- Experience with Confluent Kafka and its ecosystem (Kafka Streams, Kafka Connect, Schema Registry, KSQL, REST Proxy, Control Center).
- Hands-on experience with Confluent Cloud services and Apache Flink.
- Knowledge of Stream Governance, Data Lineage, Stream Catalog, RBAC, and related components.
- Confluent certifications (Developer, Administrator, or Flink Developer) are a plus.
- Experience with multi-cloud deployments, Confluent for Kubernetes, or data mesh architectures.
- Exposure to monitoring tools (Prometheus, Grafana, Splunk) and big data technologies (data lakes, data warehouses).
Roles and Responsibilities
- Design and implement real-time data pipelines and event-driven architectures using Apache Kafka or Confluent Platform.
- Develop and maintain Kafka producers, consumers, and streaming applications in Java, Python, or Scala.
- Integrate Kafka with various data sources and sinks using Kafka Connect and related connectors.
- Deploy and manage Kafka clusters on cloud platforms (AWS, GCP, Azure) or on-premise environments.
- Ensure scalability, reliability, and performance of streaming applications.
- Collaborate with DevOps teams to build and maintain CI/CD pipelines and containerized deployments using Docker and Kubernetes.
- Monitor and troubleshoot Kafka infrastructure using tools like Prometheus, Grafana, or Splunk.
- Provide technical guidance and best practices for event streaming and Confluent ecosystem adoption.
Please send your application highlighting:
- Your relevant experience
- Current / expected salary
- Availability information
- A latest MS Word resume
We regret that only short-listed applicants will be contacted.
GECO Asia values the data privacy rights of our customers, associates, partners, and prospective applicants. We have a privacy policy in place that governs our collection and use of personal data. In line with the PDPA in Singapore, we have updated our Privacy Policy and Terms of Use to better clarify our collection and use of your personal information. It can be found here: https://www.geco.asia/about/privacy-policy
Note: GECO Asia is an Information Technology Consulting Services provider. We provide specialist IT and Digital Transformation resources on a project (SOW) and/or permanent basis. We operate under a Comprehensive License issued by the Ministry of Manpower, Singapore.
[GECO Asia Pte Ltd, License No. 07C4453]
[2 Venture Drive, #10-18 Vision Exchange, Singapore 608526]