This is a 1-year contract position. The Data Engineer will serve as a technical expert in designing and developing AI data pipelines to manage both large unstructured and structured datasets, with a particular focus on GenAI RAG / Agent solutions.
In your new role you will:
- Work closely with data scientists and domain experts to design and develop AI data pipelines using an agile development process.
- Develop pipelines for ingesting and processing large unstructured and structured datasets from a variety of sources, ensuring efficient and effective data processing.
- Develop BIA solutions using the defined framework for data modelling; data profiling; and data extraction, transformation & loading (ETL).
- Design and provide data / information in the form of reports, dashboards, scorecards, and data storytelling using visualization tools such as Business Objects & Tableau.
- Work with cloud technologies such as AWS to design and implement scalable data architectures.
- Support the operation of the data pipelines, including troubleshooting and bug fixing, as well as implementing change requests to ensure that the data pipelines continue to meet user requirements.
You are best equipped for this task if you have:
- Master's or Bachelor's Degree in Computer Science / Mathematics / Statistics or equivalent.
- Minimum of 3 years of relevant work experience in data engineering, including in-depth technical knowledge of databases, BI tools, SQL, OLAP, ETL, and RAG / Agentic data pipelines.
- Proficient in RDBMS: Oracle / PL SQL.
- Extensive hands-on experience in conceptualising, designing, and implementing data pipelines.
- Proficiency in handling unstructured data formats (PPT, PDF, Docx) and databases (RDBMS, NoSQL such as Elasticsearch, MongoDB, Neo4j, CEPH), and familiarity with big data platforms (HDFS, Spark, Impala).
- Experience in working with AWS technologies, focusing on building scalable data pipelines.
- Front-end reporting & dashboard and data exploration tools: Tableau.
- Strong background in software engineering & development cycles (CI / CD) with proficiency in scripting languages, particularly Python.
- Good understanding of and experience with the Kubernetes / OpenShift platform.
Other Skills / Attributes:
- Good understanding of data management, data governance, and data security practices.
- Highly motivated, structured, and methodical with a high degree of self-initiative.
- Team player with good cross-cultural skills to work in an international team.
- Customer- and result-oriented.
This is a 12-month contract under a 3rd-party payroll partner, with benefits according to the partner company.
Benefits
- Wide range of training offers & planning of career development
- International assignments
- Different career paths: Project Management, Technical Ladder, Management & Individual Contributor
- Staggered working hours for normal shift employees
- Home office options (certain conditions apply)
- Part-time work possible (applicable for normal shift employees)
- On-site day-care center
- Medical coverage
- On-site social counselling and works doctor
- Health promotion programs
- On-site canteen
- Private insurance offers
- Paid sick leave according to law, personal accident & work injury insurance, long-term illness leave
- Retirement benefits, re-employment opportunities, employment assistance payment
- Performance bonus
- Provision of long-haul transport for shift employees and shuttle services for office staff to defray transport costs