Roles & Responsibilities
To support the development and maintenance of enterprise data products within our organisation's data platform, we are looking for experienced data engineers to join our team. They will be responsible for:
Data Engineering and Platform Integration
- Design, develop, and maintain data pipelines and ETL processes using AWS services (Glue, Athena, S3, RDS)
- Work with data virtualisation tools like Denodo and develop VQL queries
- Ingest and process data from various internal and external data sources
- Perform data extraction, cleaning, transformation, and loading operations
- Implement automated data collection processes including API integrations when necessary
Data Architecture
- Design and implement data models (conceptual, logical, and physical) using tools like ER Studio
- Develop and maintain data warehouses, data lakes, and operational data stores
- Develop and maintain data blueprints
- Create data marts and analytical views to support business intelligence needs using Denodo and RDS
- Implement master data management practices and data governance standards
Technical Architecture and Integration
- Ensure seamless integration between various data systems and applications
- Implement data security and compliance requirements
- Design scalable solutions for data integration and consolidation
Development and Analytics
- Develop Python scripts in AWS Glue for data processing and automation
- Write efficient VQL/SQL queries and stored procedures
- Design and develop RESTful APIs using modern frameworks and best practices for data services
- Work with AWS SageMaker for machine learning model deployment and integration
- Manage and optimise database performance, including indexing, query tuning, and maintenance
- Work in an Agile environment and participate in sprint planning, daily stand-ups, and retrospectives
- Implement and maintain CI/CD pipelines for automated testing and deployment
- Participate in peer code reviews and pair programming sessions
Documentation and Best Practices
- Create and maintain technical documentation for data models and systems
- Follow industry-standard coding practices, version control, and change management procedures
Stakeholder Collaboration
- Partner with cross-functional teams on data engineering initiatives
- Gather requirements, conduct technical discussions, implement solutions, and perform testing
- Collaborate with Product Managers, Business Analysts, Data Analysts, Solution Architects, and UX Designers to build scalable, data-driven products
- Provide technical guidance and support for data-related queries
Qualifications and Experience:
- At least 3 years of experience in data engineering or a similar role
- Strong proficiency in Python, VQL, and SQL
- Experience with AWS services (Glue, Athena, S3, RDS, SageMaker)
- Knowledge of data virtualisation concepts and tools (preferably Denodo)
- Experience with BI tools (preferably Tableau, Power BI)
- Understanding of data modelling and database design principles
- Familiarity with data governance and master data management concepts
- Experience with version control systems (GitLab) and CI/CD pipelines
- Experience working in Agile environments with iterative development practices
- Strong problem-solving skills and attention to detail
- Excellent communication skills and ability to work in a team environment
- Knowledge of AI technologies (AWS Bedrock, Azure AI, LLMs) would be advantageous