Roles & Responsibilities
Job Description
We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and platforms. This role supports digital transformation initiatives and involves close collaboration with infrastructure and AI engineers to provide clients with robust data infrastructure for insights and innovation.
Key Responsibilities
- Design, implement, test, deploy, and maintain secure, scalable data engineering solutions and pipelines
- Integrate new data sources into data warehouses and build reports and visualizations
- Develop insights and recommendations, delivering advice and solutions to client problems
- Ensure best-in-class security measures within data platforms
- Automate repetitive data management tasks through scalable, replicable code (a minimal illustration follows this list)
- Contribute to fostering a data-driven culture
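As a purely illustrative aside on the "scalable, replicable code" responsibility above (not part of the role description itself), the sketch below shows a minimal Python job that loads a hypothetical daily CSV extract into a local SQLite table with a basic data-quality check. The file, table, and column names are invented, and a real pipeline for this role would target platforms such as Spark or BigQuery rather than SQLite.

```python
import csv
import sqlite3
from pathlib import Path


def load_daily_extract(csv_path: Path, db_path: Path) -> int:
    """Load one day's CSV extract into a local SQLite table, skipping incomplete rows."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        "order_id TEXT PRIMARY KEY, amount REAL, region TEXT)"
    )
    loaded = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Basic data-quality check: skip rows missing a key or an amount.
            if not row.get("order_id") or not row.get("amount"):
                continue
            conn.execute(
                "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)",
                (row["order_id"], float(row["amount"]), row.get("region", "").strip()),
            )
            loaded += 1
    conn.commit()
    conn.close()
    return loaded


if __name__ == "__main__":
    # Hypothetical file names; INSERT OR REPLACE makes reruns of the job idempotent.
    print(load_daily_extract(Path("orders_2024-01-01.csv"), Path("warehouse.db")))
```

Because rows are keyed on the primary key with INSERT OR REPLACE, the job can be rerun safely, which is the kind of replicable behaviour the responsibility describes.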
Requirements
- Bachelor’s degree in Information Technology, Computer Science, or a related field
- 1–3 years of experience in data engineering or related fields
- Proficiency in Python or Java for data manipulation and automation
- Experience building and maintaining ETL pipelines on big data platforms
- Strong knowledge of relational databases (SQL, MySQL), Hadoop, Spark, and column-oriented databases (e.g., BigQuery, Cassandra)
- Understanding of data lifecycle concepts and data transformation methods
- Familiarity with Google Cloud Platform (GCP) services for data engineering (a brief illustration follows this list)
- Experience in building Data Lake and Data Warehouse solutions
- Ability to work independently and lead small teams
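For context on the GCP and BigQuery items above, here is a minimal, illustrative sketch of running a parameterized query, assuming the google-cloud-bigquery client library is installed and application-default credentials are configured; the project, dataset, table, and column names are hypothetical and not taken from the posting.

```python
from google.cloud import bigquery

# Assumes application-default credentials for a GCP project are available.
client = bigquery.Client()

# Hypothetical table: aggregate daily event counts from a given start date.
query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `my-project.analytics.events`
    WHERE event_date >= @start_date
    GROUP BY event_date
    ORDER BY event_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# Run the query and print one row per day.
for row in client.query(query, job_config=job_config).result():
    print(row["event_date"], row["event_count"])
```

Query parameters keep SQL and values separate, which is a common practice when such queries are embedded in automated pipelines.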
Skills
- Digital Transformation
- Big Data
- Ability To Work Independently
- Pipelines
- Hadoop
- Data Transformation
- Data Management
- Google Cloud Platform
- MySQL
- ETL
- Information Technology
- Cassandra
- Data Engineering
- Python
- Java
- Databases