Roles & Responsibilities
Key Responsibilities
- Collaborate across workstreams to support data requirements, including the development of reports and dashboards.
- Perform data analysis and profiling to identify patterns, discrepancies, and improvement opportunities in line with Data Quality and Data Management processes.
- Design and develop end-to-end (E2E) data pipelines: ingestion, transformation, processing, and surfacing of data for large-scale applications.
- Automate data pipelines using Azure and AWS data platforms, Databricks, and Data Factory.
- Translate business requirements into technical specifications for project design and delivery.
- Perform data ingestion in batch and in real time via file transfer, APIs, and data streaming (Kafka, Spark Streaming).
- Develop ETL processes using Apache Spark to meet data transformation and standardization needs.
- Build data exports, APIs, and visualizations using Power BI, Tableau, or other visualization tools.
- Ensure alignment with best practices in data governance, security, and architecture.
Qualifications & Experience
- Bachelor’s degree in Computer Science, Computer Engineering, IT, or a related field.
- Minimum 4 years’ experience in Data Engineering.
- Strong skills in:
  - Programming & Data Engineering: Python, SQL, Spark
  - Cloud & Data Platforms: Azure, AWS, Databricks, Data Factory
  - Architecture: Data/Solution Architecture, APIs
- Data Visualization Skills: Power BI (preferred) or equivalent tools, DAX programming, data modeling, storytelling, and wireframe design.
- Business Analyst Skills: requirement analysis, data profiling, basic data model design, SQL programming, business knowledge.
- Knowledge of Data Lakes, Data Warehousing, Big Data tools, Apache Spark, RDBMS, NoSQL, and Knowledge Graphs.
- Strong team player with excellent analytical and problem-solving skills.