KEY ACCOUNTABILITIES
- Design, develop, document and implement end-to-end data pipelines and data integration processes, both batch and real-time. This includes data analysis, data profiling, data cleansing, data lineage, data mapping, data transformation, developing ETL / ELT jobs and workflows, and deployment of data solutions.
- Monitor, recommend, develop and implement ways to improve data quality, including reliability, efficiency and cleanliness, and to optimize and fine-tune ETL / ELT processes.
- Recommend, execute and deliver best practices in data management and data lifecycle processes, including modular development of ETL / ELT processes, coding and configuration standards, error handling and notification standards, auditing standards, and data archival standards.
- Prepare test data, and assist in creating and executing test plans, test cases and test scripts.
- Collaborate with the Data Architect, Data Modeler, IT team members, SMEs, vendors and internal business stakeholders to understand data needs, gather requirements and implement data solutions that deliver business goals.
- Provide BAU support for data issues and change requests; document all investigations, findings, recommendations and resolutions.
QUALIFICATIONS / EXPERIENCE
- Bachelor's degree in IT, Computer Science or a related Engineering field.
- At least 5 years of relevant experience in data engineering or a related field.
- In-depth understanding of data structures, algorithms, databases, and software engineering.
- Experience in data management and data warehousing technologies (data mart, data lake, lakehouse).
- Ability to write efficient and scalable code.
- Experience with end-to-end data processing pipelines, ETL/ELT, and data warehousing.
- Knowledge of distributed systems and experience working with distributed data processing technologies.
- 5+ years of professional experience implementing operational data stores, data warehouses, data marts and large-scale data architectures in Unix and/or Windows environments.
- 5+ years of hands-on ETL development experience, including transforming complex data structures from multiple sources.
- 5+ years of experience with big data technologies, including Azure and AWS, Azure Data Factory, Databricks, Synapse, Hadoop, Hive, Storm, Presto, and real-time data transformation and processing.
Job Type: Permanent
Pay: RM8,000.00 - RM10,000.00 per month
Schedule: Monday to Friday
Experience:
- Data Engineering: 5 years (Preferred)
- ETL Development: 5 years (Preferred)
- Big data technologies: 5 years (Preferred)