Key Responsibilities
- Design, develop, and maintain scalable and robust data architecture.
- Create and manage databases, data processing systems, and data integration solutions.
- Develop and implement efficient ETL processes for data ingestion and transformation.
- Ensure data quality and integrity throughout the ETL pipeline.
- Design and implement data models for optimal storage and retrieval.
- Build and maintain data warehouses for analysis and reporting purposes.
- Monitor and optimize database performance.
- Troubleshoot and resolve data-related issues in a timely manner.
- Work with stakeholders to understand data requirements and translate them into effective database structures.
- Implement and enforce data security measures and compliance standards.
- Collaborate with cross-functional teams, including AI Engineers, Data Scientists, Data Analysts, Software Engineers, and DevOps.
Requirements
- Bachelor's degree or higher qualification in Computer Science, Software Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Able to speak English and Mandarin.
- Hands-on experience with data warehousing, ETL processes, and database management.
- Proficiency in SQL and one or more programming languages (e.g., Python, Java, JavaScript).
- Knowledge of data modelling and database design principles.
- Strong proficiency with ETL tools such as Apache Airflow and Dagster.
- Familiarity with Big Data technologies (e.g., Hadoop, Spark).
- Understanding of data governance principles and practices.
- Exposure to machine learning concepts and integration of machine learning models into data pipelines.
- Familiarity with containerization tools like Docker and orchestration tools like Kubernetes.
- Knowledge of streaming data technologies (e.g., Apache Kafka).
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Job Type: Full-time
Pay: From RM3,…00 per month
Benefits:
- Free parking
- Meal allowance
- Opportunities for promotion
Supplemental Pay:
- Performance bonus