Experience: more than 6 years
Job Mode: Full-time
Work Mode: On-site
Encora is a global Software and Digital Engineering company that helps businesses overcome the Software Engineering talent shortage and provides next-gen services such as Predictive Analysis, Artificial Intelligence & Machine Learning, IoT, Cloud, and Test Automation. We have 16 global offices and 25 innovation labs.
Our Software Engineering experts work with leading-edge technology companies to improve their speed to impact.
Responsibilities
Design, develop, and manage efficient, adaptable, and scalable Kafka-based data pipelines.
Develop an expert-level understanding of data migration and CDC as they relate to Kafka, using Kafka Connect and Debezium.
Understand and apply event-driven architecture patterns and Kafka best practices, and enable development teams to do the same.
Assist developers with choosing correct patterns, modeling events, and ensuring data integrity.
Continuous learning to be a Confluent/Kafka subject matter expert.
Work with Kafka and Confluent APIs (e.g., metadata, metrics, admin) to provide proactive insights and automation.
Prepare documentation, including data mapping and a data dictionary.
Design monitoring solutions and baseline statistics reporting to support the implementation.
Provide Level 2 and Level 3 BAU support and optimize the existing Kafka setup.
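To give candidates a concrete sense of the Kafka Connect and Debezium work described above, here is a minimal sketch of a Debezium source connector configuration for change data capture. All hostnames, credentials, database names, and table names below are illustrative assumptions, not details of the actual role:

```json
{
  "name": "inventory-cdc-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db.example.internal",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "********",
    "database.dbname": "inventory",
    "topic.prefix": "inventory",
    "table.include.list": "public.orders,public.customers",
    "snapshot.mode": "initial"
  }
}
```

A configuration like this would typically be submitted to the Kafka Connect REST API, after which Debezium takes an initial snapshot and then streams row-level changes from the listed tables into Kafka topics.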
Requirements:
- Bachelor's degree in Computer Science, a related field, or equivalent work experience
- 6+ years of data engineering experience, including on-premise Kafka
- Strong knowledge of Kafka, databases, and other big data technologies
- Strong knowledge of SQL (tuning) and database fundamentals
- Good understanding of distributed systems
- Good communication and interpersonal skills
- Must have experience in:
  - Data replication / data streaming
  - Kafka Debezium (good to have)
  - Change data capture (CDC)
  - Performance tuning
  - Database architecture
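The change data capture and data replication experience listed above centers on Debezium-style change events, which carry before/after row images plus an operation code. As a hedged illustration (the envelope here is simplified, and the table and field names are invented for the example), applying such events to a replica looks roughly like this:

```python
# Apply Debezium-style change events (simplified envelope) to an
# in-memory replica keyed by primary key. Debezium op codes:
# "c" = create, "u" = update, "d" = delete, "r" = snapshot read.

def apply_event(replica: dict, event: dict) -> None:
    op = event["op"]
    if op in ("c", "u", "r"):
        row = event["after"]       # new row image
        replica[row["id"]] = row
    elif op == "d":
        row = event["before"]      # last row image before the delete
        replica.pop(row["id"], None)

replica = {}
events = [
    {"op": "c", "before": None, "after": {"id": 1, "qty": 5}},
    {"op": "u", "before": {"id": 1, "qty": 5}, "after": {"id": 1, "qty": 7}},
    {"op": "d", "before": {"id": 1, "qty": 7}, "after": None},
]
for e in events:
    apply_event(replica, e)

print(replica)  # → {} : create, update, then delete cancel out
```

In a real pipeline these events would be consumed from Kafka topics produced by a Debezium connector, but the replay logic above captures the core of row-level replication.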