Job Description
Education:
Bachelor's degree or above in Computer Engineering or Information Technology
Responsibilities:
* Apply sound data warehousing concepts to design, develop, and maintain scalable data pipelines for real-time and batch processing
* Implement robust data streaming architectures to support business analytics and reporting requirements
* Optimize data workflows and ETL processes for improved performance and reliability
* Ensure data quality, integrity, and security throughout all stages of data processing
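The pipeline work described above can be sketched, in miniature, as a single batch ETL step in Python. All names and data here are hypothetical, purely to illustrate the extract/transform/load shape and the kind of data-quality check the role calls for; a real pipeline would read from sources such as MSSQL or PostgreSQL and write to a warehouse.

```python
def extract():
    # Stand-in for querying a source system (e.g. a SQL database or API).
    return [
        {"order_id": 1, "amount": "19.99", "region": "EU"},
        {"order_id": 2, "amount": "5.00", "region": "US"},
        {"order_id": 3, "amount": None, "region": "EU"},  # bad record
    ]

def transform(rows):
    # Data-quality step: drop records with missing amounts and cast
    # amounts to float so downstream aggregation is type-safe.
    return [
        {**row, "amount": float(row["amount"])}
        for row in rows
        if row["amount"] is not None
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse table.
    sink.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

After this step, `warehouse` holds only the two valid records, with amounts as floats; the malformed row is filtered out rather than propagated downstream.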
Requirements:
* Proficiency in Apache Kafka for data streaming
* Proficiency in Python for scripting, data manipulation, and automation
* Experience building ETL processes over data from various sources (MSSQL, PostgreSQL, APIs, Google Sheets, etc.)
* Experience with Apache Druid and Docker is a plus
* Proven experience (3+ years) working as a Data Engineer or similar role
* Excellent problem-solving skills
* Effective communication skills and ability to collaborate with other teams to meet analytical and reporting needs
Submit your resume to Parsian E-Commerce