Job Description
Responsibilities
- Build a unified data platform that meets real-time and offline computing and storage requirements;
- Design data pipeline solutions based on demand scenarios and deliver them through multiple methods across varied infrastructure;
- Enhance the data platform, improve the stability and flexibility of data assets, and optimize resource efficiency.
Requirements
- Bachelor's degree or above in computer science, big data, mathematics, or a related field, with 3-5 years or more of data development experience;
- Familiar with Hadoop, Spark, Flink, Airflow, and other popular data platform components; understand how they work and have experience optimizing them;
- Solid understanding of distributed systems principles, including computation and storage;
- Strong communication and logical-thinking skills; self-driven, with a commitment to continuous learning;
Skills
- Communication Skills
- Development
- Logical Thinking
- Software Engineering