Job Description

Responsibilities

  1. Build a universal data platform that meets real-time and offline computing and storage requirements;
  2. Design data pipeline solutions for the relevant business scenarios and deliver them through multiple methods on top of various infrastructure;
  3. Enhance the data platform, improve the stability and flexibility of data assets, and optimize resource efficiency.

Requirements

  1. Bachelor's degree or above in computer science, big data, mathematics, or a related field, with 3-5 years or more of data development experience;
  2. Familiar with Hadoop, Spark, Flink, Airflow, and other popular data platform components; understand how they work and have experience optimizing them;
  3. Have a working understanding of distributed systems principles, computation, and storage;
  4. Strong communication and logical thinking skills, self-driven, committed to continuous learning and keeping your knowledge up to date.

Skills
  • Communication Skills
  • Development
  • Logical Thinking
  • Software Engineering