Job Description

Summary

As an On-Chain Protocols Data Engineer, you’ll work with our data team to expand our understanding of the on-chain protocol ecosystem at large. You will be responsible for services that continuously monitor the world’s most popular blockchains for new smart contract activity. When that activity occurs, our pipelines index, extract, standardize, and organize the information retrieved on-chain across dozens of categories, including DEXes, bridges, staking and lending protocols, and many more. Together, these pipelines form the basis of the world’s most comprehensive blockchain intelligence layer.

In this role, you’ll:

  1. Work as part of our On-Chain Protocols team to develop and maintain data pipelines responsible for indexing, categorizing, standardizing, and organizing all smart contract events that occur on-chain.
  2. Collaborate to define the roadmap for new chain and new protocol onboarding efforts into these systems.
  3. Design the schema and structure of datasets for use by our customers.
  4. Lead and contribute to efforts to improve the scalability, reliability, and efficiency of these data indexing systems (new protocols, new chains, etc.).
  5. Explore new opportunities for integrating emerging technologies (artificial intelligence, machine learning, large language models) into our on-chain protocol indexing pipelines.
  6. Work with our Data Engineering team to optimize the core database layer (schemas, indexes, etc.) to improve dataset usability.

We’re looking for candidates who have:

  1. Strong development experience in Python
  2. Excellent SQL skills
  3. Knowledge of OLTP and OLAP database technologies
  4. A deep understanding of on-chain protocols and smart contracts, including the ability to answer questions like:
     • What is an RPC call?
     • What is a topic0?
     • What is a log/trace?
  5. A passion for the web3/crypto & DeFi ecosystem
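To make the RPC/topic0/log concepts above concrete, here is a minimal sketch of an `eth_getLogs` JSON-RPC request. It assumes a standard Ethereum JSON-RPC node and uses the well-known ERC-20 `Transfer` event topic; an RPC call is simply a JSON-RPC request sent to a node, and topic0 is the keccak256 hash of the event signature, stored as the first topic of every log a contract emits:

```python
import json

# keccak256("Transfer(address,address,uint256)") -- the topic0 that
# identifies every ERC-20 Transfer event log on-chain.
TRANSFER_TOPIC0 = (
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
)

def build_get_logs_request(from_block: str, to_block: str, topic0: str) -> str:
    """Build an eth_getLogs JSON-RPC payload asking a node for all logs
    whose first topic matches `topic0` in the given block range."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_getLogs",
        "params": [{
            "fromBlock": from_block,
            "toBlock": to_block,
            "topics": [topic0],
        }],
    })

# The payload would be POSTed to a node's RPC endpoint; the response
# contains the matching logs, each carrying its topics and ABI-encoded data.
payload = build_get_logs_request("0x112A880", "0x112A884", TRANSFER_TOPIC0)
```

Indexing pipelines like ours issue requests of this shape at scale, then decode each returned log against the contract's ABI before standardizing it into category-specific datasets.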

Nice-to-have experience:

  1. Experience deploying workloads in Kubernetes
  2. Experience with developing ETL pipelines
  3. Experience with Databricks
  4. Experience with PostgreSQL
  5. Experience reading & writing Solidity
  6. Experience integrating large language models into software systems
  7. Experience with TheGraph (or writing subgraphs in general)

Technologies we use: 

  1. Python
  2. SQL (PostgreSQL, Databricks)
  3. Kubernetes
  4. Databricks
  5. Etherscan/blockchain explorers
  6. Redis

Skills
  • Database Management
  • Development
  • Python
  • Software Engineering
  • SQL
© 2024 cryptojobs.com. All rights reserved.