Job Description
Summary
As a Software Engineer on the Data Solutions team, you’ll be responsible for building and maintaining the backend services that power our cloud and on-premise products. These services are critical for vending data to our customers via API, tracking usage and billing, monitoring our data landscape, generating alerts for our customers, transforming and piping hundreds of terabytes of data, and much more. You’ll help us raise the bar on code quality while working closely with our team of software engineers, data engineers, data scientists, and product managers to drive globally impactful results and build trust in our software. The work you lead and contribute to will give our customers the ability to derive powerful and unique insights that drive global investigatory efforts, trans-national threat actor monitoring, national security enhancements, and much more.
In this role, you’ll:
- Play a key role in delivering new capabilities to the Data Solutions platform.
- Bring new ideas and energy to the team that we’ll integrate into building the future of Data Solutions.
- Contribute to improving the scalability and performance of our systems.
- Solve complex engineering problems with peers and stakeholders across the organization.
- Build trust in our software by continuously improving the security and reliability posture of everything we deliver.
We’re looking for candidates who:
- Are Python experts
- Are comfortable building backend ETL pipelines and services in a Cloud environment (AWS, GCP)
- Understand how blockchains work, along with the core tenets of decentralized systems
- Have experience working with SQL databases
- Have expertise in APIs, data streaming systems, and event-driven workflows
- Have a high degree of ownership and appetite for responsibility
- Have experience working with distributed systems (microservice architectures, Kubernetes deployments, etc.)
- Take a high degree of pride in what you create and own
- Have experience developing systems that scale to support high volumes of traffic (hundreds to thousands of requests per second)
Nice-to-have experience:
- Deep blockchain knowledge (writing smart contracts in Solidity, etc.)
- Experience with PySpark/DLT
Technologies we use:
- Python
- Kubernetes
- Cloudflare
- Docker
- Databricks
- Terraform
- GCP & AWS
- Kafka
- Cloud Functions
Skills
- Database Management
- Development
- Python
- Software Engineering
- SQL