Job Description

Summary

As a member of our data engineering team, you'll set standards for data engineering solutions that have organizational impact. You'll provide architectural solutions that are efficient, robust, extensible, and competitive within the business and industry context. You'll collaborate with senior data engineers and analysts, guiding them toward their career goals at Gemini. Communicating your insights to leaders across the organization is paramount to success.

Responsibilities:

  • Focuses on technical leadership, defining patterns and operational guidelines for their vertical(s)
  • Independently scopes, designs, and delivers solutions for large, complex challenges
  • Provides oversight, coaching and guidance through code and design reviews
  • Designs for scale and reliability with the future in mind; performs critical R&D
  • Successfully plans and delivers complex, long-term projects spanning multiple teams or systems, including ones with external dependencies
  • Identifies problems that need to be solved and advocates for their prioritization
  • Owns one or more large, mission-critical systems at Gemini, or multiple complex team-level projects, overseeing all aspects from design through implementation and operation
  • Collaborates with coworkers across the org to document and design how systems work and interact
  • Leads and coordinates large initiatives across domains, even outside their core expertise
  • Designs, architects and implements best-in-class Data Warehousing and reporting solutions
  • Builds real-time data and reporting solutions
  • Develops new systems and tools to enable the teams to consume and understand data more intuitively

Minimum Qualifications:

  • 10+ years of experience in data engineering with data warehouse technologies
  • 10+ years of experience in custom ETL design, implementation, and maintenance
  • 10+ years of experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Advanced skills with Python and SQL are a must
  • Experience and expertise with Databricks, Spark, Hadoop, etc.
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
  • Strong computer science fundamentals including data structures and algorithms
  • Strong software engineering skills in any server-side language, preferably Python
  • Experience working collaboratively across different teams and departments
  • Strong technical and business communication skills

Preferred Qualifications:

  • Experience with Kafka, HDFS, Hive, cloud computing, machine learning, LLMs, NLP, and web development is a plus
  • NoSQL experience a plus
  • Deep knowledge of Apache Airflow
  • Expert experience implementing complex, enterprise-wide data transformation and processing solutions
  • Experience with continuous integration and deployment (CI/CD)
  • Knowledge of and experience with financial markets, banking, or exchanges
  • Web development skills with HTML, CSS, or JavaScript

It Pays to Work Here

The compensation and benefits package for this role includes:

  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401(k) with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $172,000 and $215,000 in the States of New York, California, and Washington. This range does not include our discretionary bonus or equity package. When determining a candidate's compensation, we consider a number of factors, including skill set, experience, job scope, and current market data.

Skills
  • Communications Skills
  • Data Structures
  • Database Management
  • Python
  • Software Engineering
  • SQL
  • Team Collaboration