Role overview
We are a diverse team that takes pride in understanding the perspectives of others. We fully embrace working remotely and we are eager to act, improve and accelerate progress inside and outside of our organization.
To drive revolutionary changes in society and make crypto useful, we delight our customers with world-class products, deep care, and intentional empathy.
The Head of Data Engineering is responsible for leading the development and management of Bitso's data infrastructure, enabling seamless and governed data access and utilization across business, data science, and regulatory domains. This role collaborates with product squads to design and optimize data models, oversees ETL processes, and enhances the Data Lakehouse with advanced capabilities such as data streaming, governance layers, and monitoring systems. A critical focus is placed on upholding data governance and quality standards. The Head of Data Engineering is also expected to partner with the Platform team to integrate DevOps principles and ensure scalable, reliable systems.
What we're looking for
- 4+ years of professional experience working with analytics, ETLs, and data systems as an individual contributor.
- 3+ years of experience in engineering management at tech companies.
- Expertise in defining and implementing data architectures, including ETL/ELT pipelines, data lakes, data warehouses, and real-time data processing systems.
- Expertise with cloud platforms (AWS preferred), data engineering tools (Databricks, Spark, Kafka), and SQL/NoSQL databases.
- Expertise in translating business requirements into technical solutions and data architecture, with the ability to influence sound data modeling across product squads.
- Expertise with orchestration tools (e.g., AWS Step Functions, Databricks Workflows, or Dagster).
- Proven experience building data migration services or implementing change data capture (CDC) processes to enable efficient, near-real-time data synchronization. Skilled in designing data streaming pipelines that ingest production data into a data lakehouse architecture for analytical and operational use cases.
- Experience with CI/CD tools (GitHub Actions).
- Experience with customer data platforms (CDPs) and behavioral data (e.g., Segment, Amplitude, Avo).
- Experience with both infrastructure-as-code technologies (e.g., Terraform) and serverless approaches for data engineering tasks.
Nice to have
- Experience with machine learning operations (MLOps), specifically model serving, monitoring systems, and feature stores.
- Proven success building data governance programs.