BlockFi

Senior Data Engineer

BlockFi · São Paulo, SP, Brazil · $91k - $96k

Actively hiring · Posted over 3 years ago

Role overview

The Data Platforms organization is responsible for the end-to-end data needs of BlockFi products and services and is composed of the following teams: Data Engineering, Data Strategy, Master Data Management, and Machine Learning Engineering. The Senior Data Engineer will be part of the Enterprise Data Engineering organization, serving all of BlockFi's data needs. The Data Engineering team designs, builds, and supports the data platforms, products, pipelines, and governance frameworks that power analytics, business insights, data science, and machine learning. We aim to make data a competitive advantage for BlockFi by empowering our business partners with industry-leading insights and tools so that they can make fast, bold decisions with trusted data, create unsurpassed client experiences, and grow our market share. We enable automation at scale that reduces risk, improves speed, and eliminates manual processes.

  • Technical Breadth as well as Depth in Several Areas: 5+ years of experience as a data engineer, with extensive experience architecting data warehouses, data lakes, and data platforms for consumption by analytics and data science/ML, and in building data pipelines (batch ETL, micro-batches, and real-time streaming).
  • Technical Ownership: Experience owning data platforms end-to-end (from data generation/ingestion to curated/aggregated layers): designing, estimating, implementing, testing, maintaining, debugging, and supporting high-quality software in production. Experience building foundational, curated, and aggregated data layers that enable self-service business intelligence (easily consumable by non-technical users).
  • Communication: Excellent communication, presentation and interpersonal skills. 
  • Collaboration: Empathetic and does the legwork required for building consensus. Always seeks out feedback on technical designs, solutions, and code.
  • Initiative and focus on outcomes: Works independently and takes initiative while maintaining transparency and collaboration. Delivers high-quality solutions without assistance. Proactively identifies problems and comes to conversations with possible solutions.
  • Adaptive: Ability and motivation to quickly learn new languages, technologies, and tools. A pragmatic bias toward outcomes and toward technical decisions that solve real business problems.
  • Successful candidates will have:
    • Strong knowledge of cloud data platforms (at least one of AWS, GCP, Azure, or Snowflake)
    • Strong skills in Python, ETL transformations, data modeling, and feature engineering (a minimal illustrative sketch follows this list)
    • Experience with SQL adapters such as Ecto and managing SQL schema changes with code
    • Experience with AWS cloud services: S3, EC2, RDS, Aurora, Redshift (or other cloud services)
    • Experience with real-time stream-processing systems: Kinesis, EventBridge, Confluent, Kafka, or similar.
    • Advanced SQL knowledge, including query authoring and experience with relational databases, plus working familiarity with a variety of database technologies.
    • Experience building and optimizing data pipelines, architectures and data sets.
    • Strong business acumen, critical thinking, and technical and problem-solving skills.
    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
    • Strong analytical skills for working with unstructured datasets.
    • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
    • A successful history of manipulating, processing and extracting value from large disconnected datasets.
    • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
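To ground the skills listed above, here is a minimal sketch of a batch ETL job in Python, the posting's primary language. It is purely illustrative: the S3 path, warehouse URI, table name, and column names are hypothetical placeholders, not systems or schemas described by BlockFi.

```python
"""Minimal batch-ETL sketch (extract -> transform -> load).
All paths, connection strings, tables, and columns are hypothetical."""
import pandas as pd
from sqlalchemy import create_engine

RAW_PATH = "s3://example-raw-bucket/transactions/2022-01-01.parquet"   # hypothetical lake partition
WAREHOUSE_URI = "postgresql+psycopg2://user:pass@host:5432/analytics"  # hypothetical warehouse

def extract(path: str) -> pd.DataFrame:
    # Read one daily partition of raw events from the data lake
    # (assumes pyarrow/s3fs are installed for S3 parquet access).
    return pd.read_parquet(path)

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Curate the raw layer: deduplicate, enforce types, add a derived column.
    curated = raw.drop_duplicates(subset=["transaction_id"]).copy()
    curated["event_ts"] = pd.to_datetime(curated["event_ts"], utc=True)
    curated["amount_usd"] = curated["amount"] * curated["fx_rate_to_usd"]
    return curated[["transaction_id", "account_id", "event_ts", "amount_usd"]]

def load(curated: pd.DataFrame, table: str = "curated_transactions") -> None:
    # Append the curated batch to a warehouse table for BI / ML consumption.
    engine = create_engine(WAREHOUSE_URI)
    curated.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(RAW_PATH)))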
Your Perks:
We benefit from the great work our employees do each day. That is why we are committed to providing a variety of awesome benefits to help them live their best lives.
  • Competitive Compensation because we value your experience and expertise
  • Unlimited vacation / sick days because everyone deserves time for R&R
  • Flexible work environment because we are a geographically dispersed team and we believe in balance
  • A close-knit team of enthusiastic, collegial, and driven people to work alongside in a highly meritocratic environment, because teamwork makes the dream work

Tags & focus areas

Data Science · Dev · Senior · Python · R · AWS · GCP · Azure