Data Engineer — Data Delivery Team

Finom

About the Role

We're building our Data Delivery function from scratch: a dedicated partnership between data engineering and business stakeholders. You'll be the core engineer focused on translating business needs into robust data infrastructure at our international fintech company.

For us, Data Delivery means end-to-end pipeline ownership, real-time analytics for SME financial decisions, and seamless data integration. This is mission-critical infrastructure, where data quality issues directly impact customer experience.

You'll become the bridge between raw business requirements and production-ready data solutions, build this technical function from the ground up, and solve non-trivial engineering challenges.


What You'll Do

  • Build production data infrastructure — design ETL pipelines from business requirements, implement monitoring and alerting, and maintain pipeline health
  • Own data solutions end-to-end — from the initial stakeholder conversation to production deployment: you build it, you run it, you maintain it
  • Work directly with product managers and analysts — translate chaotic data requests into scalable technical architectures, and push back when requirements don't make sense
  • Lead technical initiatives — research new tools (Databricks, Kafka, dbt), plan pilots, and make architectural decisions that impact the entire data stack
  • Create robust data pipelines — build ETL processes that handle millions of transactions, implement data quality checks, and optimize query performance
  • Collaborate with Data Science and ML teams — build feature stores, create endpoints for model serving, and solve complex data engineering challenges

Who We're Looking For

  • 3+ years in data engineering — preferably in fintech or high-volume transactional systems, with an understanding of production data challenges
  • Advanced SQL and Python skills — complex queries, data-processing automation, daily work with terabyte-scale datasets
  • Proactive problem-solving experience — the ability to walk into ambiguous requirements, structure a technical solution, and deliver concrete results without detailed specs
  • Technical leadership mindset — ready to work with challenging stakeholders, persuade through architectural expertise, and propose non-obvious solutions
  • Independence — this is a position for someone who can lead technical projects autonomously from idea to production
  • ETL/orchestration tools — Airflow and dbt, with experience building production data pipelines
  • Stream processing experience — Apache Kafka and real-time analytics (a plus)

Tech Stack

  • SQL — Databricks and BigQuery for data processing
  • Orchestration — Apache Airflow, with dbt for transformations
  • Streaming — Apache Kafka for real-time data flows
  • Languages — SQL (advanced) and Python for automation
  • Cloud — AWS/GCP infrastructure
  • Data sources — financial transactions, external APIs, real-time streams

Why It's Interesting

  • New technical direction — build the data engineering function from scratch; your architectural decisions will shape the company's data infrastructure
  • Complex engineering challenges — not standard ETL jobs, but real-time financial data processing and solving scalability bottlenecks in critical systems
  • Direct business impact — your pipelines enable data-driven decisions for European SMEs, and your architecture choices affect millions in transactions
  • International team — work with colleagues across Europe, modern engineering practices, high technical standards
  • Technical autonomy — minimal micromanagement, maximum ownership of technical solutions and architecture decisions
  • Growth potential — a clear path to Data Delivery team leadership as the function scales

About the Team

  • Direct reporting — to the Head of the Data Analytics team
  • Key partners — product managers, business analysts, and the Data Science team (fair warning: they'll challenge your technical choices; even experienced engineers appreciate the push)
  • Technical collaboration — work with platform engineers, analysts, and ML specialists to implement your data solutions
  • International environment — distributed team, English-speaking communication, modern remote-first culture

Languages

  • English — full working proficiency

Interview Process

  • Recruiter screening — experience overview + basic technical questions
  • Technical deep-dive — architecture cases + hands-on problem solving
  • Team meeting — meet your future technical partners