Senior Platform Engineer, Data Streaming

  • On-site, Remote, Hybrid
    • Wrocław, Dolnośląskie, Poland
    • Rzeszów, Podkarpackie, Poland
    • Gdańsk, Pomorskie, Poland
    • Remote work, Mazowieckie, Poland
  • Data

Job description

Hello, let’s meet!

Who We Are

While Xebia is a global tech company, in Poland our roots come from two teams – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart?

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

You will be:

  • building and scaling streaming infrastructure using technologies such as Kafka, Apache Spark, Apache Flink, and Snowflake,

  • ensuring the reliability and security of the infrastructure using Kubernetes (k8s) and Infrastructure as Code (IaC),

  • contributing to the architecture and defining best practices for our Data Platform,

  • collaborating with data producers and consumers across the company to build seamless, efficient data pipelines.

Job requirements

Your profile:

  • 5+ years of hands-on experience in Software/Data/Infrastructure Engineering,

  • proficiency in a programming language such as Python, Java, or Go,

  • strong experience with Terraform, 

  • experience with Kafka and/or data processing engines like Apache Flink or Spark, 

  • knowledge of Grafana, Prometheus, or Honeycomb,

  • very good command of English (min. B2).

Nice to have:

  • experience with distributed databases or data warehouses, 

  • knowledge of ETL/data services like Airflow, dbt, Snowflake, or Databricks.

Working from the EU and a permit to work in the EU are required.

Candidates must have an active VAT status in the EU VIES registry: https://ec.europa.eu/taxation_customs/vies/

Recruitment Process:

CV review – HR call – Technical Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision

or