Senior GCP Data Engineer

  • On-site, Remote, Hybrid
    • Wrocław, Dolnośląskie, Poland
    • Rzeszów, Podkarpackie, Poland
    • Gdańsk, Pomorskie, Poland
    +2 more
  • PLN 21,500 - PLN 33,000 per month
  • Data

Job description

Who We Are 

While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started. 

What We Do 

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland! 

Beyond Projects 

What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart?  

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself. 

About the role:

As a Data Engineer at Xebia, you will work closely with engineering, product, and data teams to deliver scalable and robust data solutions to our clients. Your key responsibilities will include designing, building, and maintaining data platforms and pipelines, as well as mentoring new engineers.


You will be:

  • developing and maintaining data pipelines to ensure seamless data flow from the Loyalty system to the data lake and data warehouse (a minimal sketch follows this list),

  • collaborating with data engineers to ensure data engineering best practices are integrated into the development process,

  • ensuring data integrity, consistency, and availability across all data systems,

  • integrating data from various sources, including transactional databases, third-party APIs, and external data sources, into the data lake,

  • implementing ETL processes to transform and load data into the data warehouse for analytics and reporting,

  • working closely with cross-functional teams including Engineering, Business Analytics, Data Science and Product Management to understand data requirements and deliver solutions,

  • optimizing data storage and retrieval to improve performance and scalability,

  • monitoring and troubleshooting data pipelines to ensure high reliability and efficiency,

  • implementing and enforcing data governance policies to ensure data security, privacy, and compliance,

  • developing documentation and standards for data processes and procedures.
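
For a concrete flavour of this work, here is a minimal, illustrative sketch – not the client's actual setup – of an Airflow DAG that loads a daily "Loyalty" export from a GCS data lake into BigQuery. All resource names are hypothetical placeholders, and Airflow 2.x with the Google provider package is assumed.

    # Illustrative only – all bucket/dataset/table names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="loyalty_to_warehouse",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        # Append the day's loyalty export from the data lake (GCS)
        # to a table in the warehouse (BigQuery).
        load_loyalty_events = GCSToBigQueryOperator(
            task_id="load_loyalty_events",
            bucket="example-loyalty-lake",              # hypothetical bucket
            source_objects=["loyalty/{{ ds }}/*.csv"],  # daily export folder
            destination_project_dataset_table="analytics.loyalty_events",
            source_format="CSV",
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )

In a real engagement, a load like this would typically be followed by transformation steps (for example, dbt models) and data-quality checks before the data reaches reporting tables.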

Job requirements

Your profile:

  • 5+ years in a data engineering role, with hands-on experience in building data processing pipelines,

  • experience in leading the design and implementation of data pipelines and data products,

  • proficiency with GCP services for large-scale data processing and optimization,

  • extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization,

  • knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing (see the sketch after these requirements),

  • strong Python proficiency, with expertise in SQL and modern data tools and frameworks (e.g., Spark, Databricks, Snowflake),

  • hands-on experience with ETL tools and processes,

  • practical experience with dbt for data transformation,

  • deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts,

  • excellent command of oral and written English,

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.

Work from within the European Union and a valid work permit are required.

Candidates must have an active VAT status in the EU VIES registry: https://ec.europa.eu/taxation_customs/vies/
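
As a rough sketch of the partitioning and performance-tuning side (again with hypothetical project, dataset, and path names), a load job with the google-cloud-bigquery Python client can day-partition and cluster a table so that queries scan only the data they need:

    # Illustrative only – resource names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        # Day-partitioning lets BigQuery prune whole days at query time,
        # which is what keeps terabyte-scale tables affordable to query.
        time_partitioning=bigquery.TimePartitioning(
            type_=bigquery.TimePartitioningType.DAY,
            field="event_date",
        ),
        # Clustering within each partition further reduces scanned bytes.
        clustering_fields=["customer_id"],
    )

    load_job = client.load_table_from_uri(
        "gs://example-lake/orders/*.parquet",  # hypothetical lake path
        "example-project.analytics.orders",    # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # block until the load finishes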

Nice to have:

  • experience with e-commerce systems and their data integration,

  • knowledge of data visualization tools (e.g., Tableau, Looker),

  • understanding of machine learning and data analytics,

  • certification in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.). 


Recruitment Process:

CV review – HR call – Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision

or