
Senior Platform Engineer with Hadoop
- On-site, Remote, Hybrid
- Wrocław, Dolnośląskie, Poland
- Rzeszów, Podkarpackie, Poland
- Gdańsk, Pomorskie, Poland
- Remote work, Mazowieckie, Poland
- Data
Job description
Hello, let’s meet!
Who We Are
While Xebia is a global tech company, in Poland our roots lie in two teams – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.
What We Do
We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.
We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!
Beyond Projects
What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets, covering both tech and soft skills. It’s not just a job. It’s a place to grow.
What sets us apart?
Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.
You will be:
maintaining and troubleshooting platform components,
automating platform configuration,
designing and implementing new functionality,
performing performance and reliability tuning,
supporting other teams using the solution,
taking an active part in discussions and refinements within the operations scrum team in the Wholesale Banking Data Management domain,
contributing to building highly scalable, fault-tolerant and efficient ingestion solutions based on technologies such as NiFi, Spark, and CDP,
driving the roadmap of the client’s Google Cloud migration and adoption of object storage solutions;
Job requirements
several years of experience in system administration within Linux environments,
experience with distributed storage and processing of Big Data,
practical knowledge of Hadoop,
hands-on experience with Ansible,
good understanding of DevOps principles,
experience with:
Cloudera (CDP) stack,
Azure DevOps,
OpenShift (customer-specific flavors),
Google Cloud,
familiarity with container-based platforms and tooling (Docker, Kubernetes, OpenShift),
knowledge of Python,
fluency in written and spoken English, enabling smooth collaboration in international teams;
Working from the EU and a valid permit to work in the EU are required.
Candidates must have an active VAT status in the EU VIES registry: https://ec.europa.eu/taxation_customs/vies/
Recruitment Process:
CV review – HR call – Technical Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision