At Itility we believe in merging technology and data to drive our customers one step beyond. Itility digital consultants are experts in data, cloud, software, and IT infrastructure.
Our culture can be described as ‘no-nonsense, with passion’. Working at Itility is about working with people and staying close to our customers.
We work for large enterprises and innovative startups. Acting as the ‘digital twin’ of our customers, we work shoulder-to-shoulder to exceed business goals and push the boundaries of what they thought was possible.
Do you like to go above and beyond? Do you want to work with passion for what you do, in a team of people fueled by the same passion?
Then we would love to meet you!
We need your expertise
Do you have experience with writing code to ingest data? Do you like data wrangling, digging into data sources, and processing data into a readable and usable state? Do you love that feeling of accomplishment when data is flowing seamlessly into a data lake, day in, day out, hour after hour, based on code that you have carefully crafted? Then this Data Engineer job is just what you are looking for!
For multiple enterprise customers we create data connectors to make data flow from various sources, using state-of-the-art technologies and cloud providers like Azure and AWS. But we are technology-agnostic and do not shy away from any technical implementation. What they all have in common is that the data is part of the business value chain and will be used in a production environment, so data quality, continuity, and seamless flow are crucial – of course with monitoring for disruptions.
- Create data connectors and processing solutions, using Python or other programming languages.
- Define validation tests to run in the data pipeline.
- Define monitoring and alerting to ensure visibility when the data flow is interrupted or corrupted.
- When incidents occur, take the lead in finding the root cause quickly and resolving the incident with minimal impact on end users.
- Validate the solutions you build with end users, aiming for continuous improvement.
- Together with your team, build, deploy, maintain, and optimize the data flow solution.
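To give a flavor of this kind of work, here is a minimal Python sketch of an ingest step that validates incoming records before they land in a data lake and separates out corrupted data for follow-up. All names and rules here are illustrative assumptions, not tied to any specific Itility project:

```python
# Hypothetical set of fields every incoming record must carry.
REQUIRED_FIELDS = {"sensor_id", "timestamp", "value"}


def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    if "value" in record and not isinstance(record["value"], (int, float)):
        errors.append("value is not numeric")
    return errors


def ingest(records: list) -> tuple:
    """Split a batch into clean records and rejected (record, errors) pairs."""
    clean, rejected = [], []
    for record in records:
        errors = validate_record(record)
        if errors:
            # In production this branch would feed monitoring and alerting.
            rejected.append((record, errors))
        else:
            clean.append(record)
    return clean, rejected
```

In a real pipeline the rejected records would go to a monitoring dashboard or alert channel rather than a plain list, so that an interrupted or corrupted flow is visible immediately.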
You believe in
a Scrum/agile way of working and in software practices that enable a professional data flow. Further requirements:
- You have a bachelor’s or master’s degree.
- You have experience creating data ingestion scripts.
- Experience with a cloud provider like Azure or AWS is a plus.
- You have 2–3 years of relevant work experience.
- You have a good understanding of SQL and Python.
- You are a team player and you have good communication skills.
- Ideally, you have worked with data platforms and data lakes within an enterprise environment.
Screening is part of the hiring procedure.
This is what we offer
You will be given the opportunity to develop in the best way possible, under the personal guidance of fellow Itility data engineers and architects. Our own Itility Academy offers several training courses to help you develop your hard and soft skills, and our active community organizes tech hours to share knowledge and demo solutions and new tools.
If you do not yet have the required expertise for the job but do have a passion for data engineering, we offer a substantial trainee program to get you up to speed in a structured way.
In addition to a competitive salary, you will receive extras such as: