Data Engineer

We are now looking for a Data Engineer on behalf of our client.

Scope of work:

Join us in shaping the future of a data-driven IKEA!

We are on the journey to simplify the complexity of digital and to support Retail Concept and the IKEA retail system. A journey that needs passionate people who embrace change and development, dare to question, and want to make a difference. If that sounds like you, come and join us. Together we can do great things so IKEA can be an even better home furnishing retailer in the future. 

About you:

In this role as a Data Engineer within the My Learning digital product area, you will work closely with the IKEA concept and brand. You will need to understand how learning solutions support the organization, as the product reaches more than 100,000 co-workers across the IKEA franchise system every year.

In addition to being a successful data engineer, we expect:

  • Minimum 3 years of data engineering experience
  • Minimum 2 years of Databricks and Python experience
  • Minimum 2 years of experience working with cloud providers, preferably Azure
  • Experience with Git-based development workflows
  • Familiarity with APIs, preferably in Python
  • Familiarity with general back-end engineering work
  • Experience building scalable solutions using PySpark, Delta Lake, and Databricks Workflows.
  • Experience with CI/CD pipelines for data platforms, infrastructure as code, and monitoring.
  • Experience designing and developing data access layers, APIs, and data services with access control and authentication to enable secure data sharing across teams.

As a Data Engineer, your main responsibilities will include: 

  • Take accountability for engineering and solution architecture within the Learning Technologies and Data area, and align engineering work across teams within the domain.
  • Design, build, and maintain scalable data pipelines in Databricks using Python and SQL
  • Ensure data quality, lineage, and governance across datasets.
  • Optimize Spark workloads and SQL queries for performance and cost efficiency
  • Implement data observability, monitoring, and alerting pipelines; develop and maintain dashboards.

  • Start: 2026-06-01

    Duration: until 2026-12-31 (possibility of extension)

    Workload: 100%

    Location: Malmö (needs to be onsite at least 3 days/week)

    We will present candidates on an ongoing basis, so if interested please don't hesitate to apply!

  • Locations: Malmö
  • Technologies: Azure, Continuous Integration / Continuous Deployment, Databricks, Git, Python
  • Language: English