AWS Data Engineer

💻 Ework Group - founded in 2000, listed on Nasdaq Stockholm, and with around 13,000 independent professionals on assignment - is a total talent solutions provider that partners with clients in both the private and public sectors, as well as with professionals, to create sustainable talent supply chains.

With a focus on IT/OT, R&D, Engineering and Business Development, we deliver sustainable value through a holistic and independent approach to total talent management.

By providing comprehensive talent solutions, combined with vast industry experience and excellence in execution, we form successful collaborations. We bridge clients, partners, and professionals throughout the talent supply chain, for the benefit of individuals, organizations, and society.

🔹For our Client we are looking for an AWS Data Engineer🔹

Location: 100% remote from Poland (onboarding possible in Warsaw)

B2B contract: Long-term cooperation

Responsibilities:

  • Design and implement solutions for processing large-scale and unstructured datasets (Data Mesh, Data Lake, or Streaming Architectures).
  • Develop, optimize, and test modern Data Warehouse (DWH)/Big Data solutions based on the AWS cloud platform within CI/CD environments.
  • Improve data processing efficiency and support migrations from on-premises systems to public cloud platforms.
  • Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
  • Ensure data quality, consistency, and performance across AWS services and environments.
  • Participate in code reviews and contribute to technical improvements.

Profile:

  • Proven experience in Big Data or Cloud projects involving processing and visualization of large and unstructured datasets, across various phases of the SDLC.
  • Hands-on experience with the AWS cloud, including Storage, Compute (and Serverless), Networking, and DevOps services.
  • Solid understanding of AWS services, ideally supported by relevant certifications.
  • Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark.
  • Basic proficiency in at least one of the following programming languages: Python, Scala, Java, or Bash.
  • Good command of English; German language skills would be an advantage.

Nice to Have:

  • Experience with orchestration tools (e.g., Airflow, Prefect).
  • Exposure to CI/CD pipelines and DevOps practices.
  • Knowledge of streaming technologies (e.g., Kafka, Spark Streaming).
  • Experience working with Snowflake or Databricks in a production or development environment.
  • Relevant certifications in AWS, data engineering, or big data technologies.

✔️ We offer:

B2B agreement

Transparent working conditions with both Ework and the client

Current support during our cooperation

Possibility to work in an international environment

Collaborative environment in Swedish organizational culture

Private medical care

Life insurance

Multisport

Teambuilding events

Do you know someone who would fit this position? Recommend a candidate by sending their CV to: polecenia@eworkgroup.com

Our Whistleblowing Policy, which provides guidelines for reporting misconduct, can be found on the Ework website: https://www.eworkgroup.com/about-us/our-respon

  • Locations: Remote
  • Technologies: Amazon Web Services (AWS), Continuous Integration / Continuous Deployment, Data Warehouse, Databricks, DevOps, Docker, Snowflake
  • Language: English