Capco is seeking a (Senior) Data Engineer to join its team in Zurich, Switzerland. The ideal candidate will be responsible for designing and optimizing data pipelines, managing data analytics solutions and data lakes, ensuring data quality, security, and compliance, and innovating with emerging technologies. The role sits within a global technology and management consultancy dedicated to the financial services industry.

Role involves:

  • Designing and optimizing data pipelines using Apache Spark, Kafka, and Airflow.
  • Managing data analytics solutions and data lakes.
  • Implementing data validation, encryption, and governance practices.
  • Exploring and integrating new big data tools and frameworks.

Requirements:

  • Proven expertise in big data technologies (Hadoop, Apache Spark, Kafka).
  • Advanced knowledge of ETL processes using Python/Scala scripts.
  • Strong programming and data modeling skills in Python, Java, or Scala.
  • Experience with data infrastructure and security using IaC tools like Terraform.
  • Fluency in German and English.

Capco offers:

  • Highly competitive benefits.
  • Flexible working hours.
  • Technical and soft skills training.
  • Smartphone & fast mobile internet.
  • A work culture focused on innovation and creation of lasting value.

Capco

Capco is a global management and technology consultancy dedicated to the financial services industry. Established in 1998, Capco operates across Europe, Asia, and the Americas, with a team of 7,000 professionals, including 180 in France. Capco Paris specializes in providing innovative solutions and expertise in banking to both French and international financial institutions. The firm focuses on areas such as risk management, regulatory compliance, and digital transformation within the financial sector.