GERMAN-SPEAKING BIG DATA ENGINEER

  • Remote
  • €5,000 - €8,000 per month
  • DATA

Job description

ABOUT THE COMPANY

LITIT, a joint venture between NTT DATA and Reiz Tech, is a company with deep-rooted industry know-how, dedicated to innovation within the IT sector. Its primary focus is delivering high-quality solutions in the DACH region. With a commitment to excellence, LITIT combines the best of German precision, Japanese work ethics, and Lithuanian talent to provide unparalleled IT service and support to its clients.

ABOUT THE CLIENT

Our client is a well-established company in the financial and accounting sector, trusted by thousands of professionals and organizations. They focus on building secure and reliable digital solutions that make everyday work easier and more efficient. With a strong commitment to innovation, they are expanding their data platforms to support smarter services, modern architectures, and data-driven decision-making.

ABOUT THE ROLE

We are looking for a German-speaking Big Data Engineer to support the design, development, and optimization of scalable data pipelines and architectures. You will work with modern open-source technologies such as Apache Iceberg, Apache NiFi, and Trino to build reliable, secure, and efficient data platforms. In this role, you will collaborate with interdisciplinary teams to deliver data-driven solutions, ensure data quality and governance, and help shape the technical backbone of our data infrastructure.

RESPONSIBILITIES

  • Design and implement scalable ETL/ELT pipelines using Apache NiFi

  • Build and maintain Iceberg-based data lakehouse structures

  • Develop and optimize SQL queries and data access with Trino

  • Integrate diverse data sources (e.g., Kafka, S3, RDBMS, REST APIs)

  • Ensure data quality, security, and governance across systems

  • Perform performance tuning and monitoring of the data infrastructure

  • Work closely with cross-functional teams to deliver data-driven products

  • Document technical concepts, processes, and best practices

REQUIREMENTS

  • Hands-on experience with Apache Iceberg, Apache NiFi, and Trino

  • Ability to read and understand Apache Spark code

  • Strong knowledge of ETL/ELT processes and API integrations

  • Experience with real-time streaming (Kafka) and handling large-scale data

  • Cloud experience with AWS, Azure, or GCP (e.g., S3, IAM, Glue)

  • Strong SQL skills and knowledge of performance tuning

  • Programming in Python, Java, or Scala

  • Scripting skills (Bash)

  • Containerization: Docker, Kubernetes

  • CI/CD & Infrastructure-as-Code: Git, GitHub Actions, Jenkins, Terraform

  • Monitoring & Logging: Prometheus, Grafana, ELK Stack, Splunk

  • Willingness to travel as required by project or client needs, including occasional domestic or international travel, sometimes on short notice

Nice to have:

  • Experience with schema evolution, partitioning, and time travel in Apache Iceberg

  • Advanced flow design and error handling in Apache NiFi

  • Deep expertise in Trino SQL and data source integration

  • Familiarity with additional technologies such as Kotlin

WHAT WE OFFER

  • Learning opportunities with compensated certificates, learning lunches, and language lessons.

  • Chance to switch projects after one year.

  • Team building twice a year.

  • Office in Vilnius, Lithuania that offers themed lunches and a pet-friendly environment.

  • Remote work opportunities.

  • Flexible time off, depending on the project.

  • Seasonal activities with colleagues.

  • Additional health insurance and loyalty days for Lithuanian residents.

  • Referral bonuses.

  • Recognition of important occasions in your life.
