Data Streaming Engineer


For a company focused on banking and credit card processing, we are looking for a Senior Data Streaming Engineer.

IT Delivery Data Services:

• Setup and operation of Kafka clusters managed on-premise and/or in cloud-based environments (AWS, Confluent Cloud)
• Definition of topics and use cases using Kafka Streams & kSQL
• Implementation of custom Kafka consumers
• Configuration of topics, producers, consumers
• Implementation of stream processing
• Maintenance of Kafka security and encryption
• Ensure automated deployment of new Kafka versions and changes
• Define and participate in overall solution monitoring
• Cooperation with customers on integrating Kafka into their end-to-end services
• Define and drive the implementation of methodologies and new processes in accordance with the “big picture” of the company’s data model and Group architecture principles
• Support prioritization of product backlog
• Shape the future setup of a dedicated product team
• Participate and contribute in the international community of practice
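To illustrate the kind of work the Kafka configuration and security duties above involve, here is a minimal sketch of the property set a custom consumer might be created with. The broker address, consumer group, and security mechanism are placeholder assumptions, not details from this posting.

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    // Builds the property set a custom Kafka consumer would be constructed with.
    // All concrete values below (broker host, group id, SASL mechanism) are
    // hypothetical examples for illustration only.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker.example.com:9093"); // hypothetical broker
        props.put("group.id", "card-transactions-consumer");             // hypothetical group
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        // Encrypted transport + authentication, of the kind the security duty covers
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        // Commit offsets manually only after a record is fully processed
        props.put("enable.auto.commit", "false");
        return props;
    }

    public static void main(String[] args) {
        Properties p = consumerProps();
        System.out.println(p.getProperty("security.protocol")); // prints SASL_SSL
    }
}
```

In a real deployment these properties would be passed to a `KafkaConsumer` from the kafka-clients library; only the plain `Properties` assembly is shown here so the sketch stays self-contained.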


Requirements:

• English – Advanced (C1)
• Minimum 1 year of experience (we are also open to hiring more senior candidates)
• Experience with SQL/PLSQL
• Development experience with Java, REST APIs, microservices, message bus technologies
• Strong knowledge of CI/CD and DevOps best practices and tools such as Git and Jenkins
• Knowledge of Kafka configuration, authorization, and authentication patterns
• Experience with Spring Framework / Spring Cloud and Kafka topics is a big advantage
• Experience with builds and deployments
• Experience with Ansible, Kubernetes, and Docker is an advantage
• Strong problem-solving skills
• Experience working in an agile environment (Scrum) is an advantage
• 2+ years of experience with operated Kafka environments, preferably in the financial services industry, is an advantage

Location: FULL REMOTE