Docker (regular)
Enterprise Architect (regular)
Kafka (regular)
We are looking for a Kafka Integration Architect for our client, who supports millions of internal and external customers with state-of-the-art IT solutions to everyday problems.
Our client is dedicated to bringing digital innovation to every aspect of the insurance landscape, implementing AI, IoT, voice recognition, Big Data science, advanced mobile solutions and much more to accommodate customers' future needs around the globe.
How you will get the job done
Managing day-to-day support issues, with the experience and responsibility to resolve more complex incidents and problems,
Working to SLA thresholds for incidents, requests and problems,
Prioritizing and managing workload effectively, handling several open incidents and problems in parallel,
Monitoring systems and performance,
Performing trend analysis and problem management,
Reviewing services, application delivery and patching,
Working with a variety of business and technical teams to enhance the service.
Skills and experience you will need
Professional experience in enterprise architecture and in software development
Profound knowledge of enterprise architecture, enterprise architecture methodologies, governance structures and frameworks
Knowledge of the entire software life cycle and corresponding methods of specification, implementation, testing, commissioning and migration / replacement
Experience with monitoring tools and logging systems such as Prometheus and Grafana
Solid experience with Kafka or similar large-scale distributed data systems
Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
Experience in system deployment and container technology: building, managing, deploying, and release-managing containers and container images based on Docker, OpenShift, and / or Kubernetes
Experience in developing resilient, scalable distributed systems and microservices architectures
Experience with various distributed technologies (e.g. Kafka, Spark, CockroachDB, HDFS, Hive, etc.)
Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
Experience with Continuous Integration / Continuous Delivery (CI / CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus.
Understanding of SDLC processes (Agile, DevOps), Cloud Operations and Support (ITIL) Service Delivery
Experience with DevOps transformation and cloud migration to AWS, Azure, Google Cloud Platform, and / or hybrid / private cloud, as well as with cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
Analyze requirements for improvements in existing and new cloud-native functionalities
Partner with the business and consult on application- and interface-related topics
Plan and manage the integration of new and existing applications
Design strategically important integration solutions based on Kafka
Expertise in Kafka producers and consumers, Kafka Connect, Kafka Streams, KSQL, and running machine learning models directly on Kafka, taking the entire IT landscape into account
Develop Kafka components, either on Kafka clusters or as Java microservices
Develop microservices with Kafka integration using Java and common microservices frameworks such as Spring Boot
Design and build the infrastructure and governance for CI / CD
Ensure smooth transition of projects into operations
Monitor application performance and data quality
Plan for security, performance, uptime, disaster recovery, and capacity growth