Airflow (regular)
Microservice Architecture (advanced)
Hadoop (advanced)
Type of contract: B2B
Location: Warsaw, Marynarska 12 (fully remote)
Remote recruitment
If you are not afraid of challenges, have an open mind, and want to develop yourself and learn the latest technologies, this offer is for you!
Technology stack: Hadoop, Kubernetes, Spark, Spark Streaming, Kafka, Git, MongoDB, Trino, Airflow, Cassandra, Hive
Languages: Scala, Java, Python, SQL
Team: We currently have five development teams, over 50 people in total.
Responsibilities:
Designing a state-of-the-art, git-oriented, code-as-a-service, microservice-based Big Data Lakehouse platform that addresses various business use cases, covering architecture, technologies, data model, integrations, and data transformations
Supervising the operation and configuration of the Data Lake
Working in an agile team focused on delivering business results quickly while maintaining good coding standards and following the defined architecture
Mentoring team members
What do you bring?
Have a minimum of 5 years' experience in designing Big Data solutions and data models
Have strong knowledge of Data Lakehouse, Big Data and NoSQL concepts
Have experience designing event-driven and batch data-processing pipelines
Have experience configuring open-source Big Data components
Have practical knowledge of microservice architecture
Experience with Agile is a strong plus
Experience with public cloud Big Data solutions is a plus
Knowledge of UML and Enterprise Architect is a plus