Data Engineer (GCP/HADOOP)
Capco Poland
Athens, Greece
2 days ago
Source: Just Join IT

GCP (advanced)

Elasticsearch / Logstash / Kibana (advanced)


Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry.

We are passionate about helping our clients succeed in an ever-changing industry.

We are also:

  • Experts in banking and payments, capital markets, wealth and asset management
  • Focused on maintaining our nimble, agile, and entrepreneurial culture
  • Committed to growing our business and hiring the best talent to help us get there

    We are looking for Data Engineers to collect, store, process, and analyze large data sets as part of the client's Wholesale Chief Data & Analytics Office.

    Our client's Big Data Lake is the largest aggregation of data within financial services, with over 300 sources and a rapidly growing book of work.

  • Deliver an ecosystem of curated, enriched, and protected sets of data created from global, raw, structured, and unstructured sources
  • Collect, store, analyze, and leverage data
  • Integrate data with the architecture used across the company
  • Build core services that power Machine Learning and analytics systems
  • Data Engineering and Management
  • Data development process: design, build, and test data products that are complex or large-scale
  • Promote development standards, code reviews, mentoring, testing, and scrum story writing
  • Cooperate with customers / stakeholders
  • Challenges :

    1. Refactoring the current technology stack and architecture from on-premises Hadoop to Google Cloud Platform

    2. Integrating with an established, complex, multi-tenant Hadoop-based platform


  • Experience with data pipeline building technologies: PySpark, Scala, Hive, Java
  • Good knowledge of data warehouse concepts
  • Proficient in SQL and relational database design
  • Elasticsearch experience (Elasticsearch / Logstash / Kibana, etc.)
  • Google Cloud Platform knowledge
  • Knowledge and experience of the Hadoop ecosystem and data management frameworks
  • Knowledge of CI/CD, Agile, DevOps, and the Software Development Life Cycle (SDLC)
  • Excellent communication, interpersonal, and decision-making skills
  • Good command of English

  • Employment contract and/or business-to-business (B2B) contract, as you prefer
  • Possibility to work remotely
  • Speaking English on a daily basis, mainly with foreign stakeholders and peers
  • Multiple employee benefits packages (multisport card, private medical care, lunch card)
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • Being part of the core squad focused on the growth of the Polish business unit
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A work culture focused on innovation and creating lasting value for our clients and employees

  • Screening call with the Recruiter
  • Home assignment if required
  • Technical/competencies interview with a Capco Hiring Manager
  • Client’s interview
  • Feedback / Offer