Big Data (advanced)
Are you curious which product has saved Americans $30 billion on healthcare since 2011?
Sounds pretty good, doesn't it?
Would you like to join the team that continuously improves and develops this tool?
Nothing could be easier: apply and let's talk!
Collaborate with product managers, data scientists, data analysts, and engineers to define requirements and data specifications.
Develop, deploy, and maintain data processing pipelines using cloud technologies such as AWS, Kubernetes, Airflow, Redshift, and EMR.
Develop, deploy, and maintain serverless data pipelines using EventBridge, Kinesis, AWS Lambda, S3, and Glue.
Define and manage the overall schedule and availability for a variety of data sets.
Work closely with other engineers to enhance infrastructure, and improve reliability and efficiency.
Make smart engineering and product decisions based on data analysis and collaboration.
Act as the in-house data expert and make recommendations regarding standards for code quality and timeliness.
Architect cloud-based data infrastructure solutions to meet stakeholder needs.
Skills & Qualifications:
Bachelor's degree in analytics, statistics, engineering, math, economics, science, or a related discipline,
Professional experience in the big data space,
Experience engineering data pipelines on large-scale data sets using big data technologies (Spark, etc.),
Expert knowledge of complex SQL and ETL development, with experience processing extremely large datasets,
Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions,
Deep familiarity with AWS services (S3, EventBridge, Glue, EMR, Redshift, Lambda),
Ability to quickly learn complex domains and new technologies.
Salary: 35,000 PLN net (B2B),
100% remote work,
500 PLN/month for Multisport, medical care, etc.,
2,400 PLN/year for conferences, certificates, etc.,
20 fully paid days off.