Data Engineer (Finance Data)
Why project44?
Our mission is to usher in a new era of trust and predictability to transportation and logistics, but there’s still a lot of work to be done.
The industry is massive, as is the opportunity. We’re looking for bright, ambitious individuals to join our growing global team and help us enable a more productive and successful world.
We’re changing the way the world ships, and we’re looking for you to help us get there!
About The Role
The Product Operations & Analytics team is a key part of project44's cross-functional product development teams, helping to drive every stage of product development through expertise in data analytics, statistics, and operations.
As a Data Engineer, you'll work on ETL pipelines, solve a range of data integration challenges, and improve our core data infrastructure.
This is a newly created role that brings the best of our Product and Engineering capabilities to delivering reporting and analytics solutions for our internal stakeholders.
What You’ll Be Doing
Design, build, and improve our ETL platform and pipelines using Airflow or Argo Workflows, or third-party tools such as Fivetran, and integrate them with our data lake (Snowflake)
Ensure standards for engineering excellence, scalability, reliability, and reusability
Debug production issues across services and multiple levels of the stack
Partner with the insights and data science teams to automate processes that improve data sets for analytical and reporting needs
Write test cases, maintain code coverage, perform QA, and participate in UAT with stakeholders
You could be a great fit if you have
2+ years of experience building data pipelines with third-party tools (e.g., Fivetran) or with Airflow, Spark, and/or Kafka Streams
Proficiency in Python, Java, or similar languages
Familiarity with the architecture of event collection pipelines and analytical data stores such as Snowflake
Bachelor's degree in Computer Science, or equivalent experience
Technical Skills
Strong programming and scripting skills for building and maintaining ETL pipelines in Java, SQL, Python, Bash, or Go
In-depth, hands-on knowledge of public clouds (GCP preferred, or AWS), PostgreSQL (9.6+), Elasticsearch, MongoDB, MySQL/MariaDB, and Snowflake
Willingness to participate in an on-call rotation to mitigate data pipeline failures
Strong experience with Kafka or an equivalent event-streaming system
Experience with Docker and Kubernetes
Experience developing and deploying CI/CD pipelines for data engineering
Experience optimizing database performance and capacity utilization to provide high availability and redundancy
Proficiency with high-volume OLTP databases and large data warehouse environments
Ability to work in a fast-paced, rapidly changing environment
Understanding of Agile and how to apply it to data warehouse development
Professional Skills / Competencies
Focuses on developing and improving frameworks that support repeatable, scalable solutions
Demonstrates excellent interpersonal skills; communicates clearly and concisely
Takes the initiative to recommend and develop innovative approaches to getting things done
Familiarity with financial systems data
Is a team player and encourages collaboration
About project44
Since 2014, project44 has been transforming the way one of the largest, most important global industries does business. As transportation and logistics continues to evolve and customer expectations around delivery become more demanding, industry technology must rise to the occasion.
In just a few short years, we’ve created a digital infrastructure that eliminates the inefficiencies caused by dated technology and manual processes.
Our Advanced Visibility Platform is used by the world’s leading brands to track shipments, collaborate with supply chain partners, drive operational efficiencies, and create outstanding customer experiences.