Octopus is a reverse-logistics startup that collects post-consumer products as part of a circular economy.
Job Description
Responsibilities
Design, build, and maintain the company’s single source of truth (data lake, data warehouse, and data marts), dashboarding/reporting platforms, and other data products.
Work closely with BI analysts and data scientists to implement a variety of descriptive and predictive analytic solutions.
Work closely with the product and engineering teams to ensure the quality and accountability of the data ecosystem.
Requirements
2-3 years of experience as a data engineer
A clean coder with a strong programming and software engineering background (Python highly desired)
Highly proficient with a variety of data engineering tech stacks (e.g. Airflow, Kafka, Beam)
Extensive exposure to at least one major cloud platform (Google Cloud Platform preferred; otherwise AWS, Azure, or Ali Cloud)
Knowledge of real-time analytics and/or data infrastructure is required; hands-on experience with either is highly desirable (e.g. Docker, Kubernetes)
Good understanding of data pipeline fundamentals, and experience implementing data lakes, data warehouses, and data marts with current data tech stacks.
Knowledge of or experience with large-scale applications and asynchronous architectures is a major plus