Get ready to
- Ensure stable and timely data transfer
- Create integrations between various applications
- Create data pipelines from multiple sources using tools and/or programming languages
- Maintain production-ready data quality and resolve issues such as missing or inconsistent data
- Be able to work independently and within the team
- Take ownership of tasks and initiatives
- Be up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
We expect you to
- Have at least 2 years of data engineering and/or analytics engineering experience
- Understand web protocols (HTTP) and concepts (RESTful APIs, webhooks, etc.)
- Be proficient in programming languages such as Python and SQL
- Understand common data formats (JSON, YAML, CSV) and data structures (arrays, dictionaries, hashmaps, etc.)
- Have experience with a cloud platform, preferably GCP (Google Cloud Platform)
- Have experience with data orchestration (e.g. Airflow, Dagster) and data modeling (dbt) tools
- Have experience with DevOps tools such as Kubernetes and Docker
- Have strong communication skills: timely, clear, and consistent sharing of information, work progress, and findings
- Be fluent in English and Lithuanian.
Salary
The gross salary range is 3300-4200 EUR/month.
Location
We have amazing offices in Vilnius (HQ) and Kaunas.