Get ready to
- Ensure stable and timely data transfer
- Create data pipelines from multiple sources using tools and/or programming languages
- Maintain production-ready data quality and resolve issues related to missing or inconsistent data
- Work independently and as part of a team
- Take ownership of tasks and initiatives
- Stay up to date with industry standards and technological advancements that will improve the quality of your outputs
We expect you to
- Have at least 2 years of data engineering and/or analytics engineering experience
- Be proficient in programming languages such as Python and SQL
- Understand various data formats (JSON, YAML, CSV) and structures (arrays, dictionaries, hashmaps, etc.)
- Have experience with a cloud platform, preferably GCP (Google Cloud Platform)
- Have experience with data orchestration tools (e.g. Airflow, Dagster) and modeling tools (e.g. DBT)
- Experience with DevOps tools such as Kubernetes and Docker would be a plus
- An understanding of protocols (HTTP) and concepts (RESTful APIs, webhooks, etc.) would be a plus
- Have strong communication skills: timely, clear, and consistent sharing of information, work progress, and findings
- Be fluent in English and Lithuanian
Salary
The gross salary range is 3,000 – 4,500 EUR/month.
Location
We have plenty of amazing workspaces you can choose from: our awesome headquarters in Vilnius, and super cool hubs in Kaunas, Klaipėda, and Riga!
In Vilnius: 4 days onsite, 1 flexible remote day. In Kaunas: onsite, hybrid, or remote options are available depending on the role.