Get ready to
- Employ tools in our GCP stack (dbt, Airflow, and Looker) to enhance data quality, efficiency, and the delivery of accurate and timely data
- Identify, investigate, and resolve data issues such as quality problems, discrepancies, and missing data
- Contribute to the development and improvement of data solutions
- Leverage dbt to solve complex modeling problems, with a focus on performance, robustness, and scalability
- Work on prevention and alerting solutions
- Adopt and refine our best practices, e.g., naming conventions, data modeling, and data quality testing
- Communicate with cross-functional teams and non-technical stakeholders in a clear and structured manner
- Assist and support other team members in the design, development, and implementation of data warehousing, reporting, and analytics solutions
- Take ownership of tasks and initiatives
We expect you to
- Have at least 3 years of proven SQL skills: the ability to join and manipulate data of various types (String, Integer, JSON, Array), write parameterized scripts, and debug SQL code
- Know ETL and warehousing concepts
- Have strong communication skills: timely, clear, and consistent sharing of information, work progress, bottlenecks, and findings
- Have a curious, growth-oriented mindset geared toward constant learning
- Be able to understand, tackle, and communicate problems from both technical and business perspectives
- Be fluent in English and Lithuanian
Salary
The gross salary range is 4,000–5,200 EUR/month.
Location
We have plenty of amazing workspaces you can choose from: our awesome headquarters in Vilnius, and super cool hubs in Kaunas and Klaipėda.
In Vilnius: 4 days onsite, 1 flexible remote day. In Kaunas and Klaipėda: onsite, hybrid, or remote options are available, depending on the role.