Data Engineer (Delivery Experience)
Allegro sp. z o.o.
The salary range for this position is PLN 12 300 - 17 600 in gross terms (contract of employment)
A hybrid work model agreed on with your leader and the team
We are looking for a Data Engineer to focus on data processing and preparation, as well as the deployment and maintenance of our ML/data projects. Join our team to sharpen your skills in deploying data-driven processes and MLOps approaches, and to share that knowledge within the team.
We are looking for people who:
- Have at least 3 years of experience as a Data Engineer working with large datasets
- Have experience with cloud providers (GCP preferred)
- Are highly proficient in SQL
- Have a strong understanding of data modeling and cloud DWH architecture
- Have experience in designing and maintaining ETL/ELT processes
- Are capable of optimizing cost and efficiency of data processing
- Are proficient in Python for working with large datasets (using PySpark or Airflow)
- Use good practices (clean code, code review, CI/CD)
- Have a high degree of autonomy and take responsibility for developed solutions
- Have English proficiency on at least B2 level
- Like to share knowledge with other team members
An additional advantage would be:
- Experience with Azure, cross-cloud data transfers and multi-cloud architecture

Technologies we use:
- Python, SQL, (Py)Spark
- Google Cloud Platform (Airflow, BigQuery, Composer)
What we offer:
- A hybrid work model that you will agree on with your leader and the team.
- We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
- A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
- A 16" or 14" MacBook Pro with an M1 processor, 32 GB RAM and an SSD, or a corresponding Dell with Windows (if you don't like Macs), plus other gadgets that you may need
- Hackathons, team-building trips, a training budget and an internal educational platform, MindUp (including training courses on work organization, communication, motivation and various technologies and subject-matter issues)
- English classes, paid for by us, tailored to the specific nature of your job
What will your responsibilities be?
- You will be responsible for developing and maintaining processes that handle large volumes of data
- You will be streamlining and developing the data architecture that powers our analytical products, working alongside a team of experienced analysts
- You will be monitoring and enhancing the quality and integrity of the data
- You will manage and optimize costs related to our data infrastructure and data processing on GCP
Why is it worth working with us?
- Big Data is not an empty slogan for us, but a reality - you will be working on really big datasets (petabytes of data)
- You will have a real impact on the direction of product development and technology choices. We use the latest and best available technologies, selected according to our own needs
- We are a close-knit team where we work well together
- You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
Send us your CV and learn why it’s #goodtobehere
What candidates love about our recruitment process?
- Substantive feedback and tips for further development
- Clear and fast contact during the entire process
- A practice-focused skills review
- Recruiters with knowledge and passion
- A relaxed atmosphere during the interview