Your tasks:
As part of the Global Sourcing Analytics team, you will support Data Engineers and Data Scientists in developing and implementing data flow automation projects.
– Collaborate closely with Data Engineers, Data Scientists, and Data Analysts to implement process automation from start to finish. The team is based mostly in Lodz and Zurich.
– Work with various teams to understand their needs, translate them into technical requirements, and identify potential automation solutions.
– Develop process & technical documentation.
– Report on automation tasks.
– Write clean, functional code.
We Offer:
– Paid internship
– Employment based on a mandate contract
– Flexible working time
– Work in a hybrid model (1-2 days a week in the office)
– An excellent and welcoming work environment
– Possibility of gaining valuable work-related experience
– Training and support of an experienced team
– In the office: chocolate, fresh fruit and vegetables available every day
– A relaxation space offering a variety of options including PlayStation 5, table football, pool, ping-pong, a bookcase, various board games, and massage chairs
– Sports at work? Absolutely! Our office is equipped with exercise bikes and a treadmill featuring a laptop station
Skills:
– You are a final-year student or graduate in Computer Science or a similar field of software / information systems development
– You have a good knowledge of the Python programming language
– You are able to collaborate on code using the Git version control system
– You have basic knowledge of the data engineering field (data warehouses, data structures, SQL, data modeling)
– You are eager to learn
– You are fluent in English
– You are able to collaborate in a team
Optional:
– You are familiar with the PySpark library and Spark technology
– You are familiar with cloud systems and know what the Microsoft Azure platform is
– You have knowledge of DevOps principles
– You have experience with Tableau, Power BI, or both