Welcome to Hapag-Lloyd, a leading global logistics company. As the fifth largest container liner shipping company in the world, we are here to make sure that the flow of goods never stops. We are an international team of 12,800 employees working across 400 offices in 128 countries. This year we are growing even more and opening up Hapag-Lloyd’s very first Knowledge Center – in Tricity, Poland.
The Center, located in Gdańsk, will function as a hub for innovation and develop state-of-the-art business and technology solutions to help us navigate the future. And we want to do that together with you.
Our Mission - Your Chance
We are on a mission to build a world-class AI team capable of supporting a world-class shipping company like Hapag-Lloyd in staying best in class with intelligent, customer-centric services.
Are you passionate about machine learning, NLP, CNNs, GANs, RL and other AI methods? Then come on board, because we have plenty of real business cases and labeled data waiting to be analyzed and brought to life.
For our location in Gdańsk we are looking for a
Junior Data Scientist (m/f/d)
Responsibilities and Tasks:
- Design data- and model-driven solutions for tasks that currently require human involvement or are too complex for standard engineering approaches.
- You orchestrate AI technologies, analysis methods and statistics to uncover the missing links in current processes and solutions and to unlock innovative solutions that can evolve from a simple prototype into full-blown intelligent enterprise products.
- You analyze large amounts of structured and unstructured data, augment it and identify patterns by applying state-of-the-art data-mining methodologies, tools and libraries.
- You eagerly evaluate and test new papers, libraries and third-party solutions, and actively participate in AI communities through meetups, conferences, hackathons or Kaggle competitions, always looking for emerging opportunities with high innovation potential for Hapag-Lloyd.
AI Product development
- You are a key player whenever new business-case-specific AI modules are developed.
- You analyze and understand the business case, processes and the available data structures
- consult the product teams on the required training data
- support the data engineers in implementing data cleaning
- lead the feature engineering process
- support the data engineers in building the ETL pipeline
- develop the model experiments and deliver the production-ready model
- operationalize the model
- participate in implementing the learning loop and the automatic evaluation of retrained models for deployment
- participate in the design and implementation of model performance monitoring
- Support product teams in handling AI-module-related incidents quickly and in a solution-oriented manner.
AI Platform Development
- You are a key player in building the generic, standardized and highly reusable Hapag-Lloyd data science development and analysis stack, composed of tools, services and modules, which enables the AI team as well as business and system analysts to continuously improve the time to market and cost efficiency of AI solutions.
- You organize training and information sessions for IT and business departments, as well as public community events, on various AI topics to spread knowledge and awareness of the possibilities, limits and future of AI in logistics and IT.
Requirements and Qualifications:
A bachelor’s or master’s degree in computer science, business administration, mathematics, physics or another scientific field is preferred but not required. Much more important are your experience and your attitude.
One to two years of relevant experience in enterprise-level IT that has equipped you to communicate effectively with diverse stakeholders at the corporate level is a good starting point.
- One year of hands-on development experience, preferably with backend involvement, would be great. Experience with batch processing, web services or test-driven development is a plus.
- SQL, Python and the related development stack, including Git, common IDEs and ML frameworks, are important assets for your endeavor. Java, JavaScript or C++ would be helpful but are optional.
- Hands-on experience with relational DBMSs such as Db2, SQLite or PostgreSQL, or with NoSQL databases, would be a plus.
- You should be used to working with various data formats, from tabular data such as CSV to markup formats such as HTML and common transfer formats such as XML and JSON. Hands-on experience with MS Office, image, audio or video formats is a plus but not initially required.
- Cloud experience, e.g. with AWS or Azure, and related concepts and protocols of distributed computing would be helpful but is not a must.
Data Science experiences
- You should have a good understanding of linear algebra for working with vectors and matrices, statistics, and multivariate calculus (integration, derivatives, gradients, optimization), and you should be able to come up with numerically stable algorithms.
- You’ll need some hands-on experience in building ETL pipelines.
- At a minimum, you should have experience loading data from files. Being able to consume data streams, extract data from databases or use REST APIs would be interesting but is initially optional.
- For ETL you’ll need some experience with data cleaning, evaluating basic data statistics, imputation and other transformation methods. More is better.
- You know basic methods for confirming that data matches model assumptions.
- You should have some hands-on experience at every level of an ML pipeline: data preparation, adequate visualization, model types, model testing and evaluation, as well as unsupervised learning methods.
- You should bring basic hands-on experience in at least one of the following disciplines: NLP, CNNs, RNNs, GANs, … and related ML frameworks such as scikit-learn, Keras, PyTorch or TensorFlow. More is better.
- Being able to present and explain data with appropriate diagrams, as well as experience with current data science platforms such as Anaconda, Dataiku or RapidMiner, is a plus.
- You’ll need to be able to understand, explain and discuss complex topics in fluent English. German or any other language is a plus in the context of multilingual models.
- It will be helpful to have experience working in a Scrum team. Classical project management skills such as task breakdown, requirements engineering and make-or-buy analysis will also be helpful.
- Your good analytical understanding of complex interrelationships and basic experience in pre-processing and evaluating large amounts of data will support you in dealing with exciting questions.
- You want to solve challenging problems, show high commitment and want to make a difference.
- You enjoy sharing your expert knowledge with others and thereby generating new knowledge.
- Furthermore, you can present results to your team and our business stakeholders in a simple, understandable way, whether in Polish or English.
- Strong troubleshooting and problem-solving skills.
- You thrive in a fast-paced, innovative environment.
- You enjoy working in an international team and are passionate about new technologies and software.
Hapag-Lloyd Aktiengesellschaft (Spółka akcyjna) Oddział w Polsce
HR Management • Mrs Mateusz Grabarski
Al. Grunwaldzka 413 • 80-309 Gdańsk