Welcome to Hapag-Lloyd, a leading global logistics company. As the fifth largest container liner shipping company in the world, we are here to make sure that the flow of goods never stops. We are an international team of 12,800 employees working across 400 offices in 128 countries. This year we are growing even more and opening up Hapag-Lloyd’s very first Knowledge Center – in Tricity, Poland.
The Center, located in Gdańsk, will function as a hub for innovation and develop state-of-the-art business and technology solutions to help us navigate the future. And we want to do that together with you.
Not only are our vessels constantly on voyages around the globe – we as a company, and our internal IT department, are also on a constant journey toward our company’s vision of being the number one for quality in our industry.
We are looking for individuals who would like to be part of this endeavor and join our “BI, Analytics and Data Integration” domain as a “Data Engineer”, helping us to build up the product teams “Data Lake” and “Data Warehouse” in Gdynia. Our slogan is:
“Data is the currency of the future”
We are knowledgeable about all data that lives within or crosses the shores of our IT systems – and we are passionate about salvaging its value and making it usable.
And we want YOU to be part of our IT team!
Be part of the vibrant IT development within logistics and liner shipping if you are passionate about data and analytics and believe that the biggest enemy of a good solution is an even better one.
For our location in Gdańsk we are looking for a
Data Engineer - Knowledge Center (m/f/d)

Responsibilities and Tasks:
Your responsibilities will be multifaceted, as the journey has just started.
- Build and maintain complex data management systems that combine core data sources into data warehouses or other accessible structures
- Manage data pipelines consisting of a series of stages through which data flows
- Drive automation through effective metadata management
- Learn and use modern data preparation, integration and AI-enabled metadata management tools and techniques
- Perform data conversions, imports and exports of data within and between internal and external software systems
- Develop programs to optimally extract, transform, and load data between complex data sources
- Design and implement processes to ensure data integrity and standardization
- Recommend and implement data reliability, efficiency, and quality improvements
- Ensure the collected data is within required quality standards
- Resolve conflicts between models, ensuring that data models are consistent with the enterprise model (e.g., entity names, relationships and definitions)
- Troubleshoot, diagnose, document and resolve escalated support problems
- Work with your teammates to implement governance for role-based access control (RBAC), cost containment and cloud provider account management
- Use technology to implement automation and orchestration
Requirements and Qualifications:
A bachelor’s or master’s degree in computer science or business administration is preferred, but not required. Much more important are your experience and your attitude.
- Three to five years of relevant experience in IT combined with logistics acumen (or vice versa) that equips you to communicate effectively with diverse stakeholders
- Passion to accompany a product throughout the full life cycle, and work to balance short-term achievements with long-term goals (such as minimizing technical debt while maximizing delivery frequency of business results)
- Strong belief that data governance is much more than a cumbersome task
- Knowledge of trends and developments in the fields of data management and data analytics
- Information management and quantitative skills, including working knowledge of IT infrastructure, various technologies/platforms (AWS, Qlik…), and enterprise-specific vendor solutions
- Ability to understand problems from a broad perspective and anticipate the impact of administrative issues and solutions
- Familiarity with big data technologies and Extract, Transform, Load (ETL) tools in a cloud-based (AWS) environment
HR Management • Mrs Mateusz Grabarski
Ballindamm 25 • 20095 Hamburg