LB partners specializes in improving the EBIT of equipment manufacturers by optimizing the prices and margins of drawing spare parts.
LB partners acts on behalf of major companies as part of their corporate value growth plans, driven by management and shareholders.
LB partners offers companies its expertise, methods, and technological processes to optimize the sale of drawing parts along three dimensions:
* Pricing consistent with the perceived value of the service provided to customers,
* Margins optimized in line with the company’s added value,
* A customized technological process that guarantees the sustainability of the gains realized.
Join LB partners for a human and technological experience on key strategic projects with fantastic teams and customers.
Mission
Within our technical platform team, your mission will be to own the entire customer data journey, from and to our customers’ systems.
You will work closely with the whole team to understand data requirements, implement scalable data pipelines, and ensure data quality and integrity. This is an opportunity to work with the latest technologies and contribute to impactful projects that drive our company’s success.
Key responsibilities
* Collaborate with stakeholders to understand data needs and requirements.
* Design, build, and maintain robust and scalable data pipelines to ingest, process, and transform data from various sources.
* Optimize data workflows and implement best practices for data storage, retrieval, and processing.
* Ensure data quality and integrity by implementing data validation and cleansing techniques.
* Develop and maintain data architecture and data models to support analytical and operational needs.
* Work closely with data scientists and analysts to provide them with access to high-quality data for analysis and insights generation.
* Continuously evaluate and integrate new technologies and tools to enhance data processing efficiency and performance.
* Troubleshoot and resolve data pipeline issues in a timely manner.
Additionally, your mission will include:
* Engaging actively in the IT team dynamic, proposing new ideas, and contributing to the overall platform’s success.
* Participating in customer workshops, contributing to mutual success.
Qualifications
* Master’s degree in Computer Science, Engineering, or a related field.
* Minimum of 3 years of experience in data engineering or related roles.
* Hands-on experience with Linux systems.
* Solid understanding of distributed computing principles and big data technologies.
* Experience with data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts.
* Excellent problem-solving skills and attention to detail.
* Strong communication and collaboration skills, with the ability to work effectively in a fast-paced, dynamic environment.
Preferred Qualifications
* Experience with data processing frameworks such as Apache Spark, Apache Flink, or Apache Beam.
* Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
* Experience with containerization technologies such as Docker and orchestration tools like Kubernetes.
* Familiarity with machine learning and data science concepts.
* Experience with DevOps practices and tools for automation and CI/CD (Continuous Integration/Continuous Deployment).
What we offer
* A permanent contract with 5 weeks of annual leave.
* Flexible working options with up to 3 days of remote work per week.
* Occasional business trips (on average 2–3 days per month, mostly in Switzerland).