Big Data Lake Data Modeler
Contract Overview
On behalf of our client, a large bank in Switzerland, we are looking for a Big Data Lake Data Modeler.
Location: Zürich
Start date: 01.04.2025
Duration: 12 months, with the possibility of extension
Key Responsibilities
1. Capture and model data requirements, definitions, business rules, and data quality needs.
2. Create logical and physical data models using best practices to ensure data integrity and minimal redundancy.
3. Perform reverse engineering of physical data models.
4. Evaluate existing data models and databases for discrepancies and improvements.
5. Work closely with Data & Analytics Program Management and other stakeholders to co-design the Enterprise Data Strategy.
6. Develop and implement principles, best practices, and standards to ensure consistency in the Common Data Model.
7. Collaborate with development teams to implement data strategies and build efficient data flows.
8. Lead large-scale data migration projects within a Big Data on-premise environment (Lambda Architecture, Apache Stack).
9. Apply expertise in metadata management and reference architectures to develop integrated, data-driven solutions.
Required Skills & Experience
1. At least five years of professional experience in physical and relational data modeling.
2. Proven ability to translate business needs into scalable data models.
3. Experience with large-scale data migration projects and Big Data architectures.
4. Strong knowledge of data modeling standards, metadata management, and data cleansing.
5. Hands-on experience with data transformation/modeling tools such as Alteryx, Dataiku, Tableau Prep, Hackolade, or Erwin.
6. Expertise in financial-sector data products, data classification, and sensitivity handling.
7. Knowledge of agile project management methodologies (Scrum, SAFe).