On behalf of our client, a large bank in Switzerland, we are looking for a Big Data Lake Data Modeler.
Duration: 12 months, with the possibility of extension
Key Responsibilities
* Capture and model data requirements, definitions, business rules, and data quality needs.
* Create logical and physical data models using best practices to ensure data integrity and minimal redundancy.
* Perform reverse engineering of physical data models.
* Evaluate existing data models and databases for discrepancies and improvements.
* Work closely with Data & Analytics Program Management and other stakeholders to co-design the Enterprise Data Strategy.
* Develop and implement principles, best practices, and standards to ensure consistency in the Common Data Model.
* Collaborate with development teams to implement data strategies and build efficient data flows.
* Lead large-scale data migration projects within a Big Data on-premise environment (Lambda Architecture, Apache Stack).
* Apply expertise in metadata management and reference architectures to develop integrated, data-driven solutions.
Required Skills & Experience
* At least five years of professional experience in physical and relational data modeling.
* Proven ability to translate business needs into scalable data models.
* Experience with large data migration projects and Big Data architectures.
* Strong knowledge of data modeling standards, metadata management, and data cleansing.
* Hands-on experience with data transformation/modeling tools such as Alteryx, Dataiku, Tableau Prep, Hackolade, and Erwin.
* Expertise in financial sector data products, classification, and sensitivity handling.
* Knowledge of agile project management methodologies (Scrum, SAFe).