At Julius Baer, we celebrate and value the individual qualities you bring, enabling you to be impactful, to be entrepreneurial, to be empowered, and to create value beyond wealth. Let’s shape the future of wealth management together.
In our team, you will be a key contributor to building our Data Platform as a foundation for Data Analytics and AI.
YOUR CHALLENGE
* Design, develop, and maintain data pipelines and backend services for real-time decisioning, reporting, data collection, and related functions
* Define data requirements and develop data models
* Manage and transform data across the platform
* Produce high-quality, well-tested, and secure code
* Develop and maintain software designed to improve data governance and security
* Develop processes designed to ensure Data Security and Data Quality
YOUR PROFILE
Required experience and skills:
* Experience with structured, semi-structured, and unstructured data
* Experience utilizing a variety of data stores, including data warehouses, RDBMSes, in-memory caches, and searchable document databases
* Experience working with large data sets in SQL/Databricks/PySpark
* Experience with Spark Streaming and Delta (Live) Tables
* Experience with storage systems such as Azure Storage / Data Lakes
* Experience with data modelling
* Strong design, implementation, and testing skills
* Experience developing for continuous integration and automated deployments
* Experience developing on cloud platforms (preferably Azure) in a continuous delivery environment
Additional or Preferred Qualifications:
* Experience with microservices platforms (Kubernetes, Docker, Helm charts, etc.)
* Experience with Event-Driven streaming systems (Kafka, Event Hub, Event Grid, Apache Flink)
* Knowledge of and experience with DBT
* Experience with Data Vault modelling
* Experience with Microsoft Fabric
* Experience with BI tools
* 1+ years of software development using languages such as JavaScript, Java, or C#
Education and soft skills:
* BS/MS in Computer Science, Data Analytics, Data Management, Information Systems, or a related technical field, or equivalent training and experience
* 5+ years incorporating data processing and workflow management tools into pipeline design
* 5+ years of experience in ETL/ELT, data warehousing, and/or business intelligence development
* 5+ years building and maintaining end-to-end data systems and supporting services in Python, Scala, or similar languages
* 5+ years using SQL to understand data, investigate data issues, and provide solutions
* 3+ years working with cloud data technologies
* Excellent communication and collaboration skills
* Ability to provide technical leadership to other developers
* English is a must; German is optional
We only consider candidates who can start immediately.
We look forward to receiving your full application through our online application tool.