Job description:
- Oversee the maintenance, updates, and expansion of the in-house exchange connector library.
- Ensure the data infrastructure is reliable, scalable, and well maintained, handling both real-time and historical data.
- Implement and enforce data security and integrity measures, ensuring compliance with industry standards.
- Proactively monitor and optimize the performance and efficiency of data systems.
- Collaborate closely with the trading team to deliver accurate, timely data, creating simple, effective scripts to streamline data flow.

About the customer:
On behalf of our client, a leading proprietary trading firm, we are seeking a highly skilled Data Engineer (Trading Engine Operations). The role covers the management of both real-time and historical data, along with the continuous enhancement and expansion of the exchange connector library. The ideal candidate is an expert in Python, system optimization, and debugging, with a strong drive to build high-quality, PnL-focused solutions. This is a full-time (100%) contract role with a duration of 6 months, based on-site in Zurich.

Requirements:
- Minimum of 2-3 years of experience in software development, with a strong focus on data processing.
- Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Proven experience with exchange connector development and integration.
- Advanced proficiency in Python (Rust experience is a plus).
- Expertise in Unix-based systems and cloud platforms, particularly AWS.
- In-depth knowledge of real-time data messaging, in particular Redis and WebSockets.
- Strong command of SQL (Postgres) and hands-on experience with NoSQL data warehousing.
- Familiarity with deployment and monitoring tools such as Supervisor, Docker, Grafana, and Nagios.
- Solid experience working with RESTful APIs.
- Experience in the finance or cryptocurrency sectors is highly desirable.
- Exposure to low-latency systems or high-frequency trading environments is advantageous.
- Ability to develop efficient and intuitive scripts to streamline data flow.
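To give candidates a concrete sense of the connector work described above, here is a minimal sketch of one common task: normalizing trade messages from different exchanges into a single in-house schema. The exchange names ("alpha", "beta") and their field names are purely hypothetical; real connectors follow each exchange's documented message format.

```python
import json
from dataclasses import dataclass


@dataclass(frozen=True)
class Trade:
    """Common schema used downstream, regardless of source exchange."""
    symbol: str
    price: float
    size: float
    ts_ms: int  # event timestamp in milliseconds


def normalize(exchange: str, raw: str) -> Trade:
    """Map a raw JSON trade message from a known exchange into a Trade.

    Field names below are illustrative placeholders, not any real
    exchange's API.
    """
    msg = json.loads(raw)
    if exchange == "alpha":
        # Hypothetical exchange sending prices/sizes as strings.
        return Trade(msg["sym"], float(msg["px"]), float(msg["qty"]), int(msg["t"]))
    if exchange == "beta":
        # Hypothetical exchange using verbose field names and numeric types.
        return Trade(msg["symbol"], float(msg["price"]),
                     float(msg["amount"]), int(msg["timestamp"]))
    raise ValueError(f"no connector for exchange: {exchange}")


# Two payloads with different shapes normalize to the same Trade.
a = normalize("alpha", '{"sym": "BTC-USD", "px": "65000.5", "qty": "0.25", "t": 1700000000000}')
b = normalize("beta", '{"symbol": "BTC-USD", "price": 65000.5, "amount": 0.25, "timestamp": 1700000000000}')
assert a == b
```

In practice, a script like this would sit between the WebSocket feed handlers and the downstream consumers (Redis channels, Postgres tables), so that trading-side code only ever sees the common schema.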