Job Title: C / C++ Developer
The firm's performance relies heavily on efficient data-processing pipelines. You will regularly draw on your development skills to find new ways to optimize code.
Routine Challenges:
* Rapid communication with exchanges via low-latency networking code;
* Context-switch-free code;
* Design and implementation of custom data storage structures with minimal footprint;
* Data pipelines using a streaming paradigm;
* Complex trading logic for the decision engine with the least possible compute time;
* Reimplementation of existing code using advanced CPU features such as SIMD (see the sketch after this list).
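To give a concrete flavour of that last point, here is a minimal sketch (illustrative only, not code from the firm's codebase) of rewriting a scalar reduction with AVX2 intrinsics; the function names and the AVX2 target are assumptions made for the example.

```cpp
#include <immintrin.h>  // AVX2 intrinsics
#include <cstddef>

// Scalar baseline: sum an array of floats.
float sum_scalar(const float* data, std::size_t n) {
    float total = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        total += data[i];
    return total;
}

// SIMD rewrite: 8 floats per iteration (compile with -mavx2).
// Note: this reassociates the additions, so results can differ slightly.
float sum_avx2(const float* data, std::size_t n) {
    __m256 acc = _mm256_setzero_ps();
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8)
        acc = _mm256_add_ps(acc, _mm256_loadu_ps(data + i));

    // Reduce the 8 partial sums in the vector register to one float.
    __m128 half = _mm_add_ps(_mm256_castps256_ps128(acc),
                             _mm256_extractf128_ps(acc, 1));
    half = _mm_hadd_ps(half, half);
    half = _mm_hadd_ps(half, half);
    float total = _mm_cvtss_f32(half);

    for (; i < n; ++i)  // scalar tail for the remaining elements
        total += data[i];
    return total;
}
```

In practice one would also check what the auto-vectorizer already emits before hand-writing intrinsics.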
To accommodate the firm's expansion, the firm is investing more effort in the maintainability and manageability of large, highly optimized, multithreaded codebases while preserving their primary purpose: low latency. You will need to navigate between these two worlds to achieve the best results.
Role Responsibilities:
In addition to coding, the Tech team builds and maintains the global hardware infrastructure that supports trading. In your role, you will regularly be involved in all aspects of the pipeline and its different tech stacks, from hardware composition and network design to data-logging pipelines for the traders and quants.
Requirements:
* Comprehensive knowledge of C and C++ on Linux;
* An understanding of the assembly the compiler will produce from the code you write, and the ability to verify it;
* An understanding of which system calls your code invokes and what they cost (see the sketch after this list);
* Knowledge of x64 hardware and how to use it efficiently;
* The ability to select or implement appropriate storage structures for a given use case;
* Ability to work with debuggers and profilers;
* Sound day-to-day development practices (git, documentation);
* Ability to learn from and contribute knowledge to the team;
* Proactive, self-motivated, honest, adaptable, and stress-resistant.
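As an illustration of the assembly and system-call points above, here is a minimal sketch (an assumed setup, not firm code) that makes the per-invocation cost of a raw system call concrete; inspecting the generated assembly with standard tooling such as g++ -O2 -S or objdump -d is the companion exercise.

```cpp
#include <chrono>
#include <cstdio>
#include <sys/syscall.h>
#include <unistd.h>

int main() {
    constexpr long iters = 1'000'000;

    auto t0 = std::chrono::steady_clock::now();
    for (long i = 0; i < iters; ++i)
        syscall(SYS_getpid);  // forces a real kernel entry each iteration
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    std::printf("avg ns per getpid syscall: %.1f\n", ns / iters);
}
```

Exact numbers vary with kernel version, CPU, and mitigation settings; strace and perf are the usual tools for a fuller picture.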
Desirable Specializations:
* Experience with network engineering, whether deploying low-latency networking hardware or implementing networking protocols;
* Experience with Big Data engineering, including knowledge of best practices and of which implementations suit different use cases.