Data Analytics Engineer
Swisslinx is seeking an open-minded and team-oriented Data Analytics Engineer with strong Python experience to support a transformation program leveraging new data technologies.
This initial contract runs for 12+ months, with a strong likelihood of long-term extension. The role involves onboarding new data sets, migrating existing ones to an on-premises data lake, and eventually moving to the cloud.
The successful candidate will help unlock new capabilities such as real-time streaming and advanced analytics, working closely with expert business analysts, technologists, project managers, data scientists, and statisticians.
Main Responsibilities:
* Develop end-to-end data pipelines using Spark, SQL, and other technologies to extract, transform, and load data from various sources, incorporating business rules and security requirements.
* Assist business analysts in identifying, capturing, and analyzing business requirements.
* Translate functional and technical requirements into detailed designs.
* Contribute to overall data architecture and design.
* Work with stakeholders to address data-related technical issues and support their data needs.
* Ensure system architectures remain effective through frequent product deliveries, and apply governance methods that support transparency and communication.
* Stay up to date with industry standards and technological advances to improve the quality of deliverables.
* Support proofs of concept as data technologies evolve.
Requirements:
* Minimum 5 years' experience in modern data technologies.
* Experience building data ingestion pipelines for data warehouse and/or data lake/lakehouse architecture.
* Hands-on development using open-source data technologies such as Hadoop, Hive, Spark, HBase, Kafka, Impala, dbt, ELK, etc., preferably with Cloudera.
* Strong experience in data modeling, design patterns, and building highly scalable applications.
* Strong experience with Python.
* Fluency in English.
Nice to Have:
* Experience in a Banking/Financial institution, e.g., knowledge of financial instruments, trade lifecycle, risk management.
* Experience with lakehouse technologies, particularly Apache Iceberg.
* Experience with relational SQL and NoSQL databases: SQL Server, Sybase IQ, Postgres, Cassandra, etc.
* Experience with data pipeline and workflow management tools: Airflow, Rundeck, NiFi, etc.
* Experience with stream-processing systems: Kafka, Spark Streaming, etc.
* Experience with CI/CD pipelines.
* Experience with Agile methodologies such as Scrum and Kanban.
* Experience in Automated Testing, Test-driven Development, debugging, troubleshooting, and optimizing code.
Compensation & Benefits:
* A diverse, international work environment with long-term prospects.
* 50% working from home.
* 20 days working remotely from abroad per year.
* Modern office in central Basel with subsidised canteen.