Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Job Description
* A Confluent Kafka Engineer is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions using Confluent's platform.
* This involves configuring, deploying, and monitoring Kafka clusters to ensure high availability and scalability of data streaming services.
* Design and Implement: Create Kafka-based data pipelines and messaging solutions to support real-time data processing.
* Configure and Deploy: Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability.
* Monitor and Troubleshoot: Monitor Kafka clusters and troubleshoot issues to ensure uninterrupted data flow and minimize downtime.
* Collaborate: Work with development teams to integrate Kafka into applications and services.
* Security: Implement security measures to protect Kafka clusters and data streams.
* Optimize: Enhance Kafka configurations for performance and reliability.
* Automate: Use tools like Terraform or Ansible to automate Kafka operations.
* Support: Provide technical guidance on Kafka best practices.
* Documentation: Maintain detailed records of Kafka environments and processes.
* Stay Updated: Keep up with the latest Kafka features and industry best practices.
* Knowledge of IBM MQ, APIC, APIM, and ODM
Primary Skills
* Kafka
* IBM API Connect (APIC)
* IBM API Management (APIM)
Secondary Skills
* IBM MQ, ODM
Experience Level: Experienced Professionals
Contract Type: Permanent
Location: Mumbai (ex Bombay), IN
Brand: Capgemini
Professional Community: Cloud Infrastructure Management