Kafka Admin With RabbitMQ Experience In Self-Hosted Cloud Environment (Remote)

Full-time · €48,000 - €84,000 / year (estimated) · Remote

At a Glance

  • Tasks: Administer Kafka clusters and ensure high performance in cloud environments.
  • Employer: Join Resource Drive Consulting, a global leader in tech solutions.
  • Benefits: Enjoy remote work flexibility and opportunities for professional growth.
  • Why this job: Be part of cutting-edge technology and make an impact in data streaming.
  • Desired qualifications: Experience with Kafka, RabbitMQ, and cloud environments is essential.
  • Other information: Mention you found this job on Pangian.com Remote Network when applying.

The expected salary is between €48,000 and €84,000 per year.

Kafka Admin With RabbitMQ Experience In Self-Hosted Cloud Environment (Remote)

Resource Drive Consulting | Worldwide

Position Overview: Kafka Admin with RabbitMQ experience in self-hosted and cloud environments.

Job Description:

  1. Standing up and administering on-prem and cloud-native Kafka clusters.
  2. Architecting and creating a reference architecture for Kafka implementation standards.
  3. Providing expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
  4. Ensuring optimum performance, high availability, and stability of solutions.
  5. Providing expertise and hands-on experience with Kafka Connect and Schema Registry in very high-volume environments.
  6. Administering and operating the Kafka platform, including provisioning, access control lists, Kerberos, and SSL configuration.
  7. Expertise in Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, as well as tasks, workers, converters, and transforms.
  8. Hands-on experience building custom connectors using Kafka core concepts and APIs.
  9. Participating in design and capacity review meetings to provide suggestions on Kafka usage.
  10. Participating in work planning and estimation.
  11. Creating topics, setting up redundancy clusters, deploying monitoring tools and alerts, and applying best practices.
  12. Creating stubs for producers, consumers, and consumer groups to onboard applications from different languages and platforms.
  13. Using automation tools such as Docker, Jenkins, and GitLab for provisioning.
  14. Setting up security on Kafka; monitoring, preventing, and troubleshooting security-related issues.
  15. Performing data-related benchmarking, performance analysis, and tuning.
  16. Working competently in one or more environments that are highly integrated with an operating system.
  17. Managing tasks independently and taking ownership of responsibilities.
  18. Communicating highly complex technical information clearly and articulately to all levels and audiences.
  19. Experience integrating Kafka with other technology products.
  20. Understanding of cloud data streaming technologies (Kafka/ksqlDB, StreamSets, AWS Kinesis, Google Cloud Pub/Sub, etc.), stream processing, and event-driven architectures.
  21. Knowledge of techniques for injecting data into and extracting data from Kafka-based or similar pipelines.
  22. Working knowledge of the Kafka REST Proxy.
  23. Experience installing, configuring, and administering RabbitMQ in UNIX and cloud environments.
  24. Strong knowledge of message-oriented middleware concepts, including messaging styles (asynchronous, pub-sub) and messaging APIs (JMS, STOMP, AMQP, REST).
  25. Setting up exchanges, queues, and virtual hosts.
  26. Implementation experience with clustering, security, and high availability of RabbitMQ nodes.
  27. Monitoring RabbitMQ alarms.
  28. Troubleshooting and migrating RabbitMQ.
  29. Experience and knowledge of cloud messaging (AWS MQ).
  30. Expertise in deploying and maintaining messaging platforms in cloud environments.
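As an illustration of the Kafka Connect and Schema Registry responsibilities above: a connector is typically registered by POSTing a JSON configuration to the Connect REST API. The sketch below builds such a configuration for the widely used Confluent JDBC source connector with an Avro value converter backed by the Schema Registry. The connector name, connection URL, topic prefix, and service hostnames are hypothetical placeholders, not values from this posting.

```python
import json

# Hypothetical JDBC source connector configuration for Kafka Connect.
# The connector class and property names are standard for the Confluent
# JDBC connector; hostnames, database, and topic prefix are placeholders.
connector = {
    "name": "orders-jdbc-source",  # hypothetical connector name
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "tasks.max": "2",
        "connection.url": "jdbc:postgresql://db.example.com:5432/orders",
        "mode": "incrementing",            # poll new rows by an increasing id
        "incrementing.column.name": "id",
        "topic.prefix": "pg-",             # tables become topics named pg-<table>
        # Serialize values as Avro, with schemas stored in Schema Registry.
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry.example.com:8081",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)
# In practice this payload is POSTed to the Connect REST API, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://connect.example.com:8083/connectors
```

This is a minimal sketch of one connector type; the same registration pattern applies to the MQ, Elasticsearch, FileStream, and JMS connectors mentioned above, each with its own configuration properties.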

Good to have:

  1. Experience with common issues on RHEL servers; strong verbal and written communication skills.
  2. Good scripting skills (Python, etc.) and experience with Java.
  3. Strong skills in in-memory applications and data integration.
  4. Working knowledge of Ansible and automation.
  5. Knowledge of Kubernetes is a plus.
  6. Working knowledge of cluster management.
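The scripting and RabbitMQ-monitoring skills listed above can be sketched with a small queue-depth check. The function below assumes the JSON shape returned by the RabbitMQ management API's `GET /api/queues` endpoint (which includes fields such as `name`, `messages_ready`, and `consumers`); the threshold and sample payload are invented for illustration.

```python
# Minimal sketch of a RabbitMQ queue-depth check, assuming the JSON shape
# returned by the management API's GET /api/queues endpoint.

def queues_over_threshold(queues, max_ready=1000):
    """Return names of queues whose backlog of ready messages exceeds
    max_ready, or that have messages waiting but no consumers attached."""
    alerts = []
    for q in queues:
        ready = q.get("messages_ready", 0)
        consumers = q.get("consumers", 0)
        if ready > max_ready or (ready > 0 and consumers == 0):
            alerts.append(q["name"])
    return alerts

# Sample data standing in for a real management-API response.
sample = [
    {"name": "orders", "messages_ready": 12, "consumers": 3},
    {"name": "audit", "messages_ready": 5000, "consumers": 1},
    {"name": "dead-letter", "messages_ready": 7, "consumers": 0},
]

print(queues_over_threshold(sample))  # → ['audit', 'dead-letter']
```

In a real deployment the queue list would come from an authenticated HTTP request to the management API (default port 15672), and the alert list would feed a monitoring or paging system.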

When applying, state you found this job on Pangian.com Remote Network.


Employer: Resource Drive Consulting

At Resource Drive Consulting, we pride ourselves on being an exceptional employer that values innovation and collaboration in a fully remote work environment. Our team enjoys a flexible work culture that fosters professional growth through continuous learning opportunities and access to cutting-edge technologies. Join us to be part of a dynamic organization where your expertise in Kafka and RabbitMQ will not only be recognized but also rewarded, as we collectively drive impactful solutions in the cloud.

Contact:

Resource Drive Consulting HR Team

StudySmarter Application Tips 🤫

How to get the job: Kafka Admin With RabbitMQ Experience In Self-Hosted Cloud Environment (Remote)

✨Tip Number 1

Make sure to highlight your hands-on experience with Kafka and RabbitMQ in your conversations. Discuss specific projects where you set up or managed Kafka clusters, as this will demonstrate your practical knowledge and problem-solving skills.

✨Tip Number 2

Familiarize yourself with the latest trends and best practices in cloud-native Kafka implementations. Being able to discuss these topics during interviews will show that you're not only experienced but also proactive about staying updated in the field.

✨Tip Number 3

Prepare to discuss your experience with automation tools like Docker, Jenkins, and GitLab. Employers love candidates who can streamline processes and improve efficiency, so be ready to share examples of how you've used these tools in past roles.

✨Tip Number 4

Since communication is key for this role, practice explaining complex technical concepts in simple terms. This will help you convey your expertise effectively during interviews and demonstrate your ability to work with diverse teams.

These skills make you a top candidate for the role: Kafka Admin With RabbitMQ Experience In Self-Hosted Cloud Environment (Remote)

Kafka Administration
RabbitMQ Installation and Configuration
Cloud Environment Management
On-Premises Kafka Cluster Setup
Kafka Connect and Schema Registry Expertise
Performance Tuning and Benchmarking
Security Configuration (Kerberos, SSL)
Message-Oriented Middleware Concepts
Data Streaming Technologies (AWS Kinesis, Google Cloud PubSub)
Automation Tools (Docker, Jenkins, GitLab)
Custom Connector Development
Monitoring and Troubleshooting Kafka and RabbitMQ
Capacity Planning and Design Review
Scripting Skills (Python, Java)
Event-Driven Architecture Understanding
Cluster Management Knowledge
Strong Communication Skills

Tips for Your Application 🫡

Understand the Job Requirements: Carefully read through the job description to understand the specific skills and experiences required for the Kafka Admin position. Highlight your relevant experience with Kafka, RabbitMQ, and cloud environments in your application.

Tailor Your CV: Customize your CV to emphasize your experience with Kafka clusters, RabbitMQ administration, and any relevant cloud technologies. Use keywords from the job description to ensure your CV aligns with what the company is looking for.

Craft a Strong Cover Letter: Write a cover letter that showcases your expertise in Kafka and RabbitMQ, along with your ability to work independently in a remote environment. Mention specific projects or achievements that demonstrate your skills and how they relate to the job.

Mention Your Source: When applying, make sure to state that you found this job on Pangian.com Remote Network. This shows your attention to detail and helps the company track their recruitment sources.

How to Prepare for an Interview at Resource Drive Consulting

✨Showcase Your Kafka Expertise

Be prepared to discuss your hands-on experience with Kafka clusters, including administration and performance tuning. Highlight specific projects where you implemented Kafka solutions, focusing on your role in architecting and creating reference architectures.

✨Demonstrate RabbitMQ Knowledge

Since the role requires RabbitMQ experience, be ready to explain your familiarity with its installation, configuration, and administration. Discuss any clustering, security, and high availability implementations you've worked on, as well as troubleshooting techniques.

✨Communicate Complex Concepts Clearly

The ability to articulate complex technical information is crucial. Practice explaining Kafka and RabbitMQ concepts in simple terms, as you may need to communicate with non-technical stakeholders during the interview.

✨Prepare for Automation and Scripting Questions

Given the emphasis on automation tools like Docker, Jenkins, and GitLab, be ready to discuss your experience with these technologies. Additionally, brush up on your scripting skills, particularly in Python or Java, as this could come up during technical discussions.

Kafka Admin With RabbitMQ Experience In Self-Hosted Cloud Environment (Remote)
Resource Drive Consulting

    Full-time
    €48,000 - €84,000 / year (estimated)

    Application deadline: 2026-12-10

    Company size: 50 - 100 employees