Connecting Kafka Tool to Confluent Cloud
Kafka Tool is a great way to see the data being written to a Kafka cluster. I use it regularly to verify topics, data, and configuration within a cluster. Confluent Cloud is a managed Kafka service that can run on Azure, AWS, and Google Cloud. Their platform provides a lot of useful features, like a UI for creating topics, monitoring data, and tracking consumer lag.
You can get started with Confluent Cloud for FREE! They give you a $200 credit for your first week, and you can also apply other promo codes.
Setting up a User API Key in Confluent Cloud
First, you're going to need to set up an API key to access the Confluent Cloud cluster.
Click API Access on the left navigation and then click “Create key”.
You're going to want to keep the default selection, "Global access", which grants your user full access to the cluster.
On the final screen, you can get your API key and secret. You will need these to set up Kafka Tool. Save them now, because once you click "Save" the secret will no longer be viewable and you will have to repeat the process. I also put my name in the description; it makes life easier.
Configuring Kafka Tool for Confluent Cloud
Before getting started with Kafka Tool, make sure it is up to date. You will need the latest version because Confluent Cloud keeps its clusters close to the latest releases of Apache Kafka. As I'm writing this post, they are on version 2.6, and Kafka Tool supports 0.11 and up. I'm using Kafka Tool 2.0.9 as of 1/21/2021.
Step 1: Create a New Connection
Click “File” -> “Add New Connection” or right-click “Clusters” and select “Add New Connection”. Your values should look similar to this.
Note: I expect the ZooKeeper settings will stop mattering in the near future, since direct ZooKeeper access is being deprecated. The Ping button does not work here.
KIP-555: Deprecate Direct Zookeeper access in Kafka Administrative Tools
Step 2: Security
Click the “Security” tab and set the Type to “SASL SSL”. You will not need to set any values here.
Step 3: Advanced
Next, click the "Advanced" tab, set the bootstrap servers, set the SASL Mechanism to PLAIN, and check the Offset Topic option so offsets are read on a background thread.
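For reference, the Security and Advanced tab settings above correspond to the standard Kafka client properties below. The bootstrap server shown is a placeholder; use the endpoint from your own Confluent Cloud cluster's settings page.

```properties
# Placeholder bootstrap server -- substitute your cluster's endpoint
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
```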
Step 4: JAAS Config
OK, so this part was a bit confusing for me before it finally worked. In the JAAS config, use the API key you saved earlier from Confluent Cloud as the username and the API secret as the password.
org.apache.kafka.common.security.plain.PlainLoginModule required username="USERNAME" password="PASSWORD";
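If you ever want to generate that line rather than hand-edit it, a tiny helper can fill in the key and secret. This is just a sketch; `build_jaas_config` is a hypothetical helper name, not part of any Kafka client library.

```python
# Minimal sketch: assemble the SASL/PLAIN JAAS line Kafka Tool expects.
# build_jaas_config is a hypothetical helper, not a Kafka API.

def build_jaas_config(api_key: str, api_secret: str) -> str:
    """Return the JAAS config string for SASL/PLAIN with the given credentials."""
    # Escape embedded double quotes so the line stays parseable.
    key = api_key.replace('"', '\\"')
    secret = api_secret.replace('"', '\\"')
    return (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="{key}" password="{secret}";'
    )

print(build_jaas_config("MYKEY", "MYSECRET"))
```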
Step 5: Verify it Works!
Using Confluent Cloud's UI, I was able to easily create a topic and produce sample data to it. This made verification easy.
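If Kafka Tool still won't connect, a quick sanity check is confirming that the bootstrap endpoint even accepts a TLS handshake before blaming the SASL settings. Here's a rough sketch using only the Python standard library; `parse_bootstrap` and `tls_reachable` are hypothetical helper names, and the host shown is a placeholder for your cluster's bootstrap server.

```python
# Sketch: check that the Confluent Cloud bootstrap endpoint completes a
# TLS handshake. The hostname below is a placeholder -- substitute the
# bootstrap server from your own cluster.
import socket
import ssl

def parse_bootstrap(server: str) -> tuple[str, int]:
    """Split a 'host:port' bootstrap string into its parts."""
    host, _, port = server.rpartition(":")
    return host, int(port)

def tls_reachable(server: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint accepts a TCP connection and TLS handshake."""
    host, port = parse_bootstrap(server)
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (OSError, ssl.SSLError):
        return False

# Example (requires network access to your cluster):
# tls_reachable("pkc-xxxxx.us-east-1.aws.confluent.cloud:9092")
```

If this returns False, the problem is networking (firewall, proxy, wrong endpoint) rather than your Kafka Tool security settings.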
It took me a lot of time and several tries to figure this out! I hope this helps someone. If it helped you, be sure to comment below! Thanks!