Getting Started with Landoop’s Kafka on Docker for Windows
Here’s a quick guide to running Kafka on Windows with Docker. I’ll show you how to pull Landoop’s Kafka image from Docker Hub, run it, and get started with Kafka. We’ll also build a C# .NET Core console app for this demonstration. I learned most of this from the Kafka Udemy videos mentioned below and from a Kafka C# demonstration someone showed me; I’ve simplified everything I was taught. This article is for people who are brand new to Kafka and want to work with it from .NET.
Source Code: Kafka C# Sample
Requirements
Make sure you have installed, and are familiar with, Docker for Windows, ConEmu, Visual Studio, and C#.
Docker for Windows uses the localhost address 127.0.0.1, whereas Docker Toolbox / Docker Machine uses 192.168.99.100. Landoop’s fast-data-dev GitHub readme currently documents the docker-machine steps.
Getting Set Up with Docker
After you’ve installed Docker, you will need to pull down Landoop’s Kafka (fast-data-dev) image from Docker Hub. The command below pulls the image.
```shell
docker pull landoop/fast-data-dev
```
Running Landoop’s Kafka Container in Docker
We will start this container in a detached state (-d) and then follow the logs so we can see what’s happening.
```shell
docker run -d --name landoopkafka -p 2181:2181 -p 3030:3030 -p 8081:8081 -p 8082:8082 -p 8083:8083 -p 9092:9092 -e ADV_HOST=127.0.0.1 landoop/fast-data-dev
```
To see the container’s logs and have them stream to the screen, pass the -f (follow) flag. This output is important for troubleshooting; memory problems are a common error you may see here. The URL for the Kafka UI is also printed in the logs.
```shell
docker logs -f landoopkafka
```
Landoop’s Kafka UI

Once the Docker image for fast-data-dev is running, you will be able to access Landoop’s Kafka UI. It gives developers the ability to see in real time what Kafka is doing and how it creates and manages topics; both configuration and topic data are visible in the UI.
Landoop’s Kafka UI: http://127.0.0.1:3030/
Working with Kafka via Command Line
It will benefit you to learn how Kafka works from the command line. You can create topics, publish, and subscribe, all from a bash terminal. This is a great starting place because you can see exactly how Kafka works under the hood.
Accessing Docker Container with Bash
Executing the command below connects you to the container with bash, which lets you interact with Kafka from the command line.
```shell
docker exec -it landoopkafka bash
```
Kafka CLI Producer
We’re going to create our first topic, “first_topic”, and push a single string of data to it.
If you receive an error message similar to “Error: {new_topic={LEADER_NOT_AVAILABLE}”, that’s normal; it takes a moment for the topic to be created, and the client will retry.
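If you’d rather avoid that warning, you can create the topic explicitly before producing. This is a minimal sketch, assuming the Kafka CLI tools bundled in the fast-data-dev container (older CLI versions take --zookeeper, as here; newer ones use --bootstrap-server instead):

```shell
# Run inside the container (docker exec -it landoopkafka bash).
kafka-topics --zookeeper 127.0.0.1:2181 --create --topic first_topic --partitions 1 --replication-factor 1
```

With the topic created up front, the console producer publishes without the LEADER_NOT_AVAILABLE retry.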
```shell
kafka-console-producer --topic first_topic --broker-list 127.0.0.1:9092
hello from producer
```

The second line is typed into the producer prompt; each line you enter is published as a message.
Kafka CLI Consumer
The consumer will immediately begin listening for messages published by the producer.
```shell
kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic first_topic
However, if you want to pull all of the data from the beginning you can run this command:
```shell
kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic first_topic --from-beginning
```
Confluent Kafka with DotNet Core
The source code for this application is maintained on GitHub. The demo is simple: we use Kafka as a pub/sub channel. We’ll serialize a model class, Msg, publish it to Kafka, and then read that data back with a C# consumer.
Source Code: Kafka C# Sample
Confluent Kafka: https://github.com/confluentinc/confluent-kafka-dotnet
Message Data Model
This data model lives in its own project (Kafka.Data) and is shared between the Producer and the Consumer.
```csharp
using System;

namespace Kafka.Data.Models
{
    public class Msg
    {
        public string User { get; set; }
        public string Message { get; set; }
        public DateTime? TimeStamp { get; set; }
    }
}
```
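To make the wire format concrete, here’s what a serialized Msg looks like. The sample repo serializes with Newtonsoft.Json’s JsonConvert; this sketch uses the built-in System.Text.Json only so it runs without extra packages — the payload shape is the same.

```csharp
using System;
using System.Text.Json;

// Same shape as the shared Msg model above.
public class Msg
{
    public string User { get; set; }
    public string Message { get; set; }
    public DateTime? TimeStamp { get; set; }
}

public static class MsgDemo
{
    public static void Main()
    {
        var msg = new Msg { User = "alice", Message = "hello", TimeStamp = null };
        // This JSON string is what the producer publishes as the Kafka message value.
        Console.WriteLine(JsonSerializer.Serialize(msg));
        // → {"User":"alice","Message":"hello","TimeStamp":null}
    }
}
```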
Kafka C# Producer Source Code
This is a simple .NET Core console application that captures a username and message, then publishes that data to Kafka.
```csharp
using System;
using System.Collections.Generic;
using System.Text;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;
using Kafka.Data.Models;
using Newtonsoft.Json;

namespace Kafka.Producer
{
    class Program
    {
        private static readonly Dictionary<string, object> Config = new Dictionary<string, object>
        {
            { "bootstrap.servers", "127.0.0.1:9092" },
            { "acks", "all" },
            { "batch.num.messages", 1 },
            { "linger.ms", 0 },
            { "compression.codec", "gzip" }
        };

        static void Main(string[] args)
        {
            try
            {
                // banner
                Console.WriteLine("Kafka Producer Sample.");
                Console.WriteLine("Press CTRL+C to exit");

                var cancelled = false;
                Console.CancelKeyPress += (_, e) =>
                {
                    e.Cancel = true; // prevent the process from terminating
                    cancelled = true;
                };

                while (!cancelled)
                {
                    // get name
                    Console.WriteLine("\nWhat is your name?");
                    var name = Console.ReadLine();

                    // get message
                    Console.WriteLine("\nWhat is your message?");
                    var message = Console.ReadLine();

                    // set time stamp
                    var msgTimeStamp = DateTime.Now;

                    var msg = new Msg
                    {
                        User = name,
                        Message = message,
                        TimeStamp = msgTimeStamp
                    };

                    // send to Kafka
                    using (var producer = new Producer<Null, string>(Config, null, new StringSerializer(Encoding.UTF8)))
                    {
                        // convert the Msg model to json
                        var jsonMsg = JsonConvert.SerializeObject(msg);

                        // push to Kafka
                        var dr = producer.ProduceAsync("topic_messages", null, jsonMsg).Result;
                        producer.Flush(1);
                    }

                    Console.WriteLine("Message published to Kafka.");
                    Console.WriteLine("\n");
                }
            }
            catch (System.Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
    }
}
```
Kafka C# Consumer
This is a sample of how to read the data from the Kafka stream.
```csharp
using System;
using System.Collections.Generic;
using System.Text;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

namespace Kafka.Consumer
{
    class Program
    {
        private static readonly Dictionary<string, object> Config = new Dictionary<string, object>
        {
            { "group.id", "msg-id" },
            { "bootstrap.servers", "127.0.0.1:9092" },
            { "auto.offset.reset", "earliest" }
        };

        static void Main(string[] args)
        {
            try
            {
                using (var consumer = new Consumer<string, string>(
                    Config,
                    new StringDeserializer(Encoding.UTF8),
                    new StringDeserializer(Encoding.UTF8)))
                {
                    // fires for each message read from the topic
                    consumer.OnMessage += (_, msg) =>
                    {
                        Console.WriteLine($"Read '{msg.Value}' from: {msg.TopicPartitionOffset}");
                    };

                    consumer.OnError += (_, error) =>
                        Console.WriteLine($"Error: {error}");

                    consumer.OnConsumeError += (_, msg) =>
                        Console.WriteLine($"Consume error ({msg.TopicPartitionOffset}): {msg.Error}");

                    consumer.Subscribe("topic_messages");

                    // poll the broker for new messages
                    while (true)
                    {
                        consumer.Poll(TimeSpan.FromMilliseconds(100));
                    }
                }
            }
            catch (System.Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
    }
}
```
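The consumer above prints the raw JSON string. If you want the typed model back, deserialize the message value into Msg. A minimal sketch, again using the built-in System.Text.Json in place of the repo’s Newtonsoft.Json purely so it runs standalone:

```csharp
using System;
using System.Text.Json;

// Same shape as the shared Kafka.Data.Models.Msg.
public class Msg
{
    public string User { get; set; }
    public string Message { get; set; }
    public DateTime? TimeStamp { get; set; }
}

public static class ConsumeDemo
{
    public static void Main()
    {
        // Stand-in for msg.Value as received in the OnMessage handler.
        var payload = "{\"User\":\"alice\",\"Message\":\"hello\",\"TimeStamp\":null}";
        var msg = JsonSerializer.Deserialize<Msg>(payload);
        Console.WriteLine($"{msg.User} said: {msg.Message}");
        // → alice said: hello
    }
}
```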
Troubleshooting
Some people have trouble getting this up and running because they aren’t familiar with the surrounding technologies, like Docker.
winpty – if you are running Windows and run into a TTY issue, prepend the command with winpty so it reads “winpty docker run -d…”.
Closing
We recently started using Kafka at my current job, and I thought it would be beneficial to share a quick tutorial on using C#, Confluent Kafka, and Landoop’s fast-data-dev. I couldn’t find a quick all-in-one tutorial covering the basics, so I created one.
Confluent Kafka Git Repo
Udemy Videos
If you’d like to learn more about Kafka, I highly recommend these videos by Stephane Maarek on Udemy.com:
https://www.udemy.com/apache-kafka/
https://www.udemy.com/kafka-streams/
https://www.udemy.com/kafka-cluster-setup/
https://www.udemy.com/kafka-connect/