@@ -8,6 +8,8 @@ It has numerous use cases, including distributed logging, stream processing and
1. [Data streaming with Apache Kafka](https://developer.confluent.io/)
2. [Kafka 101](https://developer.confluent.io/courses/apache-kafka/events/)
3. [How Kafka works](https://www.confluent.io/blog/apache-kafka-intro-how-kafka-works/) (Great!)
+ 4. [confluent-kafka-dotnet examples](https://github.com/confluentinc/confluent-kafka-dotnet/tree/master/examples)
+ 5. [Apache Kafka for .NET developers](https://developer.confluent.io/courses/apache-kafka-for-dotnet/overview/) (Great!)

## Terminology
### Event
@@ -17,7 +19,7 @@ Kafka encourages you to see the world as sequences of events, which it models as

Events are immutable, as it is (sometimes tragically) impossible to change the past.

- ### Topic (category of messages)
+ ### Topic (think of it as a category of messages, a table, a log, etc.)
Because the world is filled with so many events, Kafka gives us a means to organize them and keep them in order: topics.
A topic is an ordered log of events.

@@ -105,6 +107,39 @@ Ashishs-MacBook-Pro:dotnet-kafka ashishkhanal$ dotnet new sln
### Add a console app as Consumer
<img width="450" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/25394bc5-984c-4eaf-a071-3cb20a8fec59">

+ #### Set up appsettings.json
+ [Reference](https://learn.microsoft.com/en-us/dotnet/core/extensions/configuration#alternative-hosting-approach)
+
+ Install the `Microsoft.Extensions.Hosting` package.
+
+ Add `appsettings.json` and set these options:
+ - Build action: Content
+ - Copy to output directory: Copy if newer
+
+ And use it:
+ https://github.com/akhanalcs/dotnet-kafka/blob/7313b07edd0d5e2a947b813aae2598ab596f298b/Consumer/Program.cs#L4-L10
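The linked `Program.cs` isn't reproduced here, but a minimal sketch of the hosting approach from the referenced docs might look like the following (the `KafkaProducer` section name is an assumption, chosen to match the config used later in this guide):

```cs
// Sketch only: assumes the Microsoft.Extensions.Hosting package is installed.
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// appsettings.json, user secrets and environment variables are loaded
// by default, so a section can be read straight from Configuration:
var kafkaSection = builder.Configuration.GetSection("KafkaProducer");

using IHost host = builder.Build();
```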
+
+ #### Use user-secrets to store the API key and secret
+ [Reference](https://learn.microsoft.com/en-us/aspnet/core/security/app-secrets?view=aspnetcore-8.0&tabs=windows#enable-secret-storage)
+
+ ```bash
+ dotnet user-secrets init
+ ```
+
+ This command adds a `UserSecretsId` element, populated with a GUID, to the project file.
+
+ If you want the Producer to also access the secrets pointed to by this Id, copy this element into the Producer's project file as well.
+
+ Now you can store API keys and secrets there without checking them into source control.
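For example, the cluster credentials used later in this guide could be stored like this (the `KafkaProducer:*` key names are assumptions; use whatever section names your code reads):

```bash
# Run from the project directory that contains the UserSecretsId.
dotnet user-secrets set "KafkaProducer:SaslUsername" "<your-api-key>"
dotnet user-secrets set "KafkaProducer:SaslPassword" "<your-api-secret>"

# List what's stored. Values live under your user profile,
# keyed by UserSecretsId, not in the repo.
dotnet user-secrets list
```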
+
+ Right-click the project -> Tools -> [.NET User Secrets](https://plugins.jetbrains.com/plugin/10183--net-core-user-secrets)
+
+ <img width="350" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/507002f2-4517-45f6-b285-8b87a30e981f">
+
+ Put your secrets here:
+
+ <img width="350" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/9168a6f6-5fc2-4a64-82ac-6db5ff2f3262">
+
### Install dependencies
Manage NuGet Packages

@@ -114,6 +149,11 @@ Install it in both projects

<img width="850" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/b85f2175-ac38-4a30-9588-190d444171ff">

+ ## Install Java
+ Go to [Java Downloads](https://www.oracle.com/java/technologies/downloads/) and install the latest JDK (JDK 21 as of Feb 2024).
+
+ [Not required, so not doing it now]
+
## Local Kafka cluster setup
### Install confluent cli
```bash
@@ -186,7 +226,11 @@ Reload the bash profile file using the source command:
source ~/.bash_profile
```

- I gave up on local setup at this time as I could not make it work even after scouring the interent. So now on to Confluent cloud.
+ It works at this point. For more details, check out my [StackOverflow question](https://stackoverflow.com/q/77985757/8644294).
+
+ <img width="950" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/9a9b15c5-4a0f-4d5c-8249-3883810c1f7c">
+
+ I gave Confluent Cloud a try instead of a local cluster at this time.
## Run Kafka in Confluent Cloud
### Signup
[Go to Signup page](https://www.confluent.io/confluent-cloud/tryfree/)
@@ -207,25 +251,204 @@ Now go to Confluent cloud by clicking Launch

<img width="300" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/672d9b73-cd46-4a59-9f23-a5c613362b00">

- ### Create cluster
+ ### Create environment
Go to Home

- <img width="450" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/23081af9-9797-4b86-8e59-6ebe5d686162">
+ Environments -> Add cloud environment

- Add Cluster -> Create cluster -> Choose "Basic" cluster type
+ <img width="350" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/a569d5b1-58e8-411a-8984-b281a69bcb6f">

- <img width="600" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/8b188706-57f3-410e-8af9-de32b2aba91b">
+ -> Create
+
+ Stream Governance Packages -> Essentials -> Begin configuration
+
+ #### Select which cloud and region you want to create your Schema Registry and Stream Catalog in (i.e. where you will be storing the metadata)
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/2c57f793-801d-4f67-a07c-07a233ec53a0">
+
+ -> Enable
+
+ #### View it using the CLI
+ ```bash
+ confluent login
+ ```
+
+ <img width="200" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/653fcab1-692d-4c27-b8ae-42ae9561d94f">
+
+ The CLI shows the successful login
+
+ <img width="1000" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/6646fbe7-56c9-4e2c-a913-00c0e1e861b3">
+
+ ```bash
+ confluent environment list
+ ```
+
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/8efab508-ae07-44d3-9d59-bba678c8630f">
+
+ Set the new environment I just created as the active environment:
+ ```bash
+ confluent environment use env-19vow5
+ ```
+
+ Now the `*` has changed
+
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/3f73b8fa-4c31-40dd-90aa-5260aec5400c">
+
+ ### Create cluster
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/f1070fff-b389-409f-baf9-27078918b2b2">
+
+ -> Create cluster on my own
+
+ Create cluster -> Basic

- Click "Launch cluster"
+ <img width="200" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/de308359-313f-4346-a51b-0caf12e2470e">

+ -> Begin configuration
+
+ <img width="600" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/8b188706-57f3-410e-8af9-de32b2aba91b">
+ <br>
<img width="350" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/ea734e4b-1f4a-4bea-992c-2d03d8d7d021">

- ### Grab Bootstrap server address
- Home -> Environments -> default -> cluster_0 -> Cluster Settings -> Endpoints
+ -> Launch cluster
+
+ ### Billing and payment
+ Payment details
+
+ Add coupon code: `DOTNETKAFKA101`
+
+ <img width="750" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/b47bd8f5-af4e-43a1-9a24-58a48f6fdc90">
+
+ ### Add an API key for the Kafka cluster (cluster level)
+ We will need an API key to allow applications to access our cluster.
+
+ <img width="600" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/55e85853-d27f-4e01-b5c2-c9c76995f0bc">
+
+ -> Create key
+
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/f94892bd-6027-49d7-aa6e-53702ced8a40">
+
+ Global access -> Next
+
+ Download and save the key somewhere for future use.
+
+ ### Add an API key for Schema Registry (environment level)
+ 1. From the main menu (top right) or the breadcrumb navigation (top), select **Environments**.
+ 2. Select the **kafka-with-dotnet** environment.
+ 3. In the right-hand menu there should be an option to **Add key**. Select it and create a new API key.
+
+ <img width="250" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/63252f75-54ab-4166-9902-f3635e86ba8f">
+ <br>
+ <img width="250" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/f4e92617-719a-42d2-a464-99a603094ae7">
+
+ ## Kafka Messages
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/2c56d72f-9788-437f-9715-2de12d2b3731">
+
+ ### Event
+ A domain event signals something that has happened in the outside world that is of interest to the application.
+
+ Events are something that happened in the past, so they are immutable.
+
+ Use past tense when naming events.
+
+ For example: UserCreated, UserAddressChanged, etc.
+
+ ### Kafka Message Example
+ ```cs
+ var message = new Message<string, Biometrics>
+ {
+     Key = metrics.DeviceId,
+     Value = metrics
+ };
+ ```
+ If you care about message ordering, provide a key; otherwise it's optional.
+
+ In the above example, because we're using `DeviceId` as the key, all messages from that specific device are handled in order.
+
+ Value can be a primitive type such as a string, or an object that can be serialized into a format such as JSON, Avro or Protobuf.
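For instance, JSON serialization backed by Schema Registry can be wired into a producer roughly like this (a sketch assuming the Confluent.Kafka, Confluent.SchemaRegistry and Confluent.SchemaRegistry.Serdes.Json NuGet packages, the `Biometrics` type from above, and configs bound elsewhere, e.g. from `appsettings.json`):

```cs
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Sketch only: schemaRegistryConfig and producerConfig are assumed
// to be bound from configuration, as shown later in this guide.
var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);

using var producer = new ProducerBuilder<string, Biometrics>(producerConfig)
    // Keys stay plain strings; values are serialized to JSON,
    // with the schema registered/validated in Schema Registry.
    .SetValueSerializer(new JsonSerializer<Biometrics>(schemaRegistry))
    .Build();
```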
+
+ ## Producing messages to a topic
+ <img width="550" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/1d0706c6-5b6e-465f-ba32-2aef3628dd8d">
+
+ You can consider the messages being produced by your system to be just another type of API.
+
+ Some APIs will be consumed through HTTP, while others might be consumed through Kafka.
+
+ ### Producer Config
+ ```json
+ "KafkaProducer": {
+   // One or more Kafka brokers, each specified by a host and, if necessary, a port.
+   // It will be used to establish the initial connection to the Kafka cluster.
+   // Once connected, additional brokers may become available.
+   "BootstrapServers": "pkc-4rn2p.canadacentral.azure.confluent.cloud:9092",
+   "SecurityProtocol": "SaslSsl",
+   "SaslMechanism": "PLAIN",
+   "SaslUsername": "comes from user-secrets' secrets.json",
+   "SaslPassword": "comes from user-secrets' secrets.json",
+   // Used to identify the producer. In other words, to give it a name.
+   // Although it's not strictly required, providing a ClientId will make debugging a lot easier.
+   "ClientId": "my-dotnet-kafka"
+ }
+ ```
392+
393+ Grab the config
394+ ``` cs
395+ var producerConfig = builder .Configuration .GetSection (" KafkaProducer" ).Get <ProducerConfig >();
396+ ```
397+
398+ Create the Producer
399+ ``` cs
400+ using var producer = new ProducerBuilder <string , Biometrics >(producerConfig ).Build ();
401+ ```
402+
403+ Send the message
404+ ``` cs
405+ var result = await producer .ProduceAsync (BiometricsImportedTopicName , message );
406+ ```
407+
408+ The messages aren't necessarily sent immediately.
409+ They may be buffered in memory so that multiple messages can be sent as a batch.
410+ Once we're sure we want the messages to be sent, it's a good idea to call the Flush method.
411+
412+ ``` cs
413+ // Synchronous method, so it will wait for acknowledgement from broker before continuing
414+ // It's often better to produce multiple messages into a batch prior to calling Flush.
415+ producer .Flush ();
416+ ```
417+
418+ ## Create Topic
419+ [ Reference] ( https://developer.confluent.io/courses/apache-kafka-for-dotnet/producing-messages-hands-on/#create-a-new-topic )
420+
421+ A topic is an immutable, append-only log of events. Usually, a topic is comprised of the same kind of events.
422+
423+ <img width =" 450 " alt =" image " src =" https://github.com/akhanalcs/dotnet-kafka/assets/30603497/8338369a-158d-4932-8d82-3b301a85a628 " >
424+
425+ Create a new topic, ` RawBiometricsImported ` , which you will use to produce and consume events.
426+
427+ <img width =" 450 " alt =" image " src =" https://github.com/akhanalcs/dotnet-kafka/assets/30603497/324d25d8-a71f-4485-86b0-5470d45370d6 " >
428+
429+ -> Create with defaults
430+
431+ When asked to ** Define a data contract** select ** Skip** .
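If you prefer the CLI over the UI, the same topic can likely be created with the confluent CLI (assuming you're logged in and have selected the right environment and cluster):

```bash
# Create the topic with default settings.
confluent kafka topic create RawBiometricsImported

# Verify it exists.
confluent kafka topic list
```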
+
+ ## Populate the config file using the API Key file you downloaded earlier
+ - `Kafka.BootstrapServers` is the Bootstrap server in the file.
+ - `Kafka.SaslUsername` is the key value in the file.
+ - `Kafka.SaslPassword` is the secret value in the file.
+ - `SchemaRegistry.URL` is the Stream Governance API endpoint URL.<br>
+ <img width="200" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/6a7404d5-b3ec-47b7-9a03-883b53c4ea01">
+ - `SchemaRegistry.BasicAuthUserInfo` is `<key>:<secret>` from the API Key file you downloaded for the Schema Registry.
+
+ Store usernames and passwords inside `secrets.json` and other details in `appsettings.json`.
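For example, the split between the two files might look like this (both files are shown in one fragment for illustration; the section names are assumptions and should match whatever your code binds):

```json
// appsettings.json (checked in)
"SchemaRegistry": {
  "URL": "<Stream Governance API endpoint url>"
}

// secrets.json via user-secrets (not checked in)
"KafkaProducer": {
  "SaslUsername": "<cluster API key>",
  "SaslPassword": "<cluster API secret>"
},
"SchemaRegistry": {
  "BasicAuthUserInfo": "<schema registry key>:<schema registry secret>"
}
```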
+
+ ## Produce messages
+ Go to the web API you created earlier.
+
+ It will work as a simple REST endpoint that accepts data from a fitness tracker in the form of strings and pushes it to Kafka with no intermediate processing.
+
+ In the long run, this may be dangerous because it could allow a malfunctioning device to push invalid data into our stream. We probably want to perform a minimal amount of validation prior to pushing the data. We'll do that later.
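A minimal sketch of what such a pass-through endpoint could look like (the route, parameter and topic names here are illustrative, not the repo's exact code):

```cs
// Sketch only: assumes a minimal API app with an IProducer<string, Biometrics>
// registered in DI and the RawBiometricsImported topic created earlier.
app.MapPost("/biometrics", async (Biometrics metrics, IProducer<string, Biometrics> producer) =>
{
    var message = new Message<string, Biometrics>
    {
        Key = metrics.DeviceId, // keeps each device's events in order
        Value = metrics
    };

    // No validation yet: whatever the device sends goes straight to Kafka.
    await producer.ProduceAsync("RawBiometricsImported", message);
    return Results.Accepted();
});
```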

- <img width="750" alt="image" src="https://github.com/akhanalcs/dotnet-kafka/assets/30603497/f520ce07-ac02-46fe-818b-b682d456a10a">

- ## Configuration
- Home -> Environments -> default -> cluster_0 -> API Keys -> Create key

+ - Truth: that which comports to reality.
+ - It may be true that there's a diamond shaped exactly like my head on Mars, but there's no way for us to know that, so we can't really say "oh, it's true that there's a diamond shaped like my head on Mars."
