Apache Kafka Adapter Samples

Introduction

This topic describes how to run the sample applications for the Apache Kafka Adapter Suite, and illustrates how to use the Kafka adapters when connecting a StreamBase application to a Kafka message broker.

The first sample, kafka.sbapp, demonstrates the complete process of connecting a consumer and a producer to a Kafka broker and sending messages. The second sample, kafkaCustomSerialize.sbapp, is similar, but uses a custom serializer to serialize and deserialize tuple messages to and from the broker as JSON strings.
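
The StreamBase adapters handle tuple conversion internally, but the serializer idea itself can be illustrated with the plain Kafka client. The following sketch is illustrative only and is not the sample's code: the class names are hypothetical, and it assumes a simple two-field payload (mirroring the Field1 and Field2 message subfields used in the second sample) converted to and from a JSON string with Jackson.

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

import java.nio.charset.StandardCharsets;

// Illustrative JSON (de)serialization in the style the second sample describes.
// These are not the sample's classes; the nested names are hypothetical.
public class JsonSerdeSketch {

    // Hypothetical payload mirroring the Field1/Field2 message subfields.
    public static class Message {
        public String field1;
        public int field2;
    }

    // Converts a Message to a JSON string (as UTF-8 bytes) on its way to the broker.
    public static class JsonMessageSerializer implements Serializer<Message> {
        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public byte[] serialize(String topic, Message data) {
            try {
                return data == null ? null
                        : mapper.writeValueAsString(data).getBytes(StandardCharsets.UTF_8);
            } catch (Exception e) {
                throw new SerializationException("Failed to serialize message", e);
            }
        }
    }

    // Rebuilds a Message from the JSON string on its way back from the broker.
    public static class JsonMessageDeserializer implements Deserializer<Message> {
        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public Message deserialize(String topic, byte[] data) {
            try {
                return data == null ? null
                        : mapper.readValue(new String(data, StandardCharsets.UTF_8), Message.class);
            } catch (Exception e) {
                throw new SerializationException("Failed to deserialize message", e);
            }
        }
    }
}

With this sketch, a message whose field1 is "abc" and field2 is 7 would travel over the wire as the JSON string {"field1":"abc","field2":7}.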

Both samples use a built-in DemoBroker adapter, which creates an Apache ZooKeeper™ service and three Kafka brokers for the consumer and producer adapters to connect to. The demo broker produces a single output status message when it is running. Each sample uses that status message to signal the consumer and producer adapters to connect, and to create and subscribe to a set of predefined topics (different for each sample).
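
For comparison with what the adapters do internally, the same connect, publish, and subscribe flow can be expressed directly against the Kafka Java client. The sketch below is illustrative only: the bootstrap addresses, group ID, and topic names are assumptions and may not match what the demo broker actually uses.

import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.*;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class ConnectAndSubscribe {
    public static void main(String[] args) {
        // Illustrative bootstrap list; the demo broker's actual ports may differ.
        String brokers = "localhost:9092,localhost:9093,localhost:9094";

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "sample-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {

            // Subscribe to the predefined topics, then publish one message.
            consumer.subscribe(Arrays.asList("topic1", "topic2", "topic3"));
            producer.send(new ProducerRecord<>("topic1", "key1", "hello from the sample"));
            producer.flush();

            // Poll once and print whatever arrived.
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(5))) {
                System.out.printf("%s [%d] %s=%s%n",
                        record.topic(), record.partition(), record.key(), record.value());
            }
        }
    }
}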

The Apache Kafka adapter suite is implemented against the version of the Kafka libraries listed on the Supported Configurations page.

Running the Kafka Sample in StreamBase Studio

Run this sample in Studio as follows:

  1. If you have not yet loaded the Apache Kafka sample into Studio, follow the steps in Importing This Sample.

  2. In the Package Explorer view, open the sample_adapter_embedded_kafka folder.

  3. Double-click to open the kafka.sbapp application.

  4. Make sure the application is the currently active tab in the EventFlow Editor, then click the Run button. This opens the SB Test/Debug perspective and starts the application.

  5. In the Manual Input view, select the PublishIn input stream.

  6. Enter topic1, topic2, or topic3 in the topic field and any message text in the message field. Optionally, enter a value in the key field, or leave the default null value. For the partition field, leave the default null value or enter 0, because the demo broker creates only a single partition (null defaults to 0).

  7. Click Send Data.

  8. Observe your message emitted on both the PublishData and KafkaMessage output streams.

  9. In the Manual Input view, select the AdminCommand input stream.

  10. Enter topics into the command field.

  11. Click Send Data. This sends a tuple to a Kafka Admin adapter, which asks the broker to list the available topics and their associated properties. (A plain Kafka AdminClient equivalent of this request is sketched after this list.)

  12. Observe the output on the TopicConfigs output stream.

  13. Experiment with sending messages to this sample's other input streams.

  14. When done, press F9 or click the Stop Running Application button.
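
For reference, the broker-side request behind the topics admin command in step 11 can be approximated with the plain Kafka AdminClient, as in the following sketch. This is not the Kafka Admin adapter's implementation, and the bootstrap address is illustrative.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.Config;
import org.apache.kafka.common.config.ConfigResource;

import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.stream.Collectors;

public class ListTopicsAndConfigs {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // illustrative address

        try (AdminClient admin = AdminClient.create(props)) {
            // List the topic names known to the broker.
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Topics: " + topics);

            // Ask for each topic's configuration, roughly the information the
            // sample reports on its TopicConfigs output stream.
            Set<ConfigResource> resources = topics.stream()
                    .map(t -> new ConfigResource(ConfigResource.Type.TOPIC, t))
                    .collect(Collectors.toSet());
            Map<ConfigResource, Config> configs = admin.describeConfigs(resources).all().get();
            configs.forEach((resource, config) ->
                    System.out.println(resource.name() + " -> " + config.entries()));
        }
    }
}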

Running the Kafka Custom Serialize Sample in StreamBase Studio

Run this sample in Studio as follows:

  1. If you have not yet loaded the Apache Kafka sample into Studio, follow the steps in Importing This Sample.

  2. In the Package Explorer view, open the sample_adapter_embedded_kafka folder.

  3. Double-click to open the kafkaCustomSerialize.sbapp application.

  4. Make sure the application is the currently active tab in the EventFlow Editor, then click the Run button. This opens the SB Test/Debug perspective and starts the application.

  5. In the Manual Input view, select the PublishIn input stream.

  6. Enter topic1 in the topic field and any test values in the Field1 and Field2 subfields of the message field. Optionally, enter a value in the key field, or leave the default null value. For the partition field, leave the default null value or enter 0, because the demo broker creates only a single partition (null defaults to 0). (A sketch of how a two-field message like this might be published through a custom JSON serializer follows this list.)

  7. Click Send Data.

  8. Observe your message emitted on the KafkaMessage output stream.

  9. In the Manual Input view, select the AdminCommand input stream.

  10. Enter topics into the command field.

  11. Click Send Data. This sends a tuple to a Kafka Admin adapter, which asks the broker to list the available topics and their associated properties.

  12. Observe the output on the TopicConfigs output stream.

  13. Experiment with sending messages to this sample's other input streams.

  14. When done, press F9 or click the Stop Running Application button.
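
As a rough illustration of how a two-field message such as the one in step 6 could be published through a custom JSON serializer outside of StreamBase, the following sketch reuses the hypothetical JsonSerdeSketch classes from the introduction (assumed to be compiled in the same package and on the classpath). The bootstrap address is also illustrative, and this is not the sample's implementation.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class PublishJsonMessage {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Wire in the illustrative JSON serializer sketched in the introduction.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                JsonSerdeSketch.JsonMessageSerializer.class.getName());

        JsonSerdeSketch.Message payload = new JsonSerdeSketch.Message();
        payload.field1 = "some text";   // corresponds to Field1 in the Manual Input view
        payload.field2 = 42;            // corresponds to Field2

        try (KafkaProducer<String, JsonSerdeSketch.Message> producer = new KafkaProducer<>(props)) {
            // The value travels to the broker as a JSON string such as
            // {"field1":"some text","field2":42}.
            producer.send(new ProducerRecord<>("topic1", null, payload));
            producer.flush();
        }
    }
}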

Importing This Sample into StreamBase Studio

In StreamBase Studio, import this sample with the following steps:

  • From the top-level menu, select File > Load StreamBase Sample.

  • Type Kafka to narrow the list of options.

  • Select Apache Kafka Producer, Consumer, and Admin adapters from the StreamBase Messaging Adapters category.

  • Click OK.

StreamBase Studio creates a single project for the Kafka adapter samples in your current Studio workspace.

Sample Location

When you load the sample into StreamBase Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.

Important

Load this sample in StreamBase Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.

Using the workspace copy of the sample avoids the permission problems that can occur when trying to work with the initially installed location of the sample. The default workspace location for this sample is:

studio-workspace/sample_adapter_embedded_kafka

See Default Installation Directories for the location of studio-workspace on your system.

In the default TIBCO StreamBase installation, this sample's files are initially installed in:

streambase-install-dir/sample/adapter/embedded/kafka

See Default Installation Directories for the location of streambase-install-dir on your system. This location may require administrator privileges for write access, depending on your platform.