Configuring the Connectors

You can use TIBCO Bridge for Apache Kafka as a source connector (FTL to Apache Kafka), a sink connector (Apache Kafka to FTL), or both at the same time. Configure each connector separately by modifying a copy of the appropriate sample configuration file.

Configuration Files

Comments in the sample configuration files describe all available configuration parameters and their default values.

To send messages from FTL to Apache Kafka, copy and modify the following source connector configuration files:

AKD_BRIDGE_HOME/config/tibftl-kafka-connect-source.properties example:

# Name of this connector instance.
name = tibftl-kafka-connect-source

# Required to use the source connector. Do not modify this value.
connector.class = FTLSourceConnector

# The connector supports only 1 task at present,
# and ignores any other value.
tasks.max = 1

# Required.  The source connector writes messages to
# this Kafka topic.
topic = ftl-messages

# Required. Path to FTL-specific YAML configuration file.
ftl.config = /opt/tibco/akd/bridge/2.3/config/ftl-source.yaml

AKD_BRIDGE_HOME/config/ftl-source.yaml example:

# FTL-specific source connector configuration

# Required.  Pipe-delimited list of FTL server URLs.
ftlServers: "http://localhost:8585"

# Required.  FTL application name of the source connector.
ftlApplicationName: "kafka-connect-source"

ftlEndpoints:
    - name: "kafka-connect-source-recvendpoint"

To send messages from Apache Kafka to FTL, copy and modify the following sink connector configuration files:

AKD_BRIDGE_HOME/config/tibftl-kafka-connect-sink.properties example:

# Name of this connector instance.
name = tibftl-kafka-connect-sink

# Required to use the sink connector. Do not modify this value.
connector.class = FTLSinkConnector

# The connector supports only 1 task at present,
# and ignores any other value.
tasks.max = 1

# Required.  The sink connector reads messages from
# the Kafka topics in this comma-separated list.
topics = ftl-messages

# Required. Path to FTL-specific YAML configuration file.
ftl.config = /opt/tibco/akd/bridge/2.3/config/ftl-sink.yaml

AKD_BRIDGE_HOME/config/ftl-sink.yaml example:

# FTL-specific sink connector configuration

# Required.  Pipe-delimited list of FTL server URLs.
ftlServers: "http://localhost:8585"

# Required.  FTL application name of the sink connector.
ftlApplicationName: "kafka-connect-sink"

# Required.  A list of FTL endpoints to publish to.
# Example of an endpoint list:
#
# ftlEndpoints:
#     - name: "endpointOne"
#     - name: "endpointTwo"
#     - name: "endpointThree"
ftlEndpoints:
    - name: "kafka-connect-sink-sendendpoint"

Supply the connector properties files as arguments on the Apache Kafka Connect command line (see Running the Connectors).
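For example, a standalone Kafka Connect worker might be started as follows. This sketch assumes a standard Apache Kafka installation; connect-standalone.properties is the sample worker configuration that ships with Kafka, and the connector file paths are illustrative (substitute the actual value of AKD_BRIDGE_HOME):

# Start a standalone Connect worker with both bridge connectors.
bin/connect-standalone.sh config/connect-standalone.properties \
    AKD_BRIDGE_HOME/config/tibftl-kafka-connect-source.properties \
    AKD_BRIDGE_HOME/config/tibftl-kafka-connect-sink.properties

To run only one direction of the bridge, supply only the corresponding properties file.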

Procedure

  1. Required. Ensure that ftlServers (in each FTL-specific YAML configuration file) contains the correct URLs of the FTL servers.
  2. If the FTL server requires client authentication, ensure that ftl.username and ftl.password are set to the correct credentials, which identify the bridge connectors as clients to the FTL server.
  3. If the FTL server uses secure communication, ensure that the ftl.trust.* parameters are set correctly so that the bridge connectors trust the FTL server.
  4. Configure schema generation and storage.
    If you use the Avro converter or the JSON converter, configure the source connector to store the schema with each converted message. For example:
    schemas.enable = true
    If you use the string converter, configure the source connector not to store schemas (the default behavior). For example:
    schemas.enable = false
  5. Modify other parameters as needed.
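As an illustration of step 4, the following properties fragment enables schema storage with the standard Kafka Connect JSON converter. The converter class names are standard Apache Kafka Connect classes; whether you set these at the worker level or per connector depends on your deployment, and the exact parameter names (bare schemas.enable versus the converter-prefixed form shown here) may vary by Connect version:

# Embed a schema in each JSON-converted message value.
value.converter = org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable = true

# Alternatively, with the string converter, leave schema storage
# disabled (the default):
# value.converter = org.apache.kafka.connect.storage.StringConverter
# value.converter.schemas.enable = false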