
Kafka Studio Configuration - Mule 4

To configure a connector in Anypoint Studio:

  1. Install the connector.

  2. Configure the connector.

  3. Configure an input source for the connector.

Install the Connector in Studio

  1. In Studio, create a Mule Project.

  2. In the Mule Palette, click (X) Search in Exchange.

  3. In Add Modules to Project, type the name of the connector in the search field.

  4. Click the connector name in Available modules.

  5. Click Add.

  6. Click Finish.

Install the Connector in Exchange

  1. In Studio, create a Mule Project.

  2. Click the Exchange (X) icon in the Studio task bar.

  3. In Exchange, click Login and supply your Anypoint Platform username and password.

  4. In Exchange, select All assets and search for "Kafka".

  5. Select Kafka and click Add to project.

  6. Follow the prompts to install the connector.

Configure the Connector

  1. Drag the connector onto the Studio canvas.

  2. To create a global element for the connector, set these fields:

    1. Basic:

      • Bootstrap Servers - Comma-separated list of host:port pairs used to establish the initial connection to the Kafka cluster. This is the same bootstrap.servers value that you provide to Kafka clients (producer or consumer).

      • Additional properties - Additional connection properties as key-value pairs. Any property that Kafka supports can be set here.

    2. SSL:

      • All the parameters from the Basic configuration.

      • Key Store Type - The file format of the key store file. This is optional and the default value is JKS.

      • Key Store Password - The store password for the key store file. This is optional and only needed if Key Store Location is configured.

      • Key Store Location - The location of the key store file. This is optional and can be used for two-way authentication for the connector.

      • Trust Store Type - The file format of the trust store file.

      • Trust Store Password - The password for the trust store file.

      • Trust Store Location - The location of the trust store file.

    3. Kerberos:

      • All the parameters from the Basic configuration.

      • Principal - The Kerberos principal.

      • Keytab - Path to keytab file associated with the principal.

      • Service Name - The Kerberos principal name that the Kafka broker runs as.

      • Additional JAAS Properties - Additional properties as key-value pairs that are set in sasl.jaas.config and that you would usually include in a JAAS configuration file.

    4. Kerberos SSL:

      • All the parameters from the Basic configuration.

      • All the parameters from the SSL configuration.

      • All the parameters from the Kerberos configuration.

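The global element fields above correspond to standard Kafka client properties. As a rough sketch (plain Python; the host names, file paths, and principal used below are illustrative, not values from this document), the Basic, SSL, and Kerberos levels can be assembled like this:

```python
# Sketch: assemble Kafka client properties matching the Studio global-element
# fields. All concrete values here are illustrative placeholders.

def basic_config(bootstrap_servers, additional=None):
    """Basic: bootstrap servers plus any extra Kafka-supported properties."""
    props = {"bootstrap.servers": bootstrap_servers}
    props.update(additional or {})
    return props

def ssl_config(bootstrap_servers, truststore_location, truststore_password,
               truststore_type="JKS", keystore_location=None,
               keystore_password=None, keystore_type="JKS", **extra):
    """SSL: Basic fields plus a trust store, and an optional key store
    for two-way authentication."""
    props = basic_config(bootstrap_servers, extra)
    props.update({
        "security.protocol": "SSL",
        "ssl.truststore.type": truststore_type,
        "ssl.truststore.password": truststore_password,
        "ssl.truststore.location": truststore_location,
    })
    if keystore_location:  # key store is optional (two-way auth only)
        props.update({
            "ssl.keystore.type": keystore_type,
            "ssl.keystore.password": keystore_password,
            "ssl.keystore.location": keystore_location,
        })
    return props

def kerberos_jaas(principal, keytab, extra_jaas=None):
    """Kerberos: build the sasl.jaas.config value from the principal, the
    keytab path, and any additional JAAS properties."""
    opts = {"useKeyTab": "true", "storeKey": "true",
            "keyTab": f'"{keytab}"', "principal": f'"{principal}"'}
    opts.update(extra_jaas or {})
    body = " ".join(f"{k}={v}" for k, v in opts.items())
    return f"com.sun.security.auth.module.Krb5LoginModule required {body};"
```

The Kerberos SSL level is then simply the union of the SSL and Kerberos property sets on top of Basic.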
  3. Based on the operation you dragged to the canvas, configure the following fields:

    1. Consumer trigger:

      • Topic - The name of the Kafka topic to consume messages from.

      • Partition offsets (Optional) - A list of offsets for configuring partitions. For each element in the list, specify the partition index and offset.

    2. Producer operation:

      • Topic - Topic to send the message to.

      • Key - The key of the message to be sent.

      • Message - Message to be sent.

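The inputs to the two operations above are simple values. A minimal sketch (plain Python; the helper names and dict layout are illustrative, not the connector's API) of how the consumer's partition offsets and the producer's record are shaped:

```python
# Sketch: the shape of the inputs the two operations take.
# Helper names and the dict layout are illustrative, not the connector's API.

def partition_offsets(pairs):
    """Consumer trigger: optional list of (partition index, offset) pairs,
    normalized to a mapping from partition to starting offset."""
    return {int(partition): int(offset) for partition, offset in pairs}

def producer_record(topic, key, message):
    """Producer operation: the topic to send to, the key of the message,
    and the message itself."""
    return {"topic": topic, "key": key, "value": message}
```

For example, `partition_offsets([(0, 10), (1, 0)])` describes a consumer that starts at offset 10 on partition 0 and at the beginning of partition 1.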

Configure an Input Source

Configure an input source for the connector, such as a connector operation, an HTTP Listener, or a Scheduler.

The Message Consumer operation can be used as an input source in the Kafka connector.

Message Consumer fields:

  • Configuration - The name of the configuration to use.

  • Topic - The name of the topic to consume messages from.

  • Partition Offsets - Array of Offset representing partition offset configuration. For each element in the list, specify the partition index and offset.

  • Primary Node Only - Whether this source should execute only on the primary node when running in a cluster.

  • Streaming Strategy - Configure to use repeatable streams. One of:

    • repeatable-in-memory-stream

    • repeatable-file-store-stream

    • non-repeatable-stream

  • Redelivery Policy - Defines a policy for processing the redelivery of the same message.

  • Reconnection Strategy - A retry strategy in case of connectivity errors. One of:

    • reconnect

    • reconnect-forever

