
Kafka
1.4

Kafka is a high-throughput, low-latency platform for handling real-time data feeds.

Overview

Kafka is used for building real-time data pipelines and streaming applications. It is horizontally scalable, fault-tolerant, extremely fast, and runs in production in thousands of companies.

Authentication

Add a Kafka connector step to your workflow. Click the 'New Authentication' option now available in the right-hand panel and name it as appropriate.

In this example we will be using IBM Event Streams to provide our Kafka service. The setup should be similar for any Kafka service.

From IBM Event Streams you will get a config that looks like this:

Note: The values below are expired; you will need to provide your own credentials.

{
  "api_key": "Jy-adjiashidahsidhasiodhiaoshfong356dgs1dg6",
  "apikey": "Jy-adjiashidahsidhasiodhiaoshfong356dgs1dg6",
  "iam_apikey_description": "Auto-generated for key sdsdw2253-48d4-443c-fs14-3b63f6f2cc37",
  "iam_apikey_name": "Service credentials-1",
  "iam_role_crn": "crn:v1:bluemix:public:iam::::serviceRole:Manager",
  "iam_serviceid_crn": "crn:v1:bluemix:public:iam-identity::a/e780fb1c585we8rvjko1127accb0365f::serviceid:ServiceId-8rvjko1-2cda-41e8-8ab6-5we8rvjk4de4",
  "instance_id": "b5a4bceb-0157-4da3-2ab1-10a4f42aee9c",
  "kafka_admin_url": "https://5we8rvjko112.svc01.us-south.eventstreams.cloud.ibm.com",
  "kafka_brokers_sasl": [
    "broker-5-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-4-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-3-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-1-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-2-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-0-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"
  ],
  "kafka_http_url": "https://5we8rvjko112.svc01.us-south.eventstreams.cloud.ibm.com",
  "password": "ihoq38thyei38yqehtaiodhghightye8y748af45",
  "user": "token"
}

The first thing we need to do is put our brokers list into a single comma-separated string, so the brokers list will turn from this:

"kafka_brokers_sasl": [
  "broker-5-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-4-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-3-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-1-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-2-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-0-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"
],

to this:

broker-5-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-4-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-3-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-1-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-2-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,broker-0-5we8rvjko112.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093
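
If you would rather build this string programmatically than by hand, a minimal Python sketch is below. The filename event_streams_credentials.json is an assumption; use whichever file you saved the Event Streams config to.

import json

# Load the service credentials saved from IBM Event Streams
# (the filename here is an assumption)
with open("event_streams_credentials.json") as f:
    config = json.load(f)

# The connector expects the brokers as a single comma-separated string
brokers = ",".join(config["kafka_brokers_sasl"])
print(brokers)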

The next option is SASL. In the case of IBM Event Streams this is 'Scram sha 256', which can be selected from the drop-down. Please check with your Kafka provider to confirm which SASL option needs to be selected.

The next options are for username and password. These can be taken from the 'user' and 'password' fields of the config, so in this case the username is 'token' and the password is 'ihoq38thyei3xxxxxxxxxxxxxxxtye8y748af45'.

The next options are for SSL. For IBM Event Streams you can leave these as the defaults. If your Kafka service doesn't support SSL, you will need to untick 'Enable SSL?'. If your Kafka service is using a self-signed certificate, you will need to untick 'Verify server certificate?'.
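
For reference, these authentication fields map onto standard Kafka client settings. Below is a minimal sketch using the confluent-kafka Python package; it illustrates what each field configures, it is not the connector's own implementation, and it reuses the config and brokers variables from the sketch above.

# Client-side equivalent of the connector's authentication fields
auth_conf = {
    "bootstrap.servers": brokers,            # the comma-separated broker string
    "security.protocol": "SASL_SSL",         # 'Enable SSL?' ticked
    "sasl.mechanisms": "SCRAM-SHA-256",      # the 'Scram sha 256' SASL option
    "sasl.username": config["user"],         # 'token' for IBM Event Streams
    "sasl.password": config["password"],
    # For a self-signed certificate ('Verify server certificate?' unticked):
    # "enable.ssl.certificate.verification": False,
}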

In this example, the final authentication combines all of the values above.

Operations List

  • Consume messages

  • Produce messages

Produce messages

The Produce messages operation allows you to send messages to a specific topic in Kafka, in this case RDC_CUSTOMER.

You can send as many messages as you would like by clicking the 'Add Item' button and setting the value of each message.
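
For comparison, the same produce step in a standalone confluent-kafka client might look like the sketch below, reusing auth_conf from the authentication section (illustration only):

from confluent_kafka import Producer

producer = Producer(auth_conf)

# Each 'Add Item' entry in the connector corresponds to one produce() call
producer.produce("RDC_CUSTOMER", value="example message")

# Block until all queued messages have been delivered
producer.flush()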

Consume messages

The Consume messages operation allows you to receive messages from a specific topic and group.

To consume the messages that you just produced:

  1. Set the 'topic' to RDC_CUSTOMER.

  2. You can change the default group; if you already have a group, you can use that, or you can leave it as 'tray.io'.

  3. Tick 'From beginning'; this will ensure that all previous messages get sent to this new 'tray.io' group (see the sketch after these steps).
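
For comparison, the consume step in a standalone confluent-kafka client might look like this, again reusing auth_conf from the authentication section. Ticking 'From beginning' for a new group corresponds to setting auto.offset.reset to 'earliest' (illustration only):

from confluent_kafka import Consumer

consumer_conf = dict(auth_conf)
consumer_conf.update({
    "group.id": "tray.io",            # the consumer group from step 2
    "auto.offset.reset": "earliest",  # the 'From beginning' option
})

consumer = Consumer(consumer_conf)
consumer.subscribe(["RDC_CUSTOMER"])

# Poll for a few messages, then close cleanly
for _ in range(10):
    msg = consumer.poll(timeout=5.0)
    if msg is None:
        continue
    if msg.error():
        print("Consumer error:", msg.error())
        continue
    print(msg.value().decode("utf-8"))

consumer.close()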