Kafka Connect 2.0.1: PUT /connectors/(string:name)/config

Creates a new connector using the given configuration or updates the configuration for an existing connector. Returns information about the connector after the change has been made.

Description

In distributed mode, a PUT request to this endpoint creates the connector if it does not already exist, or updates the configuration of an existing connector. Because the request is idempotent, it is the recommended way to create connectors programmatically.

Table 1. Parameters
Parameter                     Description
name (string)                 Name of the created connector.
config (map)                  Configuration parameters for the connector. See Kafka Connect 2.0.1: HDFS Connector and Kafka Connect 2.0.1: JDBC Connector for configuration options.
tasks (array)                 List of active tasks generated by the connector.
tasks[i].connector (string)   Name of the connector that the task belongs to.
tasks[i].task (int)           Task ID within the connector.
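Every value in the config map must be a string, even for numeric settings such as tasks.max. A minimal sketch of a helper that coerces a config dict before submission (normalize_config is a hypothetical name, not part of any Kafka Connect client library):

```python
import json

def normalize_config(config):
    """Return a copy of `config` with every value converted to a string,
    as required by the Kafka Connect REST API."""
    return {key: str(value) for key, value in config.items()}

# Build a request body; the int 10 is coerced to the string "10".
payload = normalize_config({
    "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
    "tasks.max": 10,
    "topics": "test-topic",
    "flush.size": 100,
})
body = json.dumps(payload)
```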

Syntax

PUT http://<host>:8083/connectors/<string_name>/config

Request Example

PUT /connectors/hdfs-sink-connector/config HTTP/1.1
Host: connect.example.com
Accept: application/json

{
    "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
    "tasks.max": "10",
    "topics": "test-topic",
    "hdfs.url": "hdfs://fakehost:9000",
    "hadoop.conf.dir": "/opt/hadoop/conf",
    "hadoop.home": "/opt/hadoop",
    "flush.size": "100",
    "rotate.interval.ms": "1000"
}

Response Example

The response JSON object has the following form:
  • name (string) – Name of the connector.
  • config (map) – Configuration parameters for the connector. All values are strings.
  • tasks (array) – List of active tasks generated by the connector.
Note: In this example, the 201 Created status indicates that the connector was created. If the request updated the configuration of an existing connector, the status would be 200 OK.
HTTP/1.1 201 Created
Content-Type: application/json

{
    "name": "hdfs-sink-connector",
    "config": {
        "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
        "tasks.max": "10",
        "topics": "test-topic",
        "hdfs.url": "hdfs://fakehost:9000",
        "hadoop.conf.dir": "/opt/hadoop/conf",
        "hadoop.home": "/opt/hadoop",
        "flush.size": "100",
        "rotate.interval.ms": "1000"
    },
    "tasks": [
        { "connector": "hdfs-sink-connector", "task": 1 },
        { "connector": "hdfs-sink-connector", "task": 2 },
        { "connector": "hdfs-sink-connector", "task": 3 }
    ]
}
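The request above can be issued from any HTTP client. A sketch using only the Python standard library, assuming a Connect worker reachable at the given base URL (the function and helper names here are illustrative, not part of an official client):

```python
import json
import urllib.request

def connector_config_url(base_url, name):
    """Build the endpoint URL for a connector's config resource."""
    return "{}/connectors/{}/config".format(base_url.rstrip("/"), name)

def put_connector_config(base_url, name, config):
    """PUT the config map for `name`; return the parsed JSON response.

    A 201 Created response means the connector was newly created;
    200 OK means an existing connector's configuration was updated.
    """
    req = urllib.request.Request(
        connector_config_url(base_url, name),
        data=json.dumps(config).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, put_connector_config("http://localhost:8083", "hdfs-sink-connector", config) would submit the configuration shown in the request example and return the created connector's name, config, and task list.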