In Kafka Connect, how do I get the bootstrap server address my Kafka Connect worker is currently using?
I'm developing my own Kafka sink connector. My deserializer is JSONConverter. However, when someone sends malformed JSON data to my connector's topic, I want to skip that record and forward it to a specific topic within my company.
My confusion is: I can't find any API that gives me my Connect worker's bootstrap.servers. (I know it's in Confluent's etc directory, but hard-coding the path to "connect-distributed.properties" just to read bootstrap.servers is not a good idea.)
So my question is: is there another way to conveniently get the value of bootstrap.servers from within my connector code?
apache-kafka apache-kafka-connect
What do you mean by Connect's bootstrap.servers? Do you want the host of Kafka Connect or the Kafka brokers?
– Giorgos Myrianthous
Nov 20 '18 at 12:57
@GiorgosMyrianthous The bootstrap.servers I need to build a Kafka producer... I think that should be the Kafka brokers.
– SkyOne
Nov 20 '18 at 13:08
I don't think you can call the REST Proxy API to get bootstrap.servers; you can only get the brokers' IDs. I would suggest creating a .properties file that contains your bootstrap.servers and loading that file on the producer's startup.
– Giorgos Myrianthous
Nov 20 '18 at 13:10
I don't see anything wrong with reading the file from disk, assuming all Connect workers are consistently installed... In fact, that's how the new password hiding feature works
– cricket_007
Nov 20 '18 at 15:34
@cricket_007 What if the file name varies across environments? e.g. etc/dev/dev-connect-distributed.properties and etc/prd/prd-connect-distributed.properties
– SkyOne
Nov 20 '18 at 22:50
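The properties-file approach suggested in this thread can be sketched in Java. The environment variable name and default path below are hypothetical, chosen to address the per-environment file name concern: each environment points at its own file instead of one path being hard-coded in the connector.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

public class BootstrapConfig {

    // Hypothetical environment variable: lets each environment point at its
    // own file (e.g. etc/dev/dev-connect-distributed.properties vs
    // etc/prd/prd-connect-distributed.properties).
    static String resolveConfigPath() {
        String path = System.getenv("CONNECT_CONFIG_PATH");
        return path != null ? path : "etc/connect-distributed.properties";
    }

    // Parse bootstrap.servers out of properties-format text.
    static String bootstrapServers(Reader source) throws IOException {
        Properties props = new Properties();
        props.load(source);
        return props.getProperty("bootstrap.servers");
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for: new FileReader(resolveConfigPath())
        Reader sample = new StringReader("bootstrap.servers=broker1:9092,broker2:9092\n");
        System.out.println(bootstrapServers(sample));
    }
}
```

The producer can then be built with the loaded value, keeping the file location out of the connector code entirely.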
edited Nov 20 '18 at 15:31 by cricket_007
asked Nov 20 '18 at 11:44 by SkyOne
1 Answer
Instead of trying to send the "bad" records from a SinkTask to Kafka yourself, use the dead letter queue feature that was added in Kafka Connect 2.0.
You can configure the Connect runtime to automatically route records that fail processing to a configured topic acting as a DLQ.
For more details, see the KIP that added this feature (KIP-298).
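As a sketch, the DLQ is enabled per connector via the error-handling properties that KIP introduced; the topic name here is an illustrative value:

```properties
# Tolerate conversion/transformation errors instead of failing the task
errors.tolerance=all
# Route failed records to a dead letter queue topic (example name)
errors.deadletterqueue.topic.name=my-connector-dlq
# Replication factor used if Connect creates the DLQ topic
errors.deadletterqueue.topic.replication.factor=3
# Attach error context (exception, original topic/partition/offset) as record headers
errors.deadletterqueue.context.headers.enable=true
# Optionally also log the failures
errors.log.enable=true
errors.log.include.messages=true
```

With this in place, records that fail JSON conversion land on the DLQ topic instead of killing the task, so no custom producer (or bootstrap.servers lookup) is needed in the connector.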
Worth pointing out - Connect can be upgraded independently of the brokers
– cricket_007
Nov 20 '18 at 15:32
@cricket_007 Our company currently uses Confluent 4.0.1 with Kafka 1.0.1; can we just upgrade from Confluent 4 to Confluent 5 while still using the Kafka 1.0.1 cluster?
– SkyOne
Nov 21 '18 at 1:58
@Sky You should be able to, yes. Brokers since Confluent 3.0 support newer Confluent versions for Connect workers.
– cricket_007
Nov 21 '18 at 4:38
answered Nov 20 '18 at 13:31 by Mickael Maison