In a Kafka connector, how do I get the bootstrap-server address my Kafka Connect cluster is currently using?


























I'm developing my own Kafka sink connector. My deserializer is JSONConverter. However, when someone sends malformed JSON data to my connector's topic, I want to skip that record and send it to a specific topic of my company.

What confuses me is that I can't find any API that gives me my Connect worker's bootstrap.servers. (I know the value is in Confluent's etc directory, but hard-coding the path to "connect-distributed.properties" just to read bootstrap.servers is not a good idea.)

So the question is: is there another way to get the value of bootstrap.servers conveniently from my connector code?
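One common workaround (an assumption for illustration, not something the thread confirms) is to pass the broker list explicitly as part of the connector's own configuration, since the property map handed to SinkTask.start() contains the connector config. The "producer.bootstrap.servers" key below is a hypothetical name; a minimal stdlib-only sketch of the lookup:

```java
import java.util.HashMap;
import java.util.Map;

public class ConnectorConfigLookup {

    // The map a SinkTask receives in start(props) holds the connector's own
    // configuration, so a custom key (hypothetical name below) can carry the
    // broker list for the side producer that forwards bad records.
    static String resolveBootstrap(Map<String, String> props) {
        return props.getOrDefault("producer.bootstrap.servers", "localhost:9092");
    }

    public static void main(String[] args) {
        Map<String, String> connectorProps = new HashMap<>();
        connectorProps.put("topics", "my-topic");
        connectorProps.put("producer.bootstrap.servers", "broker1:9092,broker2:9092");
        System.out.println(resolveBootstrap(connectorProps));
    }
}
```

The trade-off is that the broker list is then duplicated between the worker config and each connector config, but it avoids reading worker files from disk entirely.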


































  • What do you mean by Connect's bootstrap.servers? Do you want the host of Kafka Connect or of the Kafka brokers?
    – Giorgos Myrianthous
    Nov 20 '18 at 12:57












  • @GiorgosMyrianthous The bootstrap.servers I need to build a Kafka producer... I think that should be the Kafka brokers.
    – SkyOne
    Nov 20 '18 at 13:08










  • I don't think you can call the REST Proxy API to get bootstrap.servers; you can only get the brokers' IDs. I would suggest creating a .properties file that contains your bootstrap.servers and loading that file on the producer's startup.
    – Giorgos Myrianthous
    Nov 20 '18 at 13:10












  • I don't see anything wrong with reading the file from disk, assuming all Connect workers are consistently installed... In fact, that's how the new password hiding feature works
    – cricket_007
    Nov 20 '18 at 15:34










  • @cricket_007 What if the file name varies across environments? e.g. etc/dev/dev-connect-distributed.properties and etc/prd/prd-connect-distributed.properties
    – SkyOne
    Nov 20 '18 at 22:50
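Following up on the comment thread, one way to avoid hard-coding an environment-specific path is to resolve the path from an environment variable and parse the file with java.util.Properties. This is a minimal sketch of that suggestion; the CONNECT_CONFIG_PATH variable name and the sample file contents are assumptions for illustration:

```java
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

public class BootstrapFromProperties {

    // Parse bootstrap.servers out of worker-style properties content.
    static String bootstrapServers(Reader source) throws Exception {
        Properties props = new Properties();
        props.load(source);
        return props.getProperty("bootstrap.servers");
    }

    public static void main(String[] args) throws Exception {
        // In a real worker you would open the file named by an environment
        // variable, e.g. new FileReader(System.getenv("CONNECT_CONFIG_PATH")),
        // set differently per environment (dev, prd, ...). Here we parse an
        // inline sample so the sketch is self-contained.
        String sample = "bootstrap.servers=broker1:9092,broker2:9092\n"
                      + "group.id=connect-cluster\n";
        System.out.println(bootstrapServers(new StringReader(sample)));
    }
}
```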
















apache-kafka apache-kafka-connect






asked Nov 20 '18 at 11:44 by SkyOne
edited Nov 20 '18 at 15:31 by cricket_007












1 Answer
Instead of trying to send the "bad" records from a SinkTask to Kafka yourself, use the dead letter queue feature that was added in Kafka Connect 2.0.

You can configure the Connect runtime to automatically route records that fail processing to a configured topic acting as a DLQ.

For more details, see KIP-298, which added this feature.
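For reference, the KIP-298 error-handling options look roughly like this in a sink connector's configuration; the topic name and replication factor below are illustrative values:

```properties
# Tolerate conversion/transformation errors instead of failing the task
errors.tolerance=all
# Route failed records to this topic (illustrative name)
errors.deadletterqueue.topic.name=my-company-dlq
errors.deadletterqueue.topic.replication.factor=3
# Attach error context (topic, partition, exception) as record headers
errors.deadletterqueue.context.headers.enable=true
```

Because the Connect runtime produces to the DLQ topic itself, the connector never needs to know the worker's bootstrap.servers, which also resolves the original question.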



























  • Worth pointing out - Connect can be upgraded independently of the brokers
    – cricket_007
    Nov 20 '18 at 15:32










  • @cricket_007 Our company now uses Confluent 4.0.1 with Kafka 1.0.1; can we just upgrade Confluent 4 to Confluent 5 while still using the Kafka 1.0.1 cluster?
    – SkyOne
    Nov 21 '18 at 1:58










  • @Sky You should be able to, yes. Brokers since Confluent 3.0 support newer Confluent versions for the Connect workers.
    – cricket_007
    Nov 21 '18 at 4:38













answered Nov 20 '18 at 13:31 by Mickael Maison











