How to fetch Kafka source connector schema based on connector name












I am using the Confluent JDBC Kafka connector to publish messages to a topic. The source connector sends data to the topic along with its schema on each poll. I want to retrieve this schema.

Is this possible? How? Can anyone suggest an approach?

My intention is to create a KSQL stream or table based on the schema built by the Kafka connector on each poll.










Tags: apache-kafka apache-kafka-connect confluent confluent-schema-registry ksql






asked Nov 22 '18 at 6:55 by Achaius; edited Nov 22 '18 at 17:53 by cricket_007
























          1 Answer
































The best way to do this is to use Avro, in which the schema is stored separately in the Schema Registry and used automatically by Kafka Connect and KSQL.

You can use Avro by configuring Kafka Connect to use the AvroConverter. In your Kafka Connect worker configuration, set:

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081


(Replace schema-registry with the hostname where your Schema Registry is running.)
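Once the connector is writing Avro, the schema it registers can also be fetched directly from the Schema Registry REST API, which is the most direct way to retrieve it. A minimal sketch, assuming the source_topic topic used below and the default <topic>-value subject naming:

# Fetch the latest value schema registered for the topic
curl -s http://schema-registry:8081/subjects/source_topic-value/versions/latest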



From there, in KSQL you just use:



          CREATE STREAM my_stream WITH (KAFKA_TOPIC='source_topic', VALUE_FORMAT='AVRO');


          You don't need to specify the schema itself here, because KSQL fetches it from the Schema Registry.
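
To check the schema that KSQL resolved, you can then describe the stream (a quick check, assuming the my_stream name above):

-- Show the columns KSQL derived from the registered Avro schema
DESCRIBE my_stream;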



          You can read more about Converters and serialisers here.



          Disclaimer: I work for Confluent, and wrote the referenced blog post.






answered Nov 22 '18 at 13:18 by Robin Moffatt
























• I saw your post, but I have two questions. 1. I am using the Confluent JDBC connector, which sends records in delimited format. Will it still work? 2. What happens if the schema is updated? Will KSQL be able to refer to the updated schema?

  – Achaius
  Nov 23 '18 at 5:59













• The answer to the first question is yes, I believe, but please confirm it. Kindly respond to the second question as well.

  – Achaius
  Nov 23 '18 at 6:22






• 1. The JDBC Connector uses the Kafka Connect API, which supports multiple converters, including JSON, Avro, etc. So it's not the case that the "JDBC connector sends records in delimited format"; the format in which it sends the data is configurable (see the sketch below).

  – Robin Moffatt
  Nov 23 '18 at 9:01
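
  For illustration, a converter can also be overridden per connector rather than worker-wide; a sketch against the Kafka Connect REST API, where the connect hostname, connector name, and database URL are placeholders:

  # Hypothetical JDBC source with a per-connector Avro value converter
  curl -s -X PUT http://connect:8083/connectors/jdbc-source/config \
    -H "Content-Type: application/json" \
    -d '{
          "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
          "connection.url": "jdbc:postgresql://db:5432/mydb",
          "mode": "bulk",
          "topic.prefix": "source_",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.schema.registry.url": "http://schema-registry:8081"
        }'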






• 2. KSQL should pull the latest version of the schema when you create a new stream against the topic, yes.

  – Robin Moffatt
  Nov 23 '18 at 9:01










