Hadoop data node shutting down with message 'libhadoop cannot be loaded'


I'm trying to start the DataNode on new slave nodes in order to add them to a live Hadoop cluster, but the DataNode fails to start when I run hadoop-daemon.sh start datanode.



I have created the file /var/lib/hadoop-hdfs/dn_socket manually and changed its permissions. I have also checked the native libraries, which are all in place.
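For reference, this is roughly how I checked the libraries. The commands are standard Hadoop tooling, but the paths shown are just the usual Apache defaults and may differ on this installation (HDP keeps the native directory elsewhere), so treat them as illustrative:

    # Report which native libraries Hadoop can actually load (hadoop, zlib, snappy, ...)
    hadoop checknative -a

    # Confirm libhadoop.so exists where the DataNode JVM will look for it
    # (default Apache layout shown; adjust for the actual install location)
    ls -l $HADOOP_HOME/lib/native/libhadoop.so*

    # If needed, point the JVM at the native directory via hadoop-env.sh
    # (example export, not necessarily my current setting)
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"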



I have checked other questions related to the DataNode shutting down, but I was unable to solve the issue; the error log I'm getting is different, as shown below.



Any help is appreciated.



2018-11-22 13:31:03,485 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2018-11-22 13:31:03,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2018-11-22 13:31:03,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2018-11-22 13:31:03,515 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2018-11-22 13:31:03,516 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(DataNode.java:1166)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:1137)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1369)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:495)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2695)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2598)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2645)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2789)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2813)
2018-11-22 13:31:03,519 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
2018-11-22 13:31:03,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at HDP-slave-7/987.654.32.10
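The socket path in the error comes from hdfs-site.xml. For context, the relevant properties on my nodes look roughly like the excerpt below; the short-circuit read flag is what I believe typically accompanies the socket path, so it is shown as an assumption rather than a verified copy of my config:

    <!-- hdfs-site.xml: illustrative excerpt, not a verified copy of the actual config -->
    <property>
      <name>dfs.domain.socket.path</name>
      <value>/var/lib/hadoop-hdfs/dn_socket</value>
    </property>
    <property>
      <!-- Short-circuit local reads are the usual reason a domain socket path is configured -->
      <name>dfs.client.read.shortcircuit</name>
      <value>true</value>
    </property>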
Tags: hadoop, datanode
asked Nov 22 '18 at 14:03 by akshay naidu (edited Nov 22 '18 at 15:11)