Failing to build Apache Spark: Cannot run program "javac": error=20, Not a directory





I'm trying to install Spark on a Linux box. I downloaded it from http://spark.apache.org/docs/latest/building-spark.html and am trying to build it with this command:



root# build/mvn -e -X -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package


The build seems to start fine:



Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00)
Maven home: /some_path_here/spark-1.5.0/build/apache-maven-3.3.3
Java version: 1.7.0_05, vendor: Oracle Corporation
Java home: /usr/local/mytools-tools/java/jdk64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-573.1.1.el6.x86_64", arch: "amd64", family: "unix"
[DEBUG] Created new class realm maven.api
[DEBUG] Importing foreign packages into class realm maven.api


But then it fails:



[debug] Recompiling all 8 sources: invalidated sources (8) exceeded 50.0% of all sources
[info] Compiling 8 Java sources to /some_path_here/spark-1.5.0/launcher/target/scala-2.10/classes...
[debug] Attempting to call javac directly...
[debug] com.sun.tools.javac.Main not found with appropriate method signature; forking javac instead
[debug] Forking javac: javac @/tmp/sbt_6c9436e4/argfile
[error] Cannot run program "javac": error=20, Not a directory
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 2.056 s]
[INFO] Spark Project Launcher ............................. FAILURE [ 4.832 s]


and so on.



I'm pretty sure I have JAVA_HOME and PATH defined appropriately.



This box has multiple versions of Java installed, which might be related to the problem.
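(For reference, a minimal sketch of how the resolved Java setup can be checked from the shell; exact paths and versions will differ on your box:)

echo "$JAVA_HOME"              # the Maven banner above reports .../jdk64/jre as the Java home
ls "$JAVA_HOME/bin/javac"      # a full JDK has bin/javac; a plain JRE does not
which javac                    # shows which javac, if any, is found on PATH
javac -version
java -version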
Tags: java maven build apache-spark sbt
asked Sep 25 '15 at 13:38 by user1071847
























2 Answers






I fixed this by turning off Zinc, per this answer to a related question: https://stackoverflow.com/a/32766960
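A rough sketch of what that looks like with the Spark 1.5 build scripts (the linked answer has the specifics; the zinc-<version> directory name is whatever build/mvn downloaded, so the wildcard below is only illustrative):

# shut down any Zinc incremental-compile daemon left running by a previous build/mvn run
./build/zinc-*/bin/zinc -shutdown

# then re-run the build
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package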







edited May 23 '17 at 10:27 by Community
answered Sep 25 '15 at 14:22 by user1071847
A quirk of Spark builds is that they can download their own copy of Maven if they decide one is needed.

When you run ./build/mvn clean package you are not running Maven directly; you are running a Spark wrapper script. The first thing that script does is check whether your mvn --version is at least as new as the version the project requires (set in the root pom.xml).

This matters because, if you are running an old version of Maven, Spark may download an additional Maven version, install it, and use that instead.

Some key things to check (a sketch of the corresponding shell commands follows below):

- When you run ./build/mvn clean package, check which version of Maven it is actually using.
- When Maven runs, it does its own lookup to decide which JAVA_HOME is used.
- Before running the Spark build, check that JAVA_HOME is set as an environment variable.
- Check that JAVA_HOME points at a full JDK, not just a JRE.
- Update your Maven to the latest version (or at least confirm it is as new as the version required by the pom.xml in the root directory).

Thanks
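A rough sketch of those checks from a POSIX shell, run from the root of the Spark source tree (paths and versions will vary):

# JAVA_HOME must be set and must point at a full JDK (bin/javac present), not a JRE
echo "$JAVA_HOME"
ls "$JAVA_HOME/bin/javac"
"$JAVA_HOME/bin/javac" -version

# see which javac, if any, is resolved from PATH (this is what gets forked in the failing log)
which javac

# compare the Maven the wrapper script reports with any system-wide mvn
./build/mvn --version
mvn --version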
answered Nov 23 '18 at 16:25 by David