In build.sbt, dependencies in parent project not reflected in child modules
I am using sbt 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project together with its child project modules. So far, this is what I have in my build.sbt:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
  )
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but I am not able to import and use them in any of the child modules. I can use them in a child module only if I explicitly specify them as a library dependency in that module.
- I am looking for something similar to the dependencies tag in pom.xml with a Maven build.
- Will it make a difference if I use a separate build.sbt for each of the child modules?
- Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.
Please help me with a solution to fix these.
scala apache-spark module sbt
asked Nov 23 '18 at 11:52 by Satish Srinivas, edited Nov 23 '18 at 14:51 by user6910411
1 Answer
My multi-module project uses the parent project only for building everything and delegates run to the 'server' project:
lazy val petstoreRoot = project.in(file("."))
  .aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
      (run in server in Compile).evaluated
    }
  )
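(Note how publish := {}, publishLocal := {} and publishArtifact := false keep the aggregating root project from ever being published, while the re-defined run task lets sbt run from the root start the server module.)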
I grouped the settings (e.g. dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  "org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
  ...
  , "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds whatever is needed, e.g.:
lazy val server = (project in file("server"))
  .settings(scalaJSProjects := Seq(client))
  .settings(sharedSettings(Some("server"))) // shared dependencies used by all
  .settings(serverSettings)
  .settings(serverDependencies)
  .settings(jvmSettings)
  .enablePlugins(PlayScala, BuildInfoPlugin)
  .dependsOn(sharedJvm)
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
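Adapted to the question's Spark modules, a minimal sketch of the same idea inside a single build.sbt could look like this (the sparkVersion value is a placeholder, not taken from the question; the "provided" scope is kept, so each child declares the Spark dependencies itself rather than inheriting them from the parent):

```scala
// Sketch only: shared settings declared once and mixed into every module.
val sparkVersion = "2.3.2" // placeholder version, adjust to your cluster

lazy val sparkDependencies: Seq[Def.Setting[_]] = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive"      % sparkVersion % "provided"
  )
)

lazy val etl = (project in file("etl"))
  .settings(sparkDependencies) // each child gets the "provided" deps directly

lazy val redshiftBasin = (project in file("redshiftBasin"))
  .settings(sparkDependencies)
```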
answered Nov 23 '18 at 13:25 by pme
Thanks for the answer, but I want to make changes only in the build.sbt. I have now found out that the problem is with the scope I've given: "provided". It seems it doesn't support transitive dependencies, and the dependencies are propagated to the children only if we give the scope as "compile". I am still wondering if it's possible to propagate the dependencies from parent to children even if the scope is "provided".
– Satish Srinivas, Nov 26 '18 at 14:54
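For illustration, a sketch of the behaviour described in this comment (my reading, not confirmed in the thread; sparkVersion is assumed to be defined elsewhere):

```scala
// Only one of these lines would be used in practice; shown together for contrast.
// Default (compile) scope: reaches children through dependsOn(parent).
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion
// "provided" scope: stays on the parent's own compile classpath and is
// not exported to children.
// libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
```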
@SatishSrinivas my solution also works in build.sbt alone, as build.sbt is just an enhanced Scala class.
– pme, Nov 26 '18 at 15:01
Thanks, but I did a similar thing in my build.sbt; when I give .aggregate(sub-projects) in the parent project definition, it fails to resolve with "recursive value sub-project needs type". Is it because the sub-projects are defined after the parent, or due to an sbt version problem?
– Satish Srinivas, Dec 4 '18 at 11:34
I never had this problem. According to the message, you can set an explicit type for the sub-projects, like: lazy val server: Project = (project in file("server"))....
– pme, Dec 4 '18 at 13:46
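Spelled out against the question's layout, the suggestion in this comment might look like this (a sketch: explicit Project annotations let mutually referencing lazy vals compile regardless of declaration order):

```scala
lazy val parent: Project = Project("spark-etl-parent", file("."))
  .aggregate(etl) // forward reference to etl compiles once the types are explicit

lazy val etl: Project = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
```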
Thanks for your help. I have figured out a way to execute the tasks separately. But I think it matters when we build jars separately for the sub-projects: when I run the package command for a sub-project, it only produces a jar with classes from the child, not the parent classes it depends on. Is building a fat jar with assembly the only way around this?
– Satish Srinivas, Dec 6 '18 at 6:26
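For reference, a minimal sbt-assembly setup along the lines the comment asks about (the plugin version is an assumption for an sbt of that era; sbt-assembly bundles the classes of dependsOn projects into the fat jar and, by default, leaves out "provided" dependencies):

```scala
// project/plugins.sbt (sketch)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")
```

Running the assembly task in a sub-project would then produce a jar containing that module's classes together with the parent classes it depends on.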