I'm having trouble with per-configuration classpath dependencies. I want not only the compile configuration but also the test configuration to depend on the other project.
As soon as I specify the configuration, for example replacing dependsOn(commonSpark) with dependsOn(commonSpark % "compile->compile"), compilation fails with the following error:
~/multiproject/build.sbt:90: error: type mismatch;
found : com.lucidchart.sbtcross.BaseProject
required: sbt.ProjectReference
aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, console, dashboard)
^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
My end goal is to have not only the compile configuration but also the test configuration depend on the other project, e.g. dependsOn(commonSpark % "compile->compile;test->test").
build.sbt snippet that fails:
...
lazy val root = (project in file(".")).
settings(settings: _*).
aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, dashboard)
lazy val commonConfig = (project in file("config")).cross
lazy val commonConfig_2_11 = commonConfig("2.11.8").
settings(libraryDependencies ++= (config ++ specs2))
lazy val commonConfig_2_12 = commonConfig("2.12.2").
settings(libraryDependencies ++= (config ++ specs2))
lazy val commonSpark = (project in file("common-spark")).
dependsOn(commonConfig_2_11).
settings(settings: _*).
settings(libraryDependencies ++= (config ++ logging ++ spark ++ specs2))
lazy val consumer = project.
dependsOn(commonSpark % "compile->compile").
settings(settings: _*).
settings(mainClass in assembly := Some("consumer.consumerRunner")).
settings(assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)).
settings(sparkAssembly).
settings(libraryDependencies ++= (config ++ logging ++ kafka ++ specs2))
...
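For completeness, the consumer definition I am ultimately aiming for differs only in the configuration mapping (trimmed here to the relevant lines); this is exactly the form that currently fails to compile:
lazy val consumer = project.
  dependsOn(commonSpark % "compile->compile;test->test"). // also pull in commonSpark's test classes
  settings(settings: _*).
  settings(libraryDependencies ++= (config ++ logging ++ kafka ++ specs2))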
Is there another way to also depend on the test configuration of another project?
PS
Using sbt-cross 3.0, sbt 0.13.15
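For context, the plugin is enabled in project/plugins.sbt, roughly like this (the com.lucidchart organization/artifact coordinates are assumed from the plugin's published name; version as stated above):
// project/plugins.sbt -- assumed coordinates for sbt-cross; adjust if they differ in your setup
addSbtPlugin("com.lucidchart" % "sbt-cross" % "3.0")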
I ran into something similar to this (or perhaps the same issue), where adding sbt-cross seems to hijack the default Project type returned by calling project(...) in an sbt file, even when .cross is not used on that particular project or one of its dependencies.
The error I got was:
error: value settings is not a member of com.lucidchart.sbtcross.BaseProject
possible cause: maybe a semicolon is missing before `value settings'?
.settings(commonSettings:_*)
And I fixed it by chaining a .project call onto my initial definition like @oschrenk mentioned.
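To make that concrete, here is a rough sketch of the fix, assuming .project is the call that unwraps com.lucidchart.sbtcross.BaseProject back into a plain sbt.Project (not verified against sbt-cross 3.0; my-lib and commonSettings are placeholder names):
// Chain .project onto the initial definition so the usual Project API
// (.settings, use in aggregate(...), etc.) is available again.
lazy val myLib = (project in file("my-lib")).project.
  settings(commonSettings: _*)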