Problems with per-configuration classpath dependencies #8

Open
oschrenk opened this issue Jul 19, 2017 · 2 comments
oschrenk commented Jul 19, 2017

I'm having trouble with per-configuration classpath dependencies. I want not only the compile configuration but also the test configuration to depend on the other project.

As soon as I specify a configuration, for example replacing dependsOn(commonSpark) with
dependsOn(commonSpark % "compile->compile"), compilation fails with
the following error:

~/multiproject/build.sbt:90: error: type mismatch;
 found   : com.lucidchart.sbtcross.BaseProject
 required: sbt.ProjectReference
  aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, console, dashboard)
                                                               ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression

My end goal is to have not only compilation but also the test configuration depend on the other project,
e.g. dependsOn(commonSpark % "compile->compile;test->test").

build.sbt snippet that fails:

...
lazy val root = (project in file(".")).
  settings(settings: _*).
  aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, dashboard)

lazy val commonConfig = (project in file("config")).cross

lazy val commonConfig_2_11 = commonConfig("2.11.8").
  settings(libraryDependencies ++= (config ++ specs2))

lazy val commonConfig_2_12 = commonConfig("2.12.2").
  settings(libraryDependencies ++= (config ++ specs2))

lazy val commonSpark = (project in file("common-spark")).
  dependsOn(commonConfig_2_11).
  settings(settings: _*).
  settings(libraryDependencies ++= (config ++ logging ++ spark ++ specs2))

lazy val consumer = project.
  dependsOn(commonSpark % "compile->compile").
  settings(settings: _*).
  settings(mainClass in assembly := Some("consumer.consumerRunner")).
  settings(assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)).
  settings(sparkAssembly).
  settings(libraryDependencies ++= (config ++ logging ++ kafka ++ specs2))
...

Is there another way to depend also on the test configuration of another project?

PS: Using sbt-cross 3.0 and sbt 0.13.15.

oschrenk (Author) commented Jul 19, 2017

I found a workaround by referring to the underlying project:

-  aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, console)
+  aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer.project, console)

and

-  dependsOn(commonSpark % "compile->compile;test->test").
+  dependsOn(commonSpark % "compile->compile;test->test").project.
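Applied to the build.sbt snippet above, the workaround might look like this. A sketch, assuming sbt-cross 3.0, where % on a cross-built project yields a com.lucidchart.sbtcross.BaseProject and .project unwraps it back into a plain sbt Project:

```scala
// Sketch of the workaround applied to the earlier snippet.
// `.project` unwraps sbt-cross's BaseProject wrapper into a plain
// sbt.Project, which aggregate(...) accepts as a ProjectReference.
lazy val consumer = project.
  dependsOn(commonSpark % "compile->compile;test->test").project.
  settings(settings: _*).
  settings(libraryDependencies ++= (config ++ logging ++ kafka ++ specs2))

// consumer is now a plain Project, so no extra .project is needed here:
lazy val root = (project in file(".")).
  settings(settings: _*).
  aggregate(commonConfig_2_11, commonConfig_2_12, commonSpark, consumer, dashboard)
```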

worace commented Aug 20, 2020

I ran into something similar to this (or perhaps the same issue), where adding sbt-cross seems to hijack the default Project type returned by calling project(...) in an sbt file, even when .cross is not used on that particular project or any of its dependencies.

The error I got was:

error: value settings is not a member of com.lucidchart.sbtcross.BaseProject
possible cause: maybe a semicolon is missing before `value settings'?
  .settings(commonSettings:_*)

And I fixed it by chaining a .project call onto my initial definition, as @oschrenk mentioned.
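That fix might look like the following sketch. The project name core and the commonSettings value are hypothetical (not from this thread's build); the sketch assumes sbt-cross is on the plugin classpath and the definition has been wrapped into a BaseProject:

```scala
// Sketch, assuming the definition resolves to sbt-cross's BaseProject.
// Chaining .project converts it back into a plain sbt.Project,
// restoring access to .settings and the rest of the Project API.
lazy val core = (project in file("core"))
  .project // BaseProject -> sbt.Project
  .settings(commonSettings: _*)
```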
