The Scala S2I builder's assemble script scans the project for a `build.sbt` file and a `project` directory, and exits if it doesn't find them. However, there are older projects, such as https://github.com/databricks/spark-perf/tree/master/spark-tests, which could potentially be built with sbt if we view our S2I image as a generic sbt builder.

The trouble is that doing so obviates the value of the radanalytics openshift-spark image, which includes Spark 2.x, and in turn of the S2I images that use it (only Scala at the moment).

Perhaps it is enough to just add a build parameter that relaxes the strict layout check noted above.
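A minimal sketch of what such an opt-out could look like in the assemble script. The parameter name `GENERIC_SBT_BUILD` is hypothetical (nothing like it exists in the builder today), and the exact files checked may differ from the real script:

```bash
#!/bin/bash
# Hypothetical excerpt from the S2I assemble script.
# GENERIC_SBT_BUILD is an assumed build parameter: when set to "true",
# the strict project-layout check is skipped so that any sbt-buildable
# source tree is accepted.
if [ "${GENERIC_SBT_BUILD:-false}" != "true" ]; then
    if [ ! -f build.sbt ] || [ ! -d project ]; then
        echo "error: expected build.sbt and a project/ directory;" \
             "set GENERIC_SBT_BUILD=true to skip this check"
        exit 1
    fi
fi
# ... rest of the assemble script (sbt build, artifact copy) continues here
```

Since S2I builds pick up environment variables from the BuildConfig, the parameter could presumably be set per build, e.g. `oc new-build ... --env GENERIC_SBT_BUILD=true`, leaving the default behavior strict.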