Non-canonical sbt projects #106

Open
pdmack opened this issue Jun 13, 2017 · 0 comments
pdmack commented Jun 13, 2017

The Scala S2I builder's assemble script scans a project for a build.sbt file and a project directory, and exits if it doesn't find them. However, there are older projects, such as:
https://github.com/databricks/spark-perf/tree/master/spark-tests
which could potentially be built with sbt if we treat our S2I as a generic sbt builder.

The trouble is that this undercuts the value of the radanalytics openshift-spark image, which includes Spark 2.x, and in turn the S2I images that use it (Scala only at the moment).

Perhaps it would be enough to add a build parameter that relaxes this strict requirement.
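As a rough sketch of what such a relaxed check might look like in the assemble script: a shell function that still enforces the canonical layout by default, but lets an opt-in environment variable (SBT_RELAXED_BUILD here is a hypothetical name, not an existing builder parameter) allow non-canonical projects through.

```shell
#!/bin/sh
# Hypothetical sketch of a relaxed layout check for the S2I assemble script.
# SBT_RELAXED_BUILD is an assumed parameter name for illustration only.

# check_layout DIR: return 0 if the build may proceed, 1 otherwise.
check_layout() {
    app_dir="$1"
    # Canonical sbt layout: build.sbt plus a project/ directory.
    if [ -f "$app_dir/build.sbt" ] && [ -d "$app_dir/project" ]; then
        return 0
    fi
    # Non-canonical layout: proceed only if the user opted in.
    if [ "${SBT_RELAXED_BUILD:-false}" = "true" ]; then
        echo "warning: non-canonical sbt layout in $app_dir, proceeding anyway" >&2
        return 0
    fi
    echo "error: build.sbt and project/ are required (set SBT_RELAXED_BUILD=true to override)" >&2
    return 1
}
```

The default stays strict, so existing builds are unaffected; only users who explicitly set the parameter in their BuildConfig would get the generic sbt behavior.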
