
dependency com.ankurdave#part_2.10;0.1: #8

Open
hywUMD opened this issue Sep 10, 2015 · 16 comments
@hywUMD

hywUMD commented Sep 10, 2015

Hi All,

When I added the indexedrdd dependency to my sbt configuration file and compiled my Scala project, it kept giving me this error:

error sbt.ResolveException: unresolved dependency: com.ankurdave#part_2.10;0.1: not found
[error] Total time: 6 s, completed Sep 9, 2015 11:01:37 PM

What does this mean? Any ideas?
Thank you!
Hong

@ankurdave
Member

IndexedRDD depends on PART, which is stored in a custom repository. I think an update to Spark Packages might have caused it to stop adding that repository automatically along with the IndexedRDD package.

For now, you should be able to work around this by adding this repository to your build.sbt:

resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"
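For context, a minimal build.sbt with this workaround applied might look like the sketch below. The Scala and Spark versions shown are assumptions (chosen to match the part_2.10 artifact and the Spark 1.5.1 logs later in this thread); adjust them to your setup. The Spark Packages resolver URL is the one Ivy reports in the resolution log below.

```scala
// build.sbt -- hypothetical minimal project; versions are assumptions.
scalaVersion := "2.10.4"  // PART is published for Scala 2.10 (part_2.10)

// Workaround: repository hosting the PART artifact
resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"

// Repository hosting the IndexedRDD package itself
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1" % "provided",
  "amplab" % "spark-indexedrdd" % "0.3"
)
```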

@hywUMD
Author

hywUMD commented Sep 10, 2015

Hi Ankur,

Thank you for your prompt response! Yes, that solved my problem. BTW, I should have taken a look at your build.sbt before asking the question. It has some clues there.

Thank you!
Hong

@ankurdave
Member

Thanks for reporting this! I wouldn't have noticed there was a problem otherwise.

@ankurdave
Member

I just released version 0.3, which should fix this problem and remove the need for the workaround.

@hywUMD
Author

hywUMD commented Sep 10, 2015

Thanks, it works smoothly right now.

@ankurdave
Member

Oh, actually I didn't clear the cache properly after applying the workaround, so the problem is still there.

@ankurdave ankurdave reopened this Sep 10, 2015
@hywUMD
Author

hywUMD commented Sep 10, 2015

Huh, I didn't get the compile error, though.

@brkyvz
Contributor

brkyvz commented Sep 10, 2015

@hywUMD That's probably because you now have PART in your cache. If you delete ~/.ivy2/cache/com.ankurdave, you will see the error again.
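A sketch of reproducing the failure by clearing the cached artifact. The real path is ~/.ivy2/cache/com.ankurdave, as noted above; a temporary directory stands in for it here so the example is safe to run as-is.

```shell
# Sketch: clear the cached PART artifact so Ivy must resolve it again.
# A temp directory stands in for ~/.ivy2 so this is safe to run as-is.
IVY_HOME="$(mktemp -d)"
CACHE="$IVY_HOME/cache/com.ankurdave"
mkdir -p "$CACHE"          # stand-in for the cached artifact directory
rm -rf "$CACHE"            # the equivalent command against ~/.ivy2/cache
test ! -d "$CACHE" && echo "cache cleared"
# After clearing the real cache, re-running sbt without the extra
# resolver reproduces: unresolved dependency com.ankurdave#part_2.10;0.1
```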

@pazooki

pazooki commented Oct 13, 2015

I still have an issue using this package:

/opt/spark-1.5.1-bin-hadoop2.6# /opt/spark-1.5.1-bin-hadoop2.6/bin/spark-shell --packages amplab:spark-indexedrdd:0.3
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.5.1-bin-hadoop2.6/lib/spark-assembly-1.5.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
amplab#spark-indexedrdd added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found amplab#spark-indexedrdd;0.3 in spark-packages
downloading http://dl.bintray.com/spark-packages/maven/amplab/spark-indexedrdd/0.3/spark-indexedrdd-0.3.jar ...
        [SUCCESSFUL ] amplab#spark-indexedrdd;0.3!spark-indexedrdd.jar (243ms)
:: resolution report :: resolve 1219ms :: artifacts dl 245ms
        :: modules in use:
        amplab#spark-indexedrdd;0.3 from spark-packages in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   2   |   1   |   1   |   0   ||   1   |   1   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
                module not found: com.ankurdave#part_2.10;0.1

        ==== local-m2-cache: tried

          file:/root/.m2/repository/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom

          -- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:

          file:/root/.m2/repository/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar

        ==== local-ivy-cache: tried

          /root/.ivy2/local/com.ankurdave/part_2.10/0.1/ivys/ivy.xml

        ==== central: tried

          https://repo1.maven.org/maven2/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom

          -- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:

          https://repo1.maven.org/maven2/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar

        ==== spark-packages: tried

          http://dl.bintray.com/spark-packages/maven/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom

          -- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:

          http://dl.bintray.com/spark-packages/maven/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: com.ankurdave#part_2.10;0.1: not found

                ::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.ankurdave#part_2.10;0.1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1009)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

@hywUMD
Author

hywUMD commented Oct 13, 2015

@pazooki You can solve this manually by adding the repository and the dependency to your .sbt file:
resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"
libraryDependencies += "com.ankurdave" %% "part" % "0.1"

@ankurdave
Member

@pazooki Another workaround if you're using spark-shell or spark-submit is to invoke it with --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master.
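Combining that flag with the --packages invocation from the log above might look like the following sketch. SPARK_HOME and the 0.3 version are assumptions; adjust them to your installation.

```shell
# Hypothetical invocation: pass the extra repository so Ivy can resolve
# the PART dependency that spark-packages does not host.
"$SPARK_HOME"/bin/spark-shell \
  --packages amplab:spark-indexedrdd:0.3 \
  --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master
```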

@pazooki

pazooki commented Oct 13, 2015

I will try it. Thanks. Would there be a difference if I'm using pyspark?


@brkyvz
Contributor

brkyvz commented Oct 13, 2015

Nope, ./pyspark --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master ... should work.

@shermilaguerra

With that command (./pyspark --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master ...) it initializes with the scala> prompt.

@sapthrishi

Maven compilation error when using the latest Spark, which is built for Scala 2.11. Please help.

part_2.10-0.1.jar of MyProject build path is cross-compiled with an incompatible version of Scala (2.10.0). Unknown Scala Version Problem

@bitdivine

This is still an issue in 0.4.0. It would be most helpful if a fix could be bundled and published as 0.4.1. Could this be done, please? I am happy to help.
