dependency com.ankurdave#part_2.10;0.1: #8
IndexedRDD depends on PART, which is stored in a custom repository. I think an update to Spark Packages might have caused it to stop adding that repository automatically along with the IndexedRDD package. For now, you should be able to work around this by adding this repository to your build.sbt:

resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"
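For reference, a minimal build.sbt combining the resolver workaround with the dependencies might look like the sketch below. The Spark version and the IndexedRDD artifact coordinates shown here are illustrative assumptions, not taken from this thread; check your own project for the versions you actually use.

```scala
// build.sbt — minimal sketch, assuming Scala 2.10 and illustrative versions
scalaVersion := "2.10.4"

// Workaround: add the custom repository hosting PART
resolvers += "Repo at github.com/ankurdave/maven-repo" at
  "https://github.com/ankurdave/maven-repo/raw/master"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1" % "provided",
  // Assumed Spark Packages coordinates for IndexedRDD
  "amplab" % "spark-indexedrdd" % "0.3"
)
```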
Hi Ankur, thank you for your prompt response! Yes, that solved my problem. BTW, I should have taken a look at your build.sbt before asking the question; it has some clues there. Thank you!
Thanks for reporting this! I wouldn't have noticed there was a problem otherwise.
I just released version 0.3, which should fix this problem and remove the need for the workaround.
Thanks, it works smoothly right now.
Oh, actually I didn't clear the cache properly after applying the workaround, so the problem is still there.
Huh, I didn't get the compile error, though.
@hywUMD That's probably because you now have part in your cache. If you delete
I still have an issue using this package:
@pazooki You can manually solve this by adding the repository dependency to the .sbt file.
@pazooki Another workaround if you're using spark-shell or spark-submit is to invoke it with
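A sketch of that invocation, based on the --repositories URL mentioned later in this thread; the --packages coordinates are an assumption (the Spark Packages artifact for IndexedRDD), so substitute the coordinates your project actually uses:

```
# Assumed package coordinates; the --repositories URL is from this thread
./bin/spark-shell \
  --packages amplab:spark-indexedrdd:0.3 \
  --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master
```

The same flags work with spark-submit, since both tools share Spark's dependency-resolution options.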
I will try it. Thanks. Would there be a difference if I'm using pyspark?
Nope.
With this command it should work: ./pyspark --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master ... It initializes with scala>
Maven compilation error when using the latest Spark, which is built for Scala 2.11. Please help. part_2.10-0.1.jar in MyProject's build path is cross-compiled with an incompatible version of Scala (2.10.0). Unknown Scala version problem.
This is still an issue in 0.4.0. It would be most helpful if this could be bundled and published as 0.4.1. Can this be done please? I am happy to help.
Hi All,
When I added the indexedrdd dependency in the sbt configuration file and compiled my Scala code, it keeps giving me this error:
[error] sbt.ResolveException: unresolved dependency: com.ankurdave#part_2.10;0.1: not found
[error] Total time: 6 s, completed Sep 9, 2015 11:01:37 PM
What does this mean? Any ideas?
Thank you!
Hong