"Conflicting cross-version suffixes" error in Spark with sbt using elasticsearch-hadoop

I have an sbt project with several submodules. I am using Spark, and I recently tried to upgrade to Spark 2.0.0, which requires Scala 2.11 instead of Scala 2.10. Here is my sbt configuration:

project/commons.scala

import sbt._ 
import Keys._ 

object Commons { 
    val appVersion = "0.0.2" 

    val settings: Seq[Def.Setting[_]] = Seq(
    version := appVersion, 
    scalaVersion := "2.11.8", 
    resolvers += Opts.resolver.mavenLocalFile, 
    resolvers += "conjars" at "http://conjars.org/repo", 
    resolvers += "clojars" at "https://clojars.org/repo" 
) 
} 

project/dependencies.scala

import sbt._ 
import Keys._ 

object Dependencies { 
    val sparkVersion = "2.0.0" 
    val awsVersion = "1.11.12" 
    val sprayVersion = "1.3.2" 
    val hiveVersion = "2.1.0" 
    val hadoopVersion = "2.7.2" 
    val esVersion = "2.3.2" 

    val sparkCoreDependency: ModuleID = "org.apache.spark" %% "spark-core" % sparkVersion % "provided" 
    val sparkMLDependency: ModuleID = "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided" 
    val awsDependency: ModuleID = "com.amazonaws" % "aws-java-sdk" % awsVersion 
    val sprayJsonDependency: ModuleID = "io.spray" %% "spray-json" % sprayVersion 
    val hiveExecDependency: ModuleID = "org.apache.hive" % "hive-exec" % hiveVersion % "provided" 
    val hadoopCommonDependency: ModuleID = "org.apache.hadoop" % "hadoop-common" % hadoopVersion % "provided" 
    val esHadoopDependency: ModuleID = "org.elasticsearch" % "elasticsearch-hadoop" % esVersion % "provided" 
} 

build.sbt

import Dependencies._ 

lazy val utils = (project in file("utils")). 
    settings(Commons.settings: _*). 
    settings(
    name := "hadoop-utils", 
    libraryDependencies ++= Seq(
     hiveExecDependency, 
     sparkCoreDependency, 
     awsDependency, 
     hadoopCommonDependency, 
     esHadoopDependency 
    ) 
) 

lazy val ingestion = (project in file("ingestion")). 
    settings(Commons.settings: _*). 
    settings(
    name := "datascience-ingestion", 
    libraryDependencies ++= Seq(
     sparkCoreDependency, 
     awsDependency, 
     sprayJsonDependency 
    ) 
). 
    dependsOn(utils) 

lazy val hlda = (project in file("hlda")). 
    settings(Commons.settings: _*). 
    settings(
    name := "hlda", 
    libraryDependencies ++= Seq(
     sparkCoreDependency, 
     sparkMLDependency, 
     awsDependency, 
     sprayJsonDependency 
    ), 
    assemblyShadeRules in assembly := Seq(
     // NOTE: the rename target here is illustrative ("@1" is sbt-assembly's capture
     // placeholder); the original target string was garbled in the source page
     ShadeRule.rename("org.apache.http.**" -> "shaded.org.apache.http.@1").inAll
    )
) 

When I try to compile, I get:

$ sbt utils/compile 
[info] Loading global plugins from ~/.sbt/0.13/plugins 
[info] Loading project definition from /<snip>/project 
[info] Compiling 2 Scala sources to /<snip>/project/target/scala-2.10/sbt-0.13/classes... 
[info] Set current project to datascience-ingestion (in build file:/<snip>/) 
[info] Updating {file:/<snip>/}utils... 
[info] Resolving jline#jline;2.12.1 ... 
[info] Done updating. 
[error] Modules were resolved with conflicting cross-version suffixes in {file:/<snip>/}utils: 
[error] org.apache.spark:spark-launcher _2.11, _2.10 
[error] org.json4s:json4s-ast _2.11, _2.10 
[error] org.apache.spark:spark-network-shuffle _2.11, _2.10 
[error] com.twitter:chill _2.11, _2.10 
[error] org.json4s:json4s-jackson _2.11, _2.10 
[error] com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10 
[error] org.json4s:json4s-core _2.11, _2.10 
[error] org.apache.spark:spark-unsafe _2.11, _2.10 
[error] org.apache.spark:spark-core _2.11, _2.10 
[error] org.apache.spark:spark-network-common _2.11, _2.10 
java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common 
    at scala.sys.package$.error(package.scala:27) 
    at sbt.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:46) 
    at sbt.ConflictWarning$.apply(ConflictWarning.scala:32) 
    at sbt.Classpaths$$anonfun$69.apply(Defaults.scala:1219) 
    at sbt.Classpaths$$anonfun$69.apply(Defaults.scala:1216) 
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47) 
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40) 
    at sbt.std.Transform$$anon$4.work(System.scala:63) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228) 
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17) 
    at sbt.Execute.work(Execute.scala:237) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228) 
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159) 
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
[error] (utils/*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common 
[error] Total time: 4 s, completed Aug 11, 2016 10:43:16 AM 

I tried "hardcoding" the version suffix, e.g.:

val sparkCoreDependency: ModuleID = "org.apache.spark" % "spark-core_2.11" % sparkVersion % "provided" 

But I got the same error. All of this worked before I changed the Spark and Scala version numbers.
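(For reference, "%%" simply appends the Scala binary version to the artifact name, so the hardcoded form requests exactly the same artifact, and neither spelling changes what a transitive dependency pulls in:)

// the two declarations below resolve to the identical artifact when scalaVersion is 2.11.x
"org.apache.spark" %% "spark-core" % sparkVersion          // expands to spark-core_2.11
"org.apache.spark" %  "spark-core_2.11" % sparkVersion     // suffix written out by hand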

Have you tried 'sbt clean clean-files' before compiling? –

Yes. It didn't change anything. –

You can use https://github.com/jrudolph/sbt-dependency-graph to find out where the 2.10 dependencies are coming from. –
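A minimal setup for that suggestion (assuming sbt 0.13, for which 0.8.2 was a contemporary release of the plugin): add the plugin to project/plugins.sbt, then print the dependency tree of the failing module.

project/plugins.sbt

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

$ sbt utils/dependencyTree

Each library in the output is shown under the dependency that pulled it in, so the _2.10 artifacts can be traced back to whichever direct dependency declares them.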

Answer

That's because elasticsearch-hadoop 2.3.2 depends on spark-core_2.10 and spark-sql_2.10. You will probably need to either use Scala 2.10 or fork elasticsearch-hadoop.
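A third option, if it fits how you use the connector, is to keep Scala 2.11 and exclude the conflicting _2.10 transitives from elasticsearch-hadoop. A sketch (assuming the excluded Spark 2.10 classes are never needed at runtime):

val esHadoopDependency: ModuleID =
    ("org.elasticsearch" % "elasticsearch-hadoop" % esVersion % "provided")
     .excludeAll(
      // drop the transitive spark-core_2.10/spark-sql_2.10 so that only the
      // build's own _2.11 Spark artifacts end up on the classpath
      ExclusionRule(organization = "org.apache.spark")
     )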
