Warning: Multiple versions of scala libraries detected?

Asked: 2017-04-10 09:58:58

Tags: scala maven apache-spark intellij-idea

My project uses Maven + IntelliJ IDEA, and I develop on Windows. At first I used the latest Scala library, version 2.12.2. To get classes such as SQLContext, I had to import the Spark jar: (screenshot)

But then I was told that if I want to use this Spark jar, I have to downgrade my Scala version, so I removed that library and switched to 2.10... But now when I run mvn clean install, I get this:

[WARNING]  Expected all dependencies to require Scala version: 2.11.7
[WARNING]  com.twitter:chill_2.11:0.5.0 requires scala version: 2.11.7
[WARNING]  com.typesafe.akka:akka-remote_2.11:2.3.11 requires scala version: 2.11.7
[WARNING]  com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.7
[WARNING]  com.typesafe.akka:akka-slf4j_2.11:2.3.11 requires scala version: 2.11.7
[WARNING]  org.apache.spark:spark-core_2.11:1.6.1 requires scala version: 2.11.7
[WARNING]  org.json4s:json4s-jackson_2.11:3.2.10 requires scala version: 2.11.7
[WARNING]  org.json4s:json4s-core_2.11:3.2.10 requires scala version: 2.11.7
[WARNING]  org.json4s:json4s-ast_2.11:3.2.10 requires scala version: 2.11.7
[WARNING]  org.json4s:json4s-core_2.11:3.2.10 requires scala version: 2.11.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] E:\...\src\main\scala:-1: info: compiling
[INFO] Compiling 4 source files to E:\...\target\classes at 1491813951772
[ERROR] E:\...\qubole\mapreduce\ConvertToParquetFormat.scala:2: error: object sql is not a member of package org.apache.spark
[ERROR] import org.apache.spark.sql.SQLContext
[ERROR]                         ^
[ERROR] E:\...\qubole\mapreduce\ConvertToParquetFormat.scala:15: error: not found: type SQLContext
[ERROR]   val sqlContext = new SQLContext(sc)
[ERROR]                        ^
[ERROR] E:\...\mapreduce\ConvertToParquetFormat.scala:24: error: value toDF is not a member of org.apache.spark.rdd.RDD[....qubole.mapreduce.ConvertToParquetFormat.OmnitureHit]
[ERROR] possible cause: maybe a semicolon is missing before `value toDF'?
[ERROR]   .toDF().write.parquet ("file:///C:/Users/Desktop/456")
[ERROR]    ^
[ERROR] three errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:00 min
[INFO] Finished at: 2017-04-10T16:45:58+08:00
[INFO] Final Memory: 33M/360M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project packages-omniture-qubole-mapreduce: wrap: org.apache.commons.exec.ExecuteException: Process exited with an er
ror: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

E:\github\mia\packages-omniture-qubole-mapreduce>

So I removed the Scala 2.10 version, and then it told me I didn't have a Scala library at all. Then I added Scala 2.11, because it looks like some of the jars still require version 2.11 ("Multiple versions of scala libraries detected"?).

But when I Ctrl + left-click the word SQLContext, I can jump to its page, but it is anonymous.

(screenshots) Is that because I forgot to remove something related to the old jars or libraries? Here is the list of dependencies I have left: (screenshots)

Here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.company.www</groupId>
    <artifactId>packages-omniture-qubole-mapreduce</artifactId>
    <version>1.0-SNAPSHOT</version>

    <parent>
        <groupId>com.company.www.platform</groupId>
        <artifactId>platform-parent-spark</artifactId>
        <version>0.1.41</version>
    </parent>

    <properties>
        <spark.mapreduce.mainclass>com.company.www.packages.omniture.qubole.mapreduce.SampleMapReduceJob</spark.mapreduce.mainclass>
    </properties>


    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.major.minor.version}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.company.www.commons</groupId>
            <artifactId>commons-spark</artifactId>
            <version>[1.0.17, ]</version>
        </dependency>
        <dependency>
            <groupId>com.company.www</groupId>
            <artifactId>exp-user-interaction-messages-v1</artifactId>
            <version>[1.4,]</version>
        </dependency>
        <dependency>
            <groupId>org.scalaj</groupId>
            <artifactId>scalaj-http_${scala.major.minor.version}</artifactId>
            <version>1.1.4</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.3</version>
        </dependency>
        <dependency>
            <groupId>org.parboiled</groupId>
            <artifactId>parboiled-java</artifactId>
            <version>1.0.2</version>
            <scope>test</scope>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <finalName>packages-omniture-qubole-mapreduce</finalName>
                            <shadedArtifactAttached>false</shadedArtifactAttached>
                            <artifactSet>
                                <includes>
                                    <include>*:*</include>
                                </includes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>reference.conf</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer">
                                    <resource>log4j.properties</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>${spark.mapreduce.mainclass}</mainClass>
                                </transformer>
                            </transformers>
                            <relocations>
                                <relocation>
                                    <pattern>org.eclipse.jetty</pattern>
                                    <shadedPattern>org.spark-project.jetty</shadedPattern>
                                    <includes>
                                        <include>org.eclipse.jetty.**</include>
                                    </includes>
                                </relocation>
                                <relocation>
                                    <pattern>com.google.common</pattern>
                                    <shadedPattern>org.spark-project.guava</shadedPattern>
                                    <excludes>
                                        <exclude>com/google/common/base/Absent*</exclude>
                                        <exclude>com/google/common/base/Function</exclude>
                                        <exclude>com/google/common/base/Optional*</exclude>
                                        <exclude>com/google/common/base/Present*</exclude>
                                        <exclude>com/google/common/base/Supplier</exclude>
                                    </excludes>
                                </relocation>
                            </relocations>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

</project>

2 Answers:

Answer 0 (score: 5)

Include the following in your pom:

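A minimal sketch of what such a pin might look like, assuming the goal is to force every module onto a single Scala 2.11.x line. The scala.major.minor.version and spark.version property names simply mirror the ones already referenced in the pom above, and the exact version numbers (2.11.7, 1.6.1, taken from the build log) are placeholders:

    <properties>
        <scala.version>2.11.7</scala.version>
        <scala.major.minor.version>2.11</scala.major.minor.version>
        <spark.version>1.6.1</spark.version>
    </properties>

    <dependencies>
        <!-- keep the Scala runtime in line with every _2.11 artifact -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!-- SQLContext lives in the spark-sql artifact, not spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.major.minor.version}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>

With the Scala major version expressed once as a property, every _2.11 artifact (spark-core, spark-sql, scalaj-http, ...) stays on the same line, which is what the "Multiple versions of scala libraries detected" warning is complaining about.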

Answer 1 (score: 0)

The problem is probably those com.company.www dependencies (and by not specifying their exact versions you make it worse). They hard-code a specific Scala version, and you have to look at their dependency trees to find out which one (look for a _2.10, _2.11 or _2.12 suffix in the artifact names).

Assuming these are your company's own packages, you either need to build separate artifacts for the different Scala versions, or settle on one specific Scala version (for example, by requiring a common parent POM for all Spark projects).
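As a rough sketch of that idea, assuming the shared parent POM defines scala.major.minor.version, the in-house artifacts could carry the Scala suffix in their artifactId and be pinned to exact versions instead of open ranges (the version numbers below are only placeholders derived from the ranges in the pom above):

        <dependency>
            <groupId>com.company.www.commons</groupId>
            <!-- one artifact per Scala line, e.g. commons-spark_2.11 -->
            <artifactId>commons-spark_${scala.major.minor.version}</artifactId>
            <version>1.0.17</version>
        </dependency>
        <dependency>
            <groupId>com.company.www</groupId>
            <artifactId>exp-user-interaction-messages-v1</artifactId>
            <version>1.4</version>
        </dependency>

Running mvn dependency:tree -Dincludes=org.scala-lang is a quick way to see which of these dependencies still drags in a different scala-library version.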