Getting Spark, Java and MongoDB to work together

Date: 2015-12-04 17:19:47

Tags: java mongodb maven hadoop apache-spark

Similar to my question here, but this time it is Java, not Python, that is giving me problems.

I have followed the steps suggested here (as far as I can tell), but because I am using hadoop-2.6.1 I think I should be using the old API rather than the new API mentioned in the example.
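
For reference, this is how I understand the two variants are called from Java. It is only a rough sketch, not my actual program: the class name ApiComparison and the URI are placeholders, and it assumes the mongo-hadoop connector jar is on the classpath. As I understand it, the new-style input format lives in com.mongodb.hadoop and pairs with newAPIHadoopRDD, while the old-style one lives in com.mongodb.hadoop.mapred and pairs with hadoopRDD:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.bson.BSONObject;

    public class ApiComparison {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setAppName("mongo-hadoop API comparison"));

            // New ("mapreduce") API: newAPIHadoopRDD + com.mongodb.hadoop.MongoInputFormat
            Configuration newApiConf = new Configuration();
            newApiConf.set("mongo.input.uri", "mongodb://localhost:27017/db.collection"); // placeholder URI
            JavaPairRDD<Object, BSONObject> viaNewApi = sc.newAPIHadoopRDD(
                    newApiConf,
                    com.mongodb.hadoop.MongoInputFormat.class,
                    Object.class,
                    BSONObject.class);

            // Old ("mapred") API: hadoopRDD + com.mongodb.hadoop.mapred.MongoInputFormat
            JobConf oldApiConf = new JobConf();
            oldApiConf.set("mongo.input.uri", "mongodb://localhost:27017/db.collection"); // placeholder URI
            JavaPairRDD<Object, BSONObject> viaOldApi = sc.hadoopRDD(
                    oldApiConf,
                    com.mongodb.hadoop.mapred.MongoInputFormat.class,
                    Object.class,
                    BSONObject.class);

            // Touch both RDDs so the example does something observable.
            System.out.println(viaNewApi.count() + " documents via the new API, "
                    + viaOldApi.count() + " via the old API");
        }
    }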

I am running Ubuntu, and the versions of the various components I have are:

  • Spark spark-1.5.1-bin-hadoop2.6
  • Hadoop hadoop-2.6.1
  • Mongo 3.0.8
  • Mongo-Hadoop connector jar, included via Maven
  • Java 1.8.0_66
  • Maven 3.0.5

My Java program is basic:


    import org.apache.spark.api.java.*;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.function.Function;
    import com.mongodb.hadoop.MongoInputFormat;
    import org.apache.hadoop.conf.Configuration;
    import org.bson.BSONObject;

    public class SimpleApp {
        public static void main(String[] args) {
            Configuration mongodbConfig = new Configuration();
            mongodbConfig.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");
            mongodbConfig.set("mongo.input.uri", "mongodb://localhost:27017/db.collection");

            SparkConf conf = new SparkConf().setAppName("Simple Application");
            JavaSparkContext sc = new JavaSparkContext(conf);

            JavaPairRDD<Object, BSONObject> documents = sc.newAPIHadoopRDD(
                mongodbConfig,           // Configuration
                MongoInputFormat.class,  // InputFormat: read from a live cluster
                Object.class,            // Key class
                BSONObject.class         // Value class
            );
        }
    }

It builds fine with Maven

mvn package

with the following pom file:

<project>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.1</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.mongodb.mongo-hadoop</groupId>
      <artifactId>mongo-hadoop-core</artifactId>
      <version>1.4.2</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
  </build>
</project>

I then submit the jar with

/usr/local/share/spark-1.5.1-bin-hadoop2.6/bin/spark-submit --class "SimpleApp" --master local[4] target/simple-project-1.0.jar

and get an error.

Note

I edited this question on 18 December because it had become too confusing and verbose. Earlier comments may therefore look irrelevant, but the background to the problem is unchanged.

1 Answer:

Answer 0 (score: 3)

I ran into the same problem, but after a lot of trial and error I got my job working with this code. I am running a Maven project with NetBeans on Ubuntu and Java 7. Hope this helps.

如果b / w类有任何冲突,请包含maven-shade-plugin

P.S.: I do not know your particular error, but I ran into plenty of my own. This code runs fine for me.

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.14</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb.mongo-hadoop</groupId>
            <artifactId>mongo-hadoop-core</artifactId>
            <version>1.4.1</version>
        </dependency>
    </dependencies>

Java code:

    Configuration conf = new Configuration();
    conf.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");
    conf.set("mongo.input.uri", "mongodb://localhost:27017/databasename.collectionname");

    SparkConf sconf = new SparkConf().setMaster("local").setAppName("Spark UM Jar");
    JavaSparkContext sc = new JavaSparkContext(sconf);

    JavaRDD<User> UserMaster = sc.newAPIHadoopRDD(conf, MongoInputFormat.class, Object.class, BSONObject.class)
            .map(new Function<Tuple2<Object, BSONObject>, User>() {
                @Override
                public User call(Tuple2<Object, BSONObject> v1) throws Exception {
                    return null; // placeholder: build and return a User from v1._2() here
                }
            });
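
The User type is not shown in the answer. Purely as a hypothetical illustration (the class, its fields name and age, and the helper DocToUser are all made up for this example), the mapping body might look something like this:

    import org.apache.spark.api.java.function.Function;
    import org.bson.BSONObject;
    import scala.Tuple2;

    // Hypothetical model class; the fields "name" and "age" are examples only.
    public class User implements java.io.Serializable {
        private final String name;
        private final int age;

        public User(String name, int age) {
            this.name = name;
            this.age = age;
        }
    }

    // One possible body for the map function above (field names are examples).
    class DocToUser implements Function<Tuple2<Object, BSONObject>, User> {
        @Override
        public User call(Tuple2<Object, BSONObject> v1) throws Exception {
            BSONObject doc = v1._2();                      // the MongoDB document for this record
            return new User(
                    (String) doc.get("name"),              // example field
                    ((Number) doc.get("age")).intValue()); // example field
        }
    }

With something like that in place, the anonymous Function above could simply be replaced by .map(new DocToUser()).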