java.lang.ClassNotFoundException: Failed to find data source: com.cloudant.spark, in an IBM BigInsights cluster

Asked: 2016-08-05 19:16:50

Tags: ibm-cloud apache-spark-sql cloudant biginsights

I created an IBM BigInsights service instance with a 5-node Hadoop cluster (including Apache Spark). I'm trying to use SparkR to connect to a Cloudant database, fetch some data, and do some processing.

I launched a SparkR shell (terminal) and ran the following code:

sparkR.stop()
# Creating SparkContext and connecting to the Cloudant DB
sc <- sparkR.init(sparkEnv = list("cloudant.host"="<<cloudant-host-name>>", "cloudant.username"="<<cloudant-user-name>>", "cloudant.password"="<<cloudant-password>>", "jsonstore.rdd.schemaSampleSize"="-1"))

# Database to be connected to extract the data
database <- "testdata"
# Creating Spark SQL Context
sqlContext <- sparkRSQL.init(sc)
# Creating DataFrame for the "testdata" Cloudant DB
testDataDF <- read.df(sqlContext, database, header='true', source = "com.cloudant.spark",inferSchema='true')

I get the following error message:

16/08/05 19:00:27 ERROR RBackendHandler: loadDF on org.apache.spark.sql.api.r.SQLUtils failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
  java.lang.ClassNotFoundException: Failed to find data source: com.cloudant.spark. Please find packages at http://spark-packages.org
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
        at org.apache.spark.sql.api.r.SQLUtils$.loadDF(SQLUtils.scala:160)
        at org.apache.spark.sql.api.r.SQLUtils.loadDF(SQLUtils.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
        at org.apache.spark.api.r.RBackendHandler.channelRead0(RBacke

How do I install the spark-cloudant connector on IBM BigInsights and resolve this issue? Any help would be greatly appreciated.

1 Answer:

Answer 0 (score: 0)

You need to pass the name of the package to sparkR.init:

sc <- sparkR.init(sparkPackages="com.databricks:spark-csv_2.11:1.0.3")

See here:

https://spark.apache.org/docs/1.6.0/sparkr.html#from-data-sources

The spark-cloudant package is here:

https://spark-packages.org/package/cloudant-labs/spark-cloudant

For a 4.2 cluster, I believe you need:

cloudant-labs:spark-cloudant:1.6.4-s_2.10
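
Putting the two together, here is a minimal sketch (untested) of what the full SparkR session could look like with the connector loaded. It assumes the 1.6.4-s_2.10 artifact resolves from spark-packages.org and that the placeholder host/credentials are filled in; note that sparkEnvir is the full parameter name in Spark 1.6 (the shorter sparkEnv in the question works via R's partial argument matching):

library(SparkR)

sparkR.stop()

# Pull in the spark-cloudant connector via sparkPackages so Spark
# downloads it at startup, and pass the Cloudant connection settings
# through sparkEnvir as in the original question.
sc <- sparkR.init(
  sparkPackages = "cloudant-labs:spark-cloudant:1.6.4-s_2.10",
  sparkEnvir = list(
    "cloudant.host" = "<<cloudant-host-name>>",
    "cloudant.username" = "<<cloudant-user-name>>",
    "cloudant.password" = "<<cloudant-password>>",
    "jsonstore.rdd.schemaSampleSize" = "-1"
  )
)

sqlContext <- sparkRSQL.init(sc)

# With the package on the classpath, com.cloudant.spark should resolve
# and the ClassNotFoundException should go away.
testDataDF <- read.df(sqlContext, "testdata", source = "com.cloudant.spark")

Alternatively, the same coordinates can be supplied when launching the shell, e.g. sparkR --packages cloudant-labs:spark-cloudant:1.6.4-s_2.10, as described in the SparkR docs linked above.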