Phoenix "org.apache.phoenix.spark.DefaultSource" error

Asked: 2016-12-20 10:37:11

Tags: apache-spark hbase phoenix

I am new to Phoenix, and I am trying to load an HBase table into Phoenix. When I run the load from Spark, I get the following error:

java.lang.ClassNotFoundException: org.apache.phoenix.spark.DefaultSource

My code:

package com.vas.reports

import java.sql.DriverManager

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.phoenix.spark._
import com.google.common.collect.ImmutableMap
import org.apache.hadoop.hbase.filter.{Filter, FilterBase}
import org.apache.phoenix.query.QueryConstants
import org.apache.phoenix.filter.ColumnProjectionFilter
import org.apache.phoenix.hbase.index.util.{ImmutableBytesPtr, VersionUtil}

object PhoenixRead {
  case class Record(NO: Int, NAME: String, DEPT: Int)

  def main(args: Array[String]) {
    val sc = new SparkContext("local", "phoenixsample")
    val sqlcontext = new SQLContext(sc)
    val numWorkers = sc.getExecutorStorageStatus
      .map(_.blockManagerId.executorId)
      .filter(_ != "driver")
      .length
    import sqlcontext.implicits._

    // Build a small local DataFrame to verify that Spark itself works.
    val df1 = sc.parallelize(List((2, "Varun", 58), (3, "Alice", 45), (4, "kumar", 55)))
      .toDF("NO", "NAME", "DEPT")
    df1.show()
    println(numWorkers)
    println("printing df2")

    // Load the Phoenix table; this is the line that throws
    // ClassNotFoundException: org.apache.phoenix.spark.DefaultSource.
    val df = sqlcontext.load(
      "org.apache.phoenix.spark",
      Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
    df.show()
  }
}
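
For reference, phoenix-spark also supports the DataFrameReader API; a minimal sketch of the same read, keeping the question's placeholder table name and zkUrl:

// Sketch only: assumes the phoenix-spark jar is on the classpath.
val df2 = sqlcontext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
  .load()
df2.show()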

spark-submit command:

spark-submit --class com.vas.reports.PhoenixRead --jars /home/hadoop1/phoenix-core-4.4.0-HBase-1.1.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar
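
The DefaultSource class lives in the phoenix-spark module rather than phoenix-core, so one hedged variant of the submit command is to ship that jar as well (the /home/hadoop1 path below is an assumption mirroring the original command):

spark-submit --class com.vas.reports.PhoenixRead --jars /home/hadoop1/phoenix-spark-4.4.0-HBase-1.1.jar,/home/hadoop1/phoenix-core-4.4.0-HBase-1.1.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar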


Please look into this and advise me.

1 answer:

Answer 0 (score: 0)

This happens because you need to add the following library files to HBASE_HOME/lib and SPARK_HOME/lib (a copy-command sketch follows the lists).

In HBASE_HOME/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-server.jar

In SPARK_HOME/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-client.jar
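
A minimal shell sketch of the steps above (source paths are placeholders; match the versions to your Phoenix build, and restart HBase after adding the server jar so the region servers pick it up):

# Copy the Phoenix jars into the HBase and Spark library directories.
cp phoenix-spark-4.7.0-HBase-1.1.jar $HBASE_HOME/lib/
cp phoenix-4.7.0-HBase-1.1-server.jar $HBASE_HOME/lib/
cp phoenix-spark-4.7.0-HBase-1.1.jar $SPARK_HOME/lib/
cp phoenix-4.7.0-HBase-1.1-client.jar $SPARK_HOME/lib/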