java.lang.NoClassDefFoundError: Could not initialize class - spark / scala

Posted: 2017-11-01 22:14:11

Tags: scala maven apache-spark dataframe apache-spark-sql

I am new to Spark/Scala development. I am using Maven to build my project, and my IDE is IntelliJ. I am trying to query a Hive table and then iterate over the resulting DataFrame (using foreach). Here is my code:

try {
  val DF_1 = hiveContext.sql(
    "select distinct(address) from test_table where trim(address) != ''")
  println("number of rows: " + DF_1.count)
  DF_1.foreach(x => {
    val y = hiveContext.sql(
      "select place from test_table where address = '" + x(0).toString + "'")
    if (y.count > 1) {
      println("Multiple place values for address: " + x(0).toString)
      y.foreach(r => println(r))
      println("*************")
    }
  })
}
catch { case e: Exception => e.printStackTrace() }
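For reference, the check the loop performs, finding addresses that map to more than one place, is essentially a group-and-filter. A plain-Scala sketch over in-memory (address, place) pairs, with no Spark involved; `MultiPlaceCheck` and the sample rows are hypothetical names for illustration:

```scala
// Hypothetical in-memory stand-in for test_table rows: (address, place) pairs.
object MultiPlaceCheck {
  // Returns each non-blank address that has more than one distinct place.
  def multiPlaceAddresses(rows: Seq[(String, String)]): Map[String, Set[String]] =
    rows
      .filter { case (addr, _) => addr.trim.nonEmpty }   // drop blank addresses
      .groupBy(_._1)                                     // group rows by address
      .map { case (addr, ps) => addr -> ps.map(_._2).toSet } // distinct places
      .filter { case (_, places) => places.size > 1 }    // keep only multi-place

  def main(args: Array[String]): Unit = {
    val rows = Seq(("a1", "p1"), ("a1", "p2"), ("a2", "p3"), (" ", "p4"))
    println(multiPlaceAddresses(rows))
  }
}
```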

On each iteration I query the same table for another column, trying to see whether any address has multiple place values in test_table. I get no compile errors and the application builds successfully. However, when I run the code above, I get the following error:

java.lang.NoClassDefFoundError: Could not initialize class xxxxxxxx

The application starts successfully, prints the DF_1 row count, and then fails with the above error inside the foreach loop. I ran jar xvf on my jar and can see the main class, driver.class:

com/.../driver$$anonfun$1$$anonfun$apply$1.class
com/.../driver$$anonfun$1.class
com/.../driver$$anonfun$2.class
com/.../driver$$anonfun$3.class
com/.../driver$$anonfun$4.class
com/.../driver$$anonfun$5.class
com/.../driver$$anonfun$main$1$$anonfun$apply$1.class
com/.../driver$$anonfun$main$1$$anonfun$apply$2.class
com/.../driver$$anonfun$main$1$$anonfun$apply$3.class
com/.../driver$$anonfun$main$1.class
com/.../driver$$anonfun$main$10$$anonfun$apply$9.class
com/.../driver$$anonfun$main$10.class
com/.../driver$$anonfun$main$11.class
com/.../driver$$anonfun$main$12.class
com/.../driver$$anonfun$main$13.class
com/.../driver$$anonfun$main$14.class
com/.../driver$$anonfun$main$15.class
com/.../driver$$anonfun$main$16.class
com/.../driver$$anonfun$main$17.class
com/.../driver$$anonfun$main$18.class
com/.../driver$$anonfun$main$19.class
com/.../driver$$anonfun$main$2$$anonfun$apply$4.class
com/.../driver$$anonfun$main$2$$anonfun$apply$5.class
com/.../driver$$anonfun$main$2$$anonfun$apply$6.class
com/.../driver$$anonfun$main$2.class
com/.../driver$$anonfun$main$20.class
com/.../driver$$anonfun$main$21.class
com/.../driver$$anonfun$main$22.class
com/.../driver$$anonfun$main$23.class
com/.../driver$$anonfun$main$3$$anonfun$apply$7.class
com/.../driver$$anonfun$main$3$$anonfun$apply$8.class
com/.../driver$$anonfun$main$3.class
com/.../driver$$anonfun$main$4$$anonfun$apply$9.class
com/.../driver$$anonfun$main$4.class
com/.../driver$$anonfun$main$5.class
com/.../driver$$anonfun$main$6$$anonfun$apply$1.class
com/.../driver$$anonfun$main$6$$anonfun$apply$2.class
com/.../driver$$anonfun$main$6$$anonfun$apply$3.class
com/.../driver$$anonfun$main$6$$anonfun$apply$4.class
com/.../driver$$anonfun$main$6$$anonfun$apply$5.class
com/.../driver$$anonfun$main$6.class
com/.../driver$$anonfun$main$7$$anonfun$apply$1.class
com/.../driver$$anonfun$main$7$$anonfun$apply$2.class
com/.../driver$$anonfun$main$7$$anonfun$apply$3.class
com/.../driver$$anonfun$main$7$$anonfun$apply$4.class
com/.../driver$$anonfun$main$7$$anonfun$apply$5.class
com/.../driver$$anonfun$main$7$$anonfun$apply$6.class
com/.../driver$$anonfun$main$7$$anonfun$apply$7.class
com/.../driver$$anonfun$main$7$$anonfun$apply$8.class
com/.../driver$$anonfun$main$7.class
com/.../driver$$anonfun$main$8$$anonfun$apply$10.class
com/.../driver$$anonfun$main$8$$anonfun$apply$4.class
com/.../driver$$anonfun$main$8$$anonfun$apply$5.class
com/.../driver$$anonfun$main$8$$anonfun$apply$6.class
com/.../driver$$anonfun$main$8$$anonfun$apply$7.class
com/.../driver$$anonfun$main$8$$anonfun$apply$8.class
com/.../driver$$anonfun$main$8$$anonfun$apply$9.class
com/.../driver$$anonfun$main$8.class
com/.../driver$$anonfun$main$9$$anonfun$apply$11.class
com/.../driver$$anonfun$main$9$$anonfun$apply$7.class
com/.../driver$$anonfun$main$9$$anonfun$apply$8.class
com/.../driver$$anonfun$main$9$$anonfun$apply$9.class
com/.../driver$$anonfun$main$9.class
com/.../driver$.class
com/.../driver.class

When I launch the job in local mode instead of yarn, I do not get this error. What is causing the problem, and how can I fix it?

Any help would be appreciated, thanks.

1 Answer:

Answer 0 (score: 0)

It looks like your jar or some of its dependencies are not being distributed to the worker nodes. It works in local mode because all the jars are in one place on a single machine. In yarn mode, you need to build a fat jar that includes all of your dependencies, including the Hive and Spark libraries.
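One common way to produce such a fat jar with Maven is the maven-shade-plugin. A minimal sketch for the pom.xml's build/plugins section; the version shown is illustrative, not taken from the question's project:

```xml
<!-- Sketch: bundle the project and its dependencies into one jar at package time. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Alternatively, extra dependency jars can be shipped to the executors at submit time with spark-submit's --jars option.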