Unable to print the stack trace in Spark

Time: 2016-12-13 17:52:09

Tags: apache-spark apache-spark-sql spark-dataframe

I am running the code below. In the Failure branch I log the exception stack trace to the YARN log at INFO level. The SQL in my code has a syntax error, so an exception is thrown, but when I look at the YARN log it shows something unexpected: "Cannot call methods on a stopped SparkContext". I need help understanding whether I am doing something wrong.

Code snippet:

var ret: String = Try {
    DbUtil.dropTable("cls_mkt_tracker_split_rownum", batchDatabase)
    SparkEnvironment.hiveContext.sql(
      s"""CREATE TABLE ${batchDatabase}.CLS_MKT_TRACKER_SPLIT_ROWNUM
        AS SELECT ROW_NUMBER() OVER(PARTITION BY XREF_IMS_PAT_NBR, MOLECULE ORER BY IMS_DSPNSD_DT) AS ROWNUM, *
        FROM ${batchDatabase}.CLS_MKT_TRACKER_SPLIT
     """)  // "ORER BY" is the deliberate SQL syntax error mentioned above
    true
  } match {
    case Success(b: Boolean)   => ""
    case Failure(t: Throwable) =>
      // No separator between the literal and t.getMessage, hence
      // "I am in failureCannot call methods..." in the YARN log below.
      logger.info("I am in failure" + t.getMessage + t.getStackTraceString)
      "failure return"
  }
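
As an aside on the title question: most SLF4J/Log4j-style loggers render the complete stack trace on their own when the Throwable is passed to the logging call as a separate argument, rather than concatenated into the message via getMessage and getStackTraceString. A minimal sketch of that pattern, assuming an SLF4J binding is on the classpath and using hypothetical names (StackTraceLoggingSketch, run):

import org.slf4j.LoggerFactory
import scala.util.{Failure, Success, Try}

object StackTraceLoggingSketch {
  private val logger = LoggerFactory.getLogger(getClass)

  // Runs the given block and mirrors the Try/match structure from the snippet
  // above; the Throwable is handed to the logger so the framework prints the
  // message followed by the exception class and its full stack trace.
  def run(work: => Unit): String = Try(work) match {
    case Success(_) => ""
    case Failure(t) =>
      logger.error("I am in failure", t)
      "failure return"
  }
}

The snippet above could then be expressed as StackTraceLoggingSketch.run { SparkEnvironment.hiveContext.sql("...") }, and the YARN log would contain the full trace of whatever exception the SQL statement raises.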

YARN log:

16/12/13 11:19:42 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/12/13 11:19:43 INFO DateAdjustment: I am in failureCannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
SparkEnvironment$.<init>(SparkEnvironment.scala:12)
SparkEnvironment$.<clinit>(SparkEnvironment.scala)
DbUtil$.dropTable(DbUtil.scala:8)
DateAdjustment$$anonfun$1.apply$mcZ$sp(DateAdjustment.scala:126)
DateAdjustment$$anonfun$1.apply(DateAdjustment.scala:125)
DateAdjustment$$anonfun$1.apply(DateAdjustment.scala:125)
scala.util.Try$.apply(Try.scala:161)
DateAdjustment$delayedInit$body.apply(DateAdjustment.scala:125)
scala.Function0$class.apply$mcV$sp(Function0.scala:40)
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
scala.App$$anonfun$main$1.apply(App.scala:71)
scala.App$$anonfun$main$1.apply(App.scala:71)
scala.collection.immutable.List.foreach(List.scala:318)
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
scala.App$class.main(App.scala:71)
DateAdjustment$.main(DateAdjustment.scala:14)
DateAdjustment.main(DateAdjustment.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

The currently active SparkContext was created at:

(No active SparkContext.)
         org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)

1 Answer:

Answer 0 (score: 0):

I investigated and found that SC.stop() had already been called somewhere else in my application, and that was what caused the problem.
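
One pattern that avoids this class of failure is to create the SparkContext/HiveContext once and call stop() exactly once, in a finally block at the very end of the job, so that no later statement can run against an already-stopped context. A minimal, self-contained sketch under that assumption (hypothetical object name SingleStopSketch; Spark 1.x HiveContext to match the question):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object SingleStopSketch {
  def main(args: Array[String]): Unit = {
    // Master URL and other settings are expected to come from spark-submit.
    val sc = new SparkContext(new SparkConf().setAppName("single-stop-sketch"))
    val hiveContext = new HiveContext(sc)
    try {
      // All SparkContext-backed work (DDL, queries, writes) happens here.
      hiveContext.sql("SHOW TABLES").show()
    } finally {
      sc.stop() // the only call to stop() in the whole application
    }
  }
}

If it helps to track down the stray stop() call, SparkContext also exposes an isStopped flag (in recent releases, if I am not mistaken) that can be logged right before the failing statement.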