Model builder - RandomForestClassifier, DecisionTreeClassifier and NaiveBayes - java.util.NoSuchElementException: key not found: 120.0

Date: 2018-06-26 09:35:09

Tags: ibm-cloud watson-studio

I am trying to build a model with the Watson Studio model builder. It fails while training the model.

What does this error mean? Is it looking up this key in one of the feature fields? If so, which field?

How can I debug this so I can resolve it?

  

    Evaluation error    Error: Could not compute model metrics; reason: Job aborted due to stage failure: Task 0 in stage 29.0 failed 10 times, most recent failure: Lost task 0.9 in stage 29.0 (TID 38, xx-xxxx-xxxx-xxxx-xxxxx, executor xxxxx-xxxx-xxxxx-xxxx-xxxxxxx): java.util.NoSuchElementException: key not found: 120.0
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:59)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:59)
        at com.ibm.analytics.wml.features.NumericIndexerModel$$anonfun$3$$anonfun$4.apply(DecodableNumericIndexer.scala:118)
        at com.ibm.analytics.wml.features.NumericIndexerModel$$anonfun$3$$anonfun$4.apply(DecodableNumericIndexer.scala:116)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.immutable.Range.foreach(Range.scala:160)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at com.ibm.analytics.wml.features.NumericIndexerModel$$anonfun$3.apply(DecodableNumericIndexer.scala:116)
        at com.ibm.analytics.wml.features.NumericIndexerModel$$anonfun$3.apply(DecodableNumericIndexer.scala:115)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.agg_doAggregateWithoutKey$(Unknown Source)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:126)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:326)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1160)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.lang.Thread.run(Thread.java:811)
    Driver stacktrace:
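For context, the trace shows `NumericIndexerModel` applying a `Map` lookup (`scala.collection.AbstractMap.apply`) that throws `NoSuchElementException`. A common cause of this "key not found" pattern is an indexer fitted on the training split encountering a value (here `120.0`) that never appeared in training, so it has no index for it. The following Python sketch is illustrative only - it is not Watson Studio's actual code, and the function names are hypothetical - but it reproduces the failure mode:

```python
# Hypothetical sketch of what a numeric indexer (like the NumericIndexerModel
# in the stack trace) does: map each distinct value seen during *training*
# to an integer index, then apply that map when scoring/evaluating.

def fit_indexer(training_values):
    """Build a value -> index map from values seen in training."""
    return {v: i for i, v in enumerate(sorted(set(training_values)))}

def transform(indexer, value):
    """Look up a value; raises KeyError if it was never seen in training,
    analogous to Scala's Map.apply raising NoSuchElementException."""
    return indexer[value]

indexer = fit_indexer([10.0, 20.0, 30.0])
print(transform(indexer, 20.0))   # fine: 20.0 was in the training split

try:
    transform(indexer, 120.0)     # evaluation row with an unseen value
except KeyError as e:
    print(f"key not found: {e}")  # mirrors "key not found: 120.0"
```

If this reading is right, checking which feature column in the evaluation/holdout data contains the value 120.0 (but not in the training data) would point at the offending field.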

0 Answers