Class not found error - Scala

Date: 2016-03-04 11:46:17

Tags: scala maven noclassdeffounderror

I am running the Scala program below. I am building it with Maven; I have set up the dependencies correctly and mvn install succeeds. But when I run the jar file, I get java.lang.NoClassDefFoundError.

Program:



package RasterDataIngest.RasterDataIngestIntoHadoop

import geotrellis.spark._
import geotrellis.spark.ingest._
import geotrellis.spark.io.hadoop._
import geotrellis.spark.io.index._
import geotrellis.spark.tiling._
import geotrellis.spark.utils.SparkUtils
import geotrellis.vector._
import org.apache.hadoop.fs.Path
import org.apache.spark._
import com.quantifind.sumac.ArgMain
import com.quantifind.sumac.validation.Required

class HadoopIngestArgs extends IngestArgs {
  @Required var catalog: String = _
  def catalogPath = new Path(catalog)
}

object HadoopIngest extends ArgMain[HadoopIngestArgs] with Logging {
  def main(args: HadoopIngestArgs): Unit = {
    System.setProperty("com.sun.media.jai.disableMediaLib", "true")

    implicit val sparkContext = SparkUtils.createSparkContext("Ingest")
    val conf = sparkContext.hadoopConfiguration
    conf.set("io.map.index.interval", "1")

    val catalog = HadoopRasterCatalog(args.catalogPath)
    val source = sparkContext.hadoopGeoTiffRDD(args.inPath)
    val layoutScheme = ZoomedLayoutScheme()

    Ingest[ProjectedExtent, SpatialKey](source, args.destCrs, layoutScheme, args.pyramid){ (rdd, level) => 
      catalog
        .writer[SpatialKey](RowMajorKeyIndexMethod, args.clobber)
        .write(LayerId(args.layerName, level.zoom), rdd)
    }
  }
}

pom.xml:

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>com.azavea.geotrellis</groupId>
  <artifactId>geotrellis-spark_2.10</artifactId> <!-- this is the one -->
  <version>0.10.0-M1</version>
</dependency>
<dependency>
  <groupId>org.scalaz.stream</groupId>
  <artifactId>scalaz-stream_2.10</artifactId>
  <version>0.7.2a</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>0.20.2</version>
</dependency>
<dependency>
  <groupId>com.quantifind</groupId>
  <artifactId>sumac_2.10</artifactId>
  <version>0.3.0</version>
</dependency>
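
For reference, all of the Scala artifacts above (spark-core_2.10, geotrellis-spark_2.10, scalaz-stream_2.10, sumac_2.10) are built against Scala 2.10, so ${scala.version} has to resolve to a 2.10.x release of scala-library; it is presumably defined in a properties block or a parent pom. A minimal sketch of such a block (the exact patch version is an assumption):

<properties>
  <scala.version>2.10.6</scala.version>
</properties>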

Please tell me where I am going wrong. Thanks in advance!

2 Answers:

Answer 0 (score: 0)

It sounds like you are not building a jar that actually contains your dependencies. If that is your problem, perhaps this answer will help:

How can I create an executable JAR with dependencies using Maven?
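
For reference, a minimal sketch of the maven-shade-plugin approach described there; the plugin version and the main class written into the manifest are assumptions based on the program in the question:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Assumed main class; matches the HadoopIngest object shown above -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>RasterDataIngest.RasterDataIngestIntoHadoop.HadoopIngest</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

With this in place, mvn package produces a single "uber" jar under target/ that bundles the project classes together with the dependencies.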

Answer 1 (score: 0)

I ran into a similar problem, and adding this to my pom.xml fixed it for me:

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

This plugin helps build a JAR that contains all of the dependencies. Source: https://www.cloudera.com/documentation/enterprise/5-5-x/topics/spark_building.html#building
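
As a usage sketch (artifact name and version are placeholders for your project's artifactId and version): running mvn package should then produce an additional target/<artifactId>-<version>-jar-with-dependencies.jar alongside the plain jar, and it is that jar-with-dependencies file you need to run, e.g. with java -cp followed by the class name RasterDataIngest.RasterDataIngestIntoHadoop.HadoopIngest (or via spark-submit). The plain jar without the suffix still lacks the dependencies, which is why it throws java.lang.NoClassDefFoundError.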