I am trying to build Spark 1.6.2 on CentOS 7 and am running into the following errors:
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:45: not found: type SparkFlumeProtocol
[error] val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:70: not found: type EventBatch
[error] override def getEventBatch(n: Int): EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala:80: not found: type EventBatch
[error] def getEventBatch: EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala:25: not found: type EventBatch
[error] def isErrorBatch(batch: EventBatch): Boolean = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:85: not found: type EventBatch
[error] new EventBatch("Spark sink has been stopped!", "", java.util.Collections.emptyList())
[error] ^
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala:86: not found: type SparkFlumeProtocol
[error] val responder = new SpecificResponder(classOf[SparkFlumeProtocol], handler.get)
Answer 0 (score: 0)
I ran into the same problem on Spark 2.0.0. I think the cause is that the file 'external/flume-sink/src/main/avro/sparkflume.avdl' is not compiled properly, so the SparkFlumeProtocol and EventBatch types it declares never get generated.
The problem can be resolved as follows:
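A minimal sketch of one way to regenerate the missing sources by hand with the standalone Avro tools jar; the Avro version (1.7.7, which Spark 1.6.x depends on) and the output directory are assumptions here, not details from the original answer:

    # translate the Avro IDL into a protocol definition (.avpr)
    cd external/flume-sink/src/main/avro
    java -jar avro-tools-1.7.7.jar idl sparkflume.avdl sparkflume.avpr
    # generate the Java sources (SparkFlumeProtocol, EventBatch, ...) from the protocol
    java -jar avro-tools-1.7.7.jar compile protocol sparkflume.avpr ./generated-java

Alternatively, Spark's own sbt build wires this step up through the sbt-avro plugin, so something like 'build/sbt streaming-flume-sink/avro:generate' should produce the generated sources in the module's managed-source directory before the Scala compiler needs them (the project name is taken from Spark's SparkBuild, so verify it against your checkout).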