I'm using Apache Kafka 0.8.2.1 to stream web events to other data sources. The Kafka producer I wrote works fine, and when I run kafka-console-consumer.sh I can see data streaming through my topic. However, I've had no luck getting my Kafka consumer to retrieve messages. Any ideas?
When my code calls consumer.createMessageStreams(topicCountMap), it produces the following error about an invalid path:
Exception in thread "main" java.lang.IllegalArgumentException: Path must not end with / character
at org.apache.zookeeper.common.PathUtils.validatePath(PathUtils.java:58)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1024)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1073)
at org.I0Itec.zkclient.ZkConnection.exists(ZkConnection.java:95)
at org.I0Itec.zkclient.ZkClient$11.call(ZkClient.java:827)
at org.I0Itec.zkclient.ZkClient.retryUntilConnected(ZkClient.java:675)
at org.I0Itec.zkclient.ZkClient.watchForData(ZkClient.java:824)
at org.I0Itec.zkclient.ZkClient.subscribeDataChanges(ZkClient.java:136)
at kafka.consumer.ZookeeperConsumerConnector$$anonfun$kafka$consumer$ZookeeperConsumerConnector$$reinitializeConsumer$4.apply(ZookeeperConsumerConnector.scala:901)
at kafka.consumer.ZookeeperConsumerConnector$$anonfun$kafka$consumer$ZookeeperConsumerConnector$$reinitializeConsumer$4.apply(ZookeeperConsumerConnector.scala:898)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at kafka.consumer.ZookeeperConsumerConnector.kafka$consumer$ZookeeperConsumerConnector$$reinitializeConsumer(ZookeeperConsumerConnector.scala:898)
at kafka.consumer.ZookeeperConsumerConnector.consume(ZookeeperConsumerConnector.scala:240)
at kafka.javaapi.consumer.ZookeeperConsumerConnector.createMessageStreams(ZookeeperConsumerConnector.scala:85)
at kafka.javaapi.consumer.ZookeeperConsumerConnector.createMessageStreams(ZookeeperConsumerConnector.scala:97)
Below is the code for my Kafka consumer.
val consumer: ConsumerConnector = kafka.consumer.Consumer.createJavaConsumerConnector(createConsumerConfig())
var executor: ExecutorService = null

def run(a_numThreads: Integer) {
  var topicCountMap: java.util.Map[String, Integer] = new java.util.HashMap[String, Integer]()
  topicCountMap.put("testEvent", new Integer(a_numThreads))
  var consumerMap = consumer.createMessageStreams(topicCountMap)
  var streams = consumerMap.get("testEvent")

  // now launch all the threads
  executor = Executors.newFixedThreadPool(a_numThreads)

  // now create an object to consume the messages
  var threadNumber: Integer = 0
  var streamsItr = streams.iterator()
  while (streamsItr.hasNext()) {
    var stream = streamsItr.next()
    executor.submit(new EventConsumer(stream, threadNumber))
    threadNumber = threadNumber + 1
  }
}

def createConsumerConfig(): ConsumerConfig = {
  var props: Properties = new Properties()
  props.put("zookeeper.connect", "127.0.0.1:2181")
  props.put("zk.connect", "127.0.0.1:2181")
  props.put("group.id", "testConsumer")
  props.put("groupid", "tesConsumer")
  props.put("zookeeper.session.timeout.ms", "400")
  props.put("zookeeper.sync.time.ms", "200")
  props.put("auto.commit.interval.ms", "1000")
  return new ConsumerConfig(props)
}
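Editor's note: the error text ("Path must not end with / character") suggests that somewhere a ZooKeeper path is being built with a trailing slash, e.g. from the zookeeper.connect value or an optional chroot suffix. The helper below is a hypothetical sketch (normalizeZkConnect is not from the post or the Kafka API) showing how one might defensively strip a trailing slash from the connect string before building the ConsumerConfig:

```scala
// Hypothetical helper, not part of Kafka: strip any trailing '/' from a
// ZooKeeper connect string (host list, optionally followed by a chroot path),
// since ZooKeeper path validation rejects paths ending in '/'.
def normalizeZkConnect(connect: String): String =
  if (connect.endsWith("/")) normalizeZkConnect(connect.dropRight(1))
  else connect

// Usage sketch: normalize before putting the value into Properties.
println(normalizeZkConnect("127.0.0.1:2181/"))      // 127.0.0.1:2181
println(normalizeZkConnect("127.0.0.1:2181/kafka")) // unchanged
```

This does not change the posted config (which has no trailing slash as shown), but it rules out one common source of the exception when the value comes from external configuration.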
Answer 0 (score: 1)
This exception message is generated when the Spark CheckpointWriter cannot access the stored checkpoint path. Make sure checkpointing is disabled or that a correct path is supplied. The exception occurs after a successful connection, at
at org.apache.zookeeper.common.PathUtils.validatePath(PathUtils.java:58)
so it appears the writer cannot access the directory where the checkpoint information would be saved.
https://spark.apache.org/docs/1.3.0/streaming-programming-guide.html#checkpointing
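Editor's note: whatever component builds the offending path, the frame at the top of the trace is ZooKeeper's own path validation. The sketch below is a simplified re-implementation of that check (it is not the actual ZooKeeper source), just to illustrate which inputs trigger the "Path must not end with / character" message seen in the question:

```scala
// Simplified sketch of the validation performed by ZooKeeper's
// PathUtils.validatePath (per the stack trace); not the real implementation.
def validatePath(path: String): Unit = {
  if (path == null || path.isEmpty)
    throw new IllegalArgumentException("Path cannot be null or empty")
  if (!path.startsWith("/"))
    throw new IllegalArgumentException("Path must start with / character")
  if (path.length > 1 && path.endsWith("/"))
    throw new IllegalArgumentException("Path must not end with / character")
}

validatePath("/consumers/testConsumer/offsets")    // passes
// validatePath("/consumers/testConsumer/offsets/") // would throw
```

So any consumer-side string that ends up as a ZooKeeper node path with a trailing slash (a group id, topic name, or chroot) would produce exactly this exception.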