I'm a newbie to programming and Storm. I'm working through an example from the book "Getting Started with Storm". I'm using Storm 1.1.0 and JDK 1.8. When I try to run my code in Eclipse Neon, or with the command "storm jar ...", I get this error:
[main] ERROR o.a.s.s.o.a.z.s.NIOServerCnxnFactory - Thread Thread[main,5,main] died org.apache.storm.generated.InvalidTopologyException:null
Does anyone know what this error means and how I can fix it?

I wrote the following code.

The spout:
public class WordReader implements IRichSpout {
    TopologyContext context;
    SpoutOutputCollector collector;
    FileReader filereader;
    private boolean completed = false;

    public void ack(Object msgId) {
        System.out.println("OK: " + msgId);
    }

    public void fail(Object msgId) {
        System.out.println("FAIL: " + msgId);
    }

    public void nextTuple() {
        if (completed) {
            try {
                Thread.sleep(1000);
            } catch (Exception e) {
                // do nothing
            }
            // nothing left to emit
            return;
        }
        String str;
        BufferedReader reader = new BufferedReader(filereader);
        try {
            while ((str = reader.readLine()) != null) {
                this.collector.emit(new Values(str), str);
            }
        } catch (Exception e) {
            throw new RuntimeException("Error reading tuple", e);
        } finally {
            completed = true;
        }
    }

    public void open(Map conf, TopologyContext context,
            SpoutOutputCollector collector) {
        try {
            this.context = context;
            this.filereader = new FileReader(conf.get("words").toString());
        } catch (FileNotFoundException e) {
            throw new RuntimeException("Error!");
        }
        this.collector = collector;
    }

    public void declareOutputFileds(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("line"));
    }

    public void close() {
        // TODO Auto-generated method stub
    }

    public void activate() {
        // TODO Auto-generated method stub
    }

    public void deactivate() {
        // TODO Auto-generated method stub
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // TODO Auto-generated method stub
    }

    public Map<String, Object> getComponentConfiguration() {
        // TODO Auto-generated method stub
        return null;
    }
}
I have two bolts. The first one is:

public class WordNormalizer implements IRichBolt {
    private OutputCollector collector;

    public void execute(Tuple input) {
        String sentence = input.getString(0);
        String[] words = sentence.split(" ");
        // enhanced for loop over the split words
        for (String word : words) {
            word = word.trim();
            word = word.toLowerCase();
            this.collector.emit(new Values(word));
        }
        collector.ack(input);
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("word"));
    }

    public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    public void cleanup() {
        // TODO Auto-generated method stub
    }

    public Map<String, Object> getComponentConfiguration() {
        // TODO Auto-generated method stub
        return null;
    }
}
and the last one is:

public class WordCounter implements IRichBolt {
    String name;
    Integer id;
    Map<String, Integer> counters;
    private OutputCollector collector;

    public void execute(Tuple input) {
        String str = input.getString(0);
        if (!counters.containsKey(str)) {
            counters.put(str, 1);
        } else {
            Integer c = counters.get(str) + 1;
            counters.put(str, c);
        }
        collector.ack(input);
    }

    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.counters = new HashMap<String, Integer>();
        this.collector = collector;
        this.name = context.getThisComponentId();
        this.id = context.getThisTaskId();
    }

    public void cleanup() {
        System.out.println("-- Word Counter [" + name + "-" + id + "] --");
        for (Map.Entry<String, Integer> entry : counters.entrySet()) {
            System.out.println(entry.getKey() + ": " + entry.getValue());
        }
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // TODO Auto-generated method stub
    }

    public Map<String, Object> getComponentConfiguration() {
        // TODO Auto-generated method stub
        return null;
    }
}
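Independently of the Storm error, the split/normalize/count logic inside the two bolts can be checked outside Storm. This is an illustrative plain-Java sketch (the class and method names here are my own, not from the book or from Storm), with no Storm dependencies:

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {
    // Mirrors WordNormalizer: split a line, then trim and lower-case each word
    static String[] normalize(String line) {
        String[] words = line.split(" ");
        for (int i = 0; i < words.length; i++) {
            words[i] = words[i].trim().toLowerCase();
        }
        return words;
    }

    // Mirrors WordCounter.execute: accumulate per-word counts in a map
    static void count(Map<String, Integer> counters, String word) {
        counters.put(word, counters.getOrDefault(word, 0) + 1);
    }

    public static void main(String[] args) {
        Map<String, Integer> counters = new HashMap<>();
        for (String word : normalize("Storm storm test")) {
            count(counters, word);
        }
        System.out.println(counters); // {storm=2, test=1}
    }
}
```

Running the bolt logic this way first makes it easier to separate plain Java bugs from topology wiring problems.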
My main class is:
public class TopologyMain {
    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("word-reader", new WordReader());
        builder.setBolt("word-normalizer", new WordNormalizer()).shuffleGrouping("word-reader");
        builder.setBolt("word-counter", new WordCounter()).shuffleGrouping("word-normalizer");

        Config conf = new Config();
        conf.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
        conf.put("word", 0);
        conf.setDebug(true);

        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("word", conf, builder.createTopology());
        try {
            Thread.sleep(2000);
        } catch (Exception e) {
            // TODO: handle exception
        }
        cluster.shutdown();
    }
}
Answer 0 (score: 0)

You have a typo when building the topology: "wordnormalizer" should be "word-normalizer", so that it matches the id used when wiring up "word-counter":

builder.setBolt("word-normalizer", new WordNormalizer()).shuffleGrouping("word-reader");
builder.setBolt("word-counter", new WordCounter()).shuffleGrouping("word-normalizer");
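More generally, InvalidTopologyException is raised when the topology wiring is inconsistent, for example when a grouping refers to a component id that was never registered. The name-matching idea can be illustrated with this plain-Java sketch (my own illustrative code, not Storm's actual validation logic):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class TopologyNameCheck {
    // Collect grouping sources that do not match any declared component id
    static Set<String> undeclaredSources(Set<String> components,
                                         Map<String, String> groupings) {
        Set<String> missing = new HashSet<>();
        for (String source : groupings.values()) {
            if (!components.contains(source)) {
                missing.add(source);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        Set<String> components = new HashSet<>(
                Arrays.asList("word-reader", "word-normalizer", "word-counter"));
        Map<String, String> groupings = new HashMap<>();
        groupings.put("word-normalizer", "word-reader");
        // a typo like "wordnormalizer" references a nonexistent component
        groupings.put("word-counter", "wordnormalizer");
        System.out.println(undeclaredSources(components, groupings)); // [wordnormalizer]
    }
}
```

Checking that every shuffleGrouping source exactly matches a setSpout/setBolt id is a quick first step whenever this exception appears.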