KafkaSpout keeps throwing OutOfMemory errors

Asked: 2015-02-07 05:43:08

Tags: apache-kafka apache-storm apache-zookeeper

Whenever I try to deploy more than one topology, KafkaSpout throws an OutOfMemory error. I have already checked file descriptors and memory, and looked through the worker logs.

java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:714)
    at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:949)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1360)
    at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:132)
    at org.apache.curator.framework.imps.CuratorFrameworkImpl.start(CuratorFrameworkImpl.java:237)
    at storm.kafka.ZkState.<init>(ZkState.java:62)
    at storm.kafka.KafkaSpout.open(KafkaSpout.java:85)
    at backtype.storm.daemon.executor$fn__3373$fn__3388.invoke(executor.clj:522)
    at backtype.storm.util$async_loop$fn__464.invoke(util.clj:461)
    at clojure.lang.AFn.run(AFn.java:24)
    at java.lang.Thread.run(Thread.java:745)
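Note that "unable to create new native thread" means the operating system refused to start another thread, not that the JVM heap is exhausted; on Linux every Java thread counts against the per-user process limit. As a quick sanity check (a sketch using standard GNU/Linux tools, no Storm-specific assumptions), you can compare the current thread count for the Storm user against that limit:

```shell
# Java threads show up as lightweight processes (LWPs) on Linux;
# count all threads belonging to the current user:
ps -L -u "$(id -un)" --no-headers | wc -l

# Per-user limit they are counted against
# (reported as "max user processes" by `ulimit -a`):
ulimit -u
```

If the first number is close to the second, new KafkaSpout instances will fail exactly as in the stack trace above, because each one starts a Curator/ZooKeeper client with its own threads.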

Error from the worker log file:

2015-02-07T05:43:48.657+0000 o.a.z.ClientCnxn [DEBUG] Reading reply sessionid:0x34afd2eb46d25ec, packet:: clientPath:null serverPath:null finished:false header:: 33,4 replyHeader:: 33,-1,0 request:: '/brokers/ids/2,F response:: #7b226a6d785f706f7274223a31313036312c2274696d657374616d70223a2231343137353737373732343937222c22686f7374223a2264616c2d6b61666b612d62726f6b657230312e6266642e77616c6d6172742e636f6d222c2276657273696f6e223a312c22706f7274223a393039327d,s{30064774364,30064774364,1417577772497,1417577772497,0,0,0,164959970529443843,114,0,30064774364}
2015-02-07T05:43:48.657+0000 b.s.util [ERROR] Halting process: ("Worker died")
java.lang.RuntimeException: ("Worker died")
    at backtype.storm.util$exit_process_BANG_.doInvoke(util.clj:325) [storm-core-0.9.3.jar:0.9.3]
    at clojure.lang.RestFn.invoke(RestFn.java:423) [clojure-1.5.1.jar:na]
    at backtype.storm.daemon.worker$fn__3812$fn__3813.invoke(worker.clj:456) [storm-core-0.9.3.jar:0.9.3]
    at backtype.storm.daemon.executor$mk_executor_data$f…(executor.clj:240) [storm-core-0.9.3.jar:0.9.3]
    at backtype.storm.daemon.executor$mk_executor$fn__3312.invoke(executor.clj:334) [storm-core-0.9.3.jar:0.9.3]
    at backtype.storm.daemon.executor$mk_executor.invoke(executor.clj:334) [storm-core-0.9.3.jar:0.9.3]
    at backtype.storm.daemon.worker$fn__3743$exec_fn__1108__auto____3744$iter__3749__3753$fn__3754.invoke(worker.clj:382) [storm-core-0.9.3.jar:0.9.3]

(The trace above was interleaved with shell output, "kill 5232: No such process" and "kill 5491: No such process", each printed twice mid-line; these have been separated out, and one frame name lost to the interleaving is marked with "…".)

1 Answer:

Answer 0 (score: 0)

I found the solution. The cause was the limit on the maximum number of user processes, which can be checked with ulimit -a. Ways to increase this limit are discussed here: How do I change the number of open files limit in Linux?
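Concretely, the fix described above can look like this (a sketch for a typical Linux box; the user name "storm" and the limit value 10240 are assumptions you should adapt to your cluster):

```shell
# Show the current soft limit on user processes; Java threads count
# against this value on Linux:
ulimit -u

# Raise the soft limit for the current shell session only. The new value
# must not exceed the hard limit (shown by `ulimit -Hu`); 4096 is just
# an example:
ulimit -u 4096 || echo "requested value exceeds the hard limit"

# To persist the change across logins, add entries to
# /etc/security/limits.conf for the account that runs the Storm workers
# (the user name 'storm' is an assumption):
#
#   storm  soft  nproc  10240
#   storm  hard  nproc  10240
#
# then log the user out and back in (pam_limits applies limits at login).
```

After raising the limit, restart the Storm supervisors so the workers inherit the new limit; otherwise they keep running under the old one.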