Kafka Streams - number of open file descriptors keeps increasing

Asked: 2019-03-14 15:41:29

Tags: apache-kafka apache-kafka-streams rocksdb rocksdb-java

Our Kafka Streams application keeps opening new file descriptors as messages arrive, without ever closing old ones. This eventually leads to an exception. We have already raised the open-fd limit to 65k, but that does not seem to help.

Both the Kafka broker and the Kafka Streams library are version 2.1.

The error message that keeps appearing in the logs is:

    org.apache.kafka.streams.processor.internals.StreamThread.run                       StreamThread.java:747
    org.apache.kafka.streams.processor.internals.StreamThread.runLoop                   StreamThread.java:777
    org.apache.kafka.streams.processor.internals.StreamThread.runOnce                   StreamThread.java:883
    org.apache.kafka.streams.processor.internals.StreamThread.maybeCommit               StreamThread.java:1029
    org.apache.kafka.streams.processor.internals.TaskManager.commitAll                  TaskManager.java:405
    org.apache.kafka.streams.processor.internals.AssignedTasks.commit                   AssignedTasks.java:346
    org.apache.kafka.streams.processor.internals.StreamTask.commit                      StreamTask.java:431
    org.apache.kafka.streams.processor.internals.StreamTask.commit                      StreamTask.java:443
    org.apache.kafka.streams.processor.internals.StreamTask.flushState                  StreamTask.java:491
    org.apache.kafka.streams.processor.internals.AbstractTask.flushState                AbstractTask.java:204
    org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush            ProcessorStateManager.java:217
    org.apache.kafka.streams.state.internals.MeteredKeyValueStore.flush                 MeteredKeyValueStore.java:226
    org.apache.kafka.streams.state.internals.WrappedStateStore$AbstractStateStore.flush WrappedStateStore.java:85
    org.apache.kafka.streams.state.internals.RocksDBStore.flush                         RocksDBStore.java:388
    org.apache.kafka.streams.state.internals.RocksDBStore.flushInternal                 RocksDBStore.java:395
    org.rocksdb.RocksDB.flush                                                           RocksDB.java:1743
    org.rocksdb.RocksDB.flush                                                           RocksDB.java
    org.rocksdb.RocksDBException: While open a file for appending:
    /tmp/kafka-streams/s4l-notifications-test/5_1/rocksdb/main-store/002052.sst:
    Too many open files: #object[org.rocksdb.Status 0x1cca4c5c "org.rocksdb.Status@1cca4c5c"]
    org.apache.kafka.streams.errors.ProcessorStateException:
    Error while executing flush from store main-store

Any ideas how to debug this?
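One way to narrow this down is to watch the fd table of the Streams JVM directly and check how many of its descriptors point at RocksDB SST files under the state directory. A minimal Linux-only sketch via /proc (the use of `$$` is just for demonstration; in practice you would substitute the pid of the Kafka Streams JVM, e.g. found with `jps` or `pgrep`):

```shell
# Count the open file descriptors of a process. Here we inspect the
# current shell ($$) as a stand-in; replace with the Streams JVM pid.
pid=$$
total=$(ls "/proc/${pid}/fd" | wc -l)
echo "total open fds: ${total}"

# Resolve each fd to its target path and count how many are RocksDB
# SST files; a large and growing number here would point at RocksDB
# (state stores) as the consumer of descriptors. grep -c prints 0 and
# exits non-zero when nothing matches, so we tolerate that case.
sst=$(ls -l "/proc/${pid}/fd" 2>/dev/null | grep -c '\.sst$' || true)
echo "rocksdb sst fds: ${sst}"
```

Running this in a loop (e.g. under `watch`) while traffic flows shows whether the fd count grows without bound and whether SST files dominate it; if they do, the per-store RocksDB open-file behavior is the place to look next.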

0 Answers:

No answers yet.