I am trying to run

java -Xmx5g -cp stanford-corenlp-3.8.0.jar:stanford-corenlp-models-3.8.0.jar:* edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,ner,parse,mention,coref -coref.algorithm neural -file example_file.txt

to find mentions of the same entity in a text.
But when I run that command in the terminal, the process is killed and an error is written to the log saying there is insufficient memory for the Java Runtime Environment to continue.
I am on Ubuntu:
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)
The log is too long to fit in full in the question body.
Here is the log: error log
[Update] I increased the virtual machine's physical memory. Now I get this error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
at java.lang.StringBuilder.append(StringBuilder.java:202)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:186)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:169)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:148)
at edu.stanford.nlp.pipeline.ParserAnnotator.doOneSentence(ParserAnnotator.java:360)
at edu.stanford.nlp.pipeline.ParserAnnotator.doOneSentence(ParserAnnotator.java:254)
at edu.stanford.nlp.pipeline.SentenceAnnotator.annotate(SentenceAnnotator.java:102)
at edu.stanford.nlp.pipeline.AnnotationPipeline.annotate(AnnotationPipeline.java:76)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.annotate(StanfordCoreNLP.java:599)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.annotate(StanfordCoreNLP.java:609)
at edu.stanford.nlp.pipeline.StanfordCoreNLP$$Lambda$55/45416784.accept(Unknown Source)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.processFiles(StanfordCoreNLP.java:1172)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.processFiles(StanfordCoreNLP.java:945)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.run(StanfordCoreNLP.java:1274)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.main(StanfordCoreNLP.java:1345)
Is there any way to fix this?
Answer 0 (score: 1)
The error report states this:
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 335785984 bytes for committing reserved memory.
# Possible reasons:
# The system is out of physical RAM or swap space
# In 32 bit mode, the process size limit was hit
Taking this at face value:
The first explanation means that the operating system refused the JVM's request to allocate a large chunk of native memory because the resources (physical RAM or swap space) were not available.
You are running a 64-bit JVM, so the second possible explanation does not apply.
The first explanation seems plausible. Possible fixes would be to give the machine more physical RAM or swap space, or to reduce the heap size you request with -Xmx.
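A minimal diagnostic sketch for checking the first explanation on a typical Ubuntu system (the 5 GB figure comes from the -Xmx5g flag in the question; these commands only inspect memory, they change nothing):

```shell
# Show total/used/free physical RAM and swap in human-readable units.
# -Xmx5g asks for a 5 GB heap, and the JVM needs extra native memory
# on top of that, so RAM + swap should comfortably exceed 5 GB.
free -h

# List active swap areas; no output means no swap is configured.
swapon --show
```

If the totals reported here are smaller than what the heap requires, either grow the VM's RAM or swap, or lower the -Xmx value passed to the pipeline.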