Process crashes while importing 5 million rows into Neo4j

Time: 2014-05-14 20:45:51

Tags: neo4j

Summary: I am using the csv batch-import tool to import 3 node files and 2 relationship files. I am running Neo4j 2.0.3 and have the 2.0 compiled assembly of the batch importer. The process crashes on the third file and never seems to reach the relationships.

Environment

- A VM running on an SSD. I think it may be too small?
- Windows 7, 64-bit
- 4.5 GB RAM
- JDK version = jdk1.7.0_45

Here is my batch.properties file.

cache_type=none
use_memory_mapped_buffers=true
neostore.nodestore.db.mapped_memory=1G
neostore.relationshipstore.db.mapped_memory=1G
neostore.propertystore.db.mapped_memory=3G
neostore.propertystore.db.strings.mapped_memory=500M
neostore.propertystore.db.arrays.mapped_memory=0M
neostore.propertystore.db.index.keys.mapped_memory=150M
neostore.propertystore.db.index.mapped_memory=150M

batch_import.node_index.clients=exact
batch_import.node_index.systems=exact
batch_import.node_index.technicalobjectkey=exact
batch_import.rel_index.clientrelation=exact
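
For context, the mapped_memory values above add up to roughly 5.8 GB, which is more than the VM's 4.5 GB of RAM before the JVM heap is even counted. A layout scaled down to this machine might look something like the sketch below; the individual numbers are illustrative guesses, not a tested configuration:

cache_type=none
use_memory_mapped_buffers=true
neostore.nodestore.db.mapped_memory=500M
neostore.relationshipstore.db.mapped_memory=500M
neostore.propertystore.db.mapped_memory=1G
neostore.propertystore.db.strings.mapped_memory=300M
neostore.propertystore.db.arrays.mapped_memory=0M
neostore.propertystore.db.index.keys.mapped_memory=100M
neostore.propertystore.db.index.mapped_memory=100M

That totals about 2.5 GB of mapped memory, leaving headroom for the heap and the operating system.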

Sample CSV format

I am importing 3 node CSV files and 2 relationship CSV files.

Here is a sample of the first file, which has 2.38 million rows.

ClientId:string:clients	name	l:label	ClientKey
499999999	JUNE BUG	CLIENT	0000622P32J13106wfe
499999998	MORTIMER FELICIEN	CLIENT	0001FKV2FBY35273wfe
499999997	ELIAS REDMAN	CLIENT	0002SFUVAUI1443wfe
499999996	JITENDRA ISMAIL	CLIENT	0002SFUVAUI17583wfew

The second file has 6 rows.

The third file has 2.8 million rows and looks like this:

technicalobjectkey:string:technicalobjectkey
009DLSMO6N0SPREM
IFPQFPE6
P1T63GUGC10SPREM
SMSD8FDX
0T4BIAHX

The process crashes on the third file.

Here is the entire error message.

Using: Importer C:\Users\Steven.Suing\Documents\Neo4j\ClientGraph data/ClientNodes.csv,data/SystemNodes.csv,data/ObjectNodes.csv ,data/MainRelations.csv,data/ObjectToSystemRel.csv

Using Existing Configuration File
.......................
Importing 2381482 Nodes took 114 seconds

Importing 6 Nodes took 0 seconds
...................Exception in thread "main" java.lang.RuntimeException: Writer thread failed
        at org.mapdb.AsyncWriteEngine.checkState(AsyncWriteEngine.java:245)
        at org.mapdb.AsyncWriteEngine.close(AsyncWriteEngine.java:391)
        at org.mapdb.EngineWrapper.close(EngineWrapper.java:72)
        at org.mapdb.EngineWrapper.close(EngineWrapper.java:72)
        at org.mapdb.CacheHashTable.close(CacheHashTable.java:169)
        at org.mapdb.DB.close(DB.java:401)
        at org.neo4j.batchimport.index.MapDbCachingIndexProvider.shutdown(MapDbCachingIndexProvider.java:51)
        at org.neo4j.batchimport.Importer.finish(Importer.java:87)
        at org.neo4j.batchimport.Importer.doImport(Importer.java:239)
        at org.neo4j.batchimport.Importer.main(Importer.java:83)
Caused by: java.lang.RuntimeException: File could not be mapped to memory, common problem on 32bit JVM. Use `DBMaker.newRandomAccessFileDB()` as workaround
        at org.mapdb.Volume$MappedFileVol.makeNewBuffer(Volume.java:496)
        at org.mapdb.Volume$ByteBufferVol.ensureAvailable(Volume.java:245)
        at org.mapdb.StoreDirect.freePhysTake(StoreDirect.java:724)
        at org.mapdb.StoreDirect.physAllocate(StoreDirect.java:408)
        at org.mapdb.StoreDirect.update(StoreDirect.java:270)
        at org.mapdb.EngineWrapper.update(EngineWrapper.java:55)
        at org.mapdb.AsyncWriteEngine.access$101(AsyncWriteEngine.java:68)
        at org.mapdb.AsyncWriteEngine.runWriter(AsyncWriteEngine.java:201)
        at org.mapdb.AsyncWriteEngine$2.run(AsyncWriteEngine.java:143)
        at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.IOException: Map failed
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:888)
        at org.mapdb.Volume$MappedFileVol.makeNewBuffer(Volume.java:491)
        ... 9 more
Caused by: java.lang.OutOfMemoryError: Map failed
        at sun.nio.ch.FileChannelImpl.map0(Native Method)
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:885)
        ... 10 more
Exception in thread "MapDB shutdown" java.lang.RuntimeException: Writer thread failed
        at org.mapdb.AsyncWriteEngine.checkState(AsyncWriteEngine.java:245)
        at org.mapdb.AsyncWriteEngine.close(AsyncWriteEngine.java:391)
        at org.mapdb.EngineWrapper.close(EngineWrapper.java:72)
        at org.mapdb.EngineWrapper.close(EngineWrapper.java:72)
        at org.mapdb.CacheHashTable.close(CacheHashTable.java:169)
        at org.mapdb.DBMaker$1.run(DBMaker.java:654)
Caused by: java.lang.RuntimeException: File could not be mapped to memory, common problem on 32bit JVM. Use `DBMaker.newRandomAccessFileDB()` as workaround
        at org.mapdb.Volume$MappedFileVol.makeNewBuffer(Volume.java:496)
        at org.mapdb.Volume$ByteBufferVol.ensureAvailable(Volume.java:245)
        at org.mapdb.StoreDirect.freePhysTake(StoreDirect.java:724)
        at org.mapdb.StoreDirect.physAllocate(StoreDirect.java:408)
        at org.mapdb.StoreDirect.update(StoreDirect.java:270)
        at org.mapdb.EngineWrapper.update(EngineWrapper.java:55)
        at org.mapdb.AsyncWriteEngine.access$101(AsyncWriteEngine.java:68)
        at org.mapdb.AsyncWriteEngine.runWriter(AsyncWriteEngine.java:201)
        at org.mapdb.AsyncWriteEngine$2.run(AsyncWriteEngine.java:143)
        at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.IOException: Map failed
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:888)
        at org.mapdb.Volume$MappedFileVol.makeNewBuffer(Volume.java:491)
        ... 9 more
Caused by: java.lang.OutOfMemoryError: Map failed
        at sun.nio.ch.FileChannelImpl.map0(Native Method)
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:885)
        ... 10 more

2 Answers:

Answer 0 (score: 0)

Which version of MapDB is it using? This is a common error on 32-bit operating systems, but you wrote that yours is 64-bit.
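
Note that a 64-bit OS does not imply a 64-bit JVM: a 32-bit JDK installs and runs happily on 64-bit Windows and then hits exactly this kind of memory-mapping limit. A minimal standalone sketch (not part of the importer) that prints which architecture the running JVM actually has; sun.arch.data.model is HotSpot-specific and may be absent on other JVMs:

public class JvmBitnessCheck {
    public static void main(String[] args) {
        // os.arch reports the JVM's architecture, not the OS: "x86" for a 32-bit JVM,
        // "amd64" for a 64-bit one, even on 64-bit Windows.
        System.out.println("os.arch             = " + System.getProperty("os.arch"));
        // HotSpot-specific: "32" or "64"; may be null on non-HotSpot JVMs.
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
        System.out.println("java.vm.name        = " + System.getProperty("java.vm.name"));
    }
}

Running this with the same java.exe that launches the importer makes it obvious whether the mapped-memory failure is coming from a 32-bit address space.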

Answer 1 (score: 0)

Thanks for the reply. Even though I am on a 64-bit machine, I realized I had been using a 32-bit JVM for the project I was working on. I uninstalled the JDK, installed the latest 64-bit JDK, and imported everything successfully.
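
For anyone hitting the same thing, the quickest check needs no code at all: a 64-bit JDK identifies itself in the output of java -version, while a 32-bit one does not mention "64-Bit". The build strings below are elided rather than copied from this machine:

> java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build ...)
Java HotSpot(TM) 64-Bit Server VM (build ..., mixed mode)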