I am trying to import files into a MarkLogic server running in a production environment, using MarkLogic Content Pump with the following command:
C:\Users\Admin\Desktop\mlcp-1.3-3\bin>mlcp.bat import -host localhost -port 8891 -username admin -password admin -mode local -input_file_type archive -input_file_path /d:/NewFolder/
The input file path contains both binary and XML files:
"D:\NewFolder\20150626200126+0800-000000-BINARY.zip"
"D:\NewFolder\20150626200126+0800-000001-XML.zip"
When I run the import command at the command prompt, I get the following output, which I am not familiar with:
15/06/29 16:53:11 INFO contentpump.ContentPump: Hadoop library version: 2.6.0
15/06/29 16:53:11 INFO contentpump.LocalJobRunner: Content type: XML
15/06/29 16:53:11 ERROR contentpump.ContentPump: Error running a ContentPump job
java.lang.RuntimeException: Error while running command to get file permissions: ExitCodeException exitCode=-1073741515:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
        at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
        at org.apache.hadoop.fs.LocatedFileStatus.<init>(LocatedFileStatus.java:42)
        at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1699)
        at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1681)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:303)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:264)
        at com.marklogic.contentpump.FileAndDirectoryInputFormat.getSplits(FileAndDirectoryInputFormat.java:80)
        at com.marklogic.contentpump.ArchiveInputFormat.getSplits(ArchiveInputFormat.java:56)
        at com.marklogic.contentpump.LocalJobRunner.run(LocalJobRunner.java:128)
        at com.marklogic.contentpump.ContentPump.runJobLocally(ContentPump.java:307)
        at com.marklogic.contentpump.ContentPump.runCommand(ContentPump.java:204)
        at com.marklogic.contentpump.ContentPump.main(ContentPump.java:67)
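For what it's worth, the exit code in the trace is a signed 32-bit rendering of a Windows NTSTATUS value. On a POSIX shell (e.g. Git Bash) it can be decoded like this:

```shell
# Re-interpret the signed exit code as an unsigned 32-bit NTSTATUS value.
printf '0x%08x\n' $(( (-1073741515) & 0xFFFFFFFF ))   # prints 0xc0000135
```

0xC0000135 is STATUS_DLL_NOT_FOUND, which as far as I can tell means the helper executable Hadoop shells out to for file permissions could not start because a dependent DLL was missing.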
Can anyone help me resolve this?
Thanks.
Answer 0 (score: 1)
The error message is similar to the one listed here: Hadoop error stalling job reduce process
The resolution there was to increase the heap size: add -Xmx512m to the variable named JVM_OPTS.
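A minimal sketch of what that looks like on Windows, assuming your mlcp.bat reads a JVM_OPTS environment variable (the variable name comes from the linked post; verify it against your copy of the script, as it can differ between MLCP releases):

```shell
:: Sketch only: raise the heap for the JVM that mlcp.bat launches.
:: JVM_OPTS is assumed from the linked post; check the name used in your mlcp.bat.
set JVM_OPTS=-Xmx512m
mlcp.bat import -host localhost -port 8891 -username admin -password admin ^
    -mode local -input_file_type archive -input_file_path /d:/NewFolder/
```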
HTH!
Answer 1 (score: 1)
I hit the same error while working through the MarkLogic University course XQuery II. The VM they provide for the course was very slow, so I set up my own Windows 7 VM in Parallels.
When I ran this command from Unit 6:
mlcp.bat import -host localhost -port 8012 -username admin -password admin -input_file_path C:\mls-developer-2\socialmedia\content\enriched -mode local -input_file_pattern "disqus.*\.xml" -output_uri_replace "C:/mls-developer-2/socialmedia/content/enriched, 'socialmedia/disqus'"
I saw the same error. I tried setting _JAVA_OPTIONS to increase the memory available to the JVM, as some posts suggested, but it had no effect.
I eventually changed the command to:
mlcp.bat import -host localhost -port 8012 -username admin -password admin -input_file_path "C:\mls-developer-2\socialmedia\content\enriched\*.xml" -mode local -output_uri_replace "C:/mls-developer-2/socialmedia/content/enriched, 'socialmedia/disqus'"
There was something it didn't like about -input_file_pattern "disqus.*\.xml". Your case doesn't look quite the same, but I thought I'd post this in case it helps someone else.
Answer 2 (score: 0)
Were the input documents produced by exporting content with MLCP? That is what -input_file_type archive is intended for. If that is not where the files came from, try -input_file_type documents -input_compressed true.
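For example, adapting the command from the question (a sketch only; verify the flags against the MLCP documentation for your release):

```shell
:: Sketch: treat the zips as compressed plain documents rather than an MLCP archive.
mlcp.bat import -host localhost -port 8891 -username admin -password admin ^
    -mode local -input_file_type documents -input_compressed true ^
    -input_file_path /d:/NewFolder/
```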