Error parsing parameter, Amazon AWS EMR

Date: 2016-07-12 10:06:44

Tags: hadoop amazon-s3 hive amazon-emr s3distcp

I am trying to create a step from the Linux console:

aws emr add-steps --cluster-id j-XXXXXXXXXX --steps Type=CUSTOM_JAR,Name="S3DistCp step",Jar=/home/hadoop/lib/emr-s3distcp-1.0.jar,\
Args=["--s3Endpoint,s3-eu-west-1.amazonaws.com","--src,s3://folder-name/logs/j-XXXXXXXXXX/node/","--dest,hdfs:///output","--srcPattern,.*[a-zA-Z,]+"]

It throws the following error:

Error parsing parameter '--steps': Expected: ',', received: '+' for input

How can I fix this?

I am looking for a way to upload multiple files to S3 and then run S3DistCp on them so that Hive on Amazon EMR can pick them up. Is there another way to do this?
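
For the upload part, a minimal sketch using the AWS CLI (the local path and bucket name below are placeholders):

# copy a whole local log directory to S3 in one go
aws s3 sync /var/log/myapp/ s3://your-bucket/logs/incoming/

# or, equivalently for a first upload, copy recursively
aws s3 cp /var/log/myapp/ s3://your-bucket/logs/incoming/ --recursive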

I have another question: right now I connect to Hive through an SSH tunnel; how can I connect to it from PHP?
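
For the tunnel itself, a minimal sketch (the key path and master node DNS are placeholders; 10000 is the default HiveServer2 port):

# forward local port 10000 to HiveServer2 on the EMR master node
ssh -i ~/mykey.pem -N -L 10000:localhost:10000 hadoop@ec2-xx-xx-xx-xx.eu-west-1.compute.amazonaws.com

A PHP process on the same machine can then point a Hive client (for example one based on Thrift or ODBC) at localhost:10000.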

For now I got rid of that error by removing "srcPattern", but it then gave me a different error; I have included a picture below.

[image: step error screenshot]

This is the error that appears:

INFO Synchronously wait child process to complete : hadoop jar /var/lib/aws/emr/step-runner/hadoop- 
INFO waitProcessCompletion ended with exit code 1 : hadoop jar
/var/lib/aws/emr/step-runner/hadoop-
INFO total process run time: 2 seconds
2016-07-12T14:26:48.744Z INFO Step created jobs:
2016-07-12T14:26:48.744Z WARN Step failed with exitCode 1 and took 2 seconds
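
These lines appear to come from the step's controller log and only say that the process exited with code 1; the underlying cause is usually in the step's stderr. Assuming the cluster writes its logs to S3, one way to pull it (the bucket, cluster id and step id below are placeholders) is:

# stream the step's stderr from the EMR log bucket and decompress it
aws s3 cp s3://your-log-bucket/logs/j-XXXXXXXXXX/steps/s-XXXXXXXXXXXX/stderr.gz - | gunzip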

THX !!!

1 answer:

Answer 0 (score: 1)

Try a JSON configuration:

[
    {
        "Name":"S3DistCp step",
        "Args":["s3-dist-cp","--s3Endpoint=s3.amazonaws.com","--src=s3://mybucket/logs/j-3GYXXXXXX9IOJ/node/","--dest=hdfs:///output","--srcPattern=.*[a-zA-Z,]+"],
        "ActionOnFailure":"CONTINUE",
        "Type":"CUSTOM_JAR",
        "Jar":"command-runner.jar"        
    }
]

aws emr add-steps --cluster-id j-3GYXXXXXX9IOK --steps file://./myStep.json
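
Putting the step definition in a file sidesteps the CLI's shorthand parser, which appears to be what was treating the comma inside the srcPattern regex .*[a-zA-Z,]+ as an argument separator and then failing on the trailing +. Once the step is submitted, its state can be checked with (cluster id and step id below are placeholders):

# show the current state of the submitted step (PENDING, RUNNING, COMPLETED or FAILED)
aws emr describe-step --cluster-id j-3GYXXXXXX9IOK --step-id s-XXXXXXXXXXXX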

http://docs.aws.amazon.com/emr/latest/ReleaseGuide/UsingEMR_s3distcp.html#UsingEMR_s3distcp.step