Cannot run the hadoop 1.2.1 examples on Mac OS X

Asked: 2014-02-03 11:19:39

Tags: macos hadoop

I installed hadoop 1.2.1 on an iMac running OS X 10.8.5, and after running jps I can see that all the expected processes have started. The problem is that whenever I try to run a map-reduce job, it fails with the repeated error: "Error: Can't connect to window server - not enough permissions."

These lines are in my hadoop-env.sh:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
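As I understand it, HADOOP_OPTS in hadoop-env.sh is picked up by the daemons (NameNode, JobTracker, TaskTracker, etc.), while the per-task child JVMs that the TaskTracker spawns get their flags from mapred.child.java.opts instead, so the headless flag may not be reaching the task JVMs at all. A sketch of what I believe the mapred-site.xml entry would look like (the -Xmx200m value is just the 1.x default, and carrying -Djava.awt.headless=true over here is my guess):

```xml
<!-- mapred-site.xml: flags for the child JVMs that run map/reduce tasks.
     Note: this value replaces the default, so the heap size must be repeated. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx200m -Djava.awt.headless=true</value>
</property>
```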

Here is the output I get:

bash-3.2$ hadoop jar hadoop-examples-1.2.1.jar pi 10 100
Warning: $HADOOP_HOME is deprecated.

Number of Maps  = 10
Samples per Map = 100
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
14/02/03 13:11:20 INFO mapred.FileInputFormat: Total input paths to process : 10
14/02/03 13:11:21 INFO mapred.JobClient: Running job: job_201402031302_0002
14/02/03 13:11:22 INFO mapred.JobClient:  map 0% reduce 0%
14/02/03 13:11:23 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000011_0, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000011_0: 2014-02-03 13:11:21.878 java[8245:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:24 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000011_1, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000011_1: 2014-02-03 13:11:22.627 java[8252:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:24 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000011_2, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000011_2: 2014-02-03 13:11:23.558 java[8269:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:26 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000010_0, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000010_0: 2014-02-03 13:11:25.353 java[8301:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:27 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000010_1, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000010_1: 2014-02-03 13:11:26.259 java[8309:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:28 INFO mapred.JobClient: Task Id : attempt_201402031302_0002_m_000010_2, Status : FAILED
Error: Can't connect to window server - not enough permissions.
attempt_201402031302_0002_m_000010_2: 2014-02-03 13:11:27.179 java[8325:1203] Unable to load realm info from SCDynamicStore
14/02/03 13:11:28 INFO mapred.JobClient: Job complete: job_201402031302_0002
14/02/03 13:11:28 INFO mapred.JobClient: Counters: 4
14/02/03 13:11:28 INFO mapred.JobClient:   Job Counters 
14/02/03 13:11:28 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=5846
14/02/03 13:11:28 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/02/03 13:11:28 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/02/03 13:11:28 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/02/03 13:11:28 INFO mapred.JobClient: Job Failed: JobCleanup Task Failure, Task: task_201402031302_0002_m_000010
java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
    at org.apache.hadoop.examples.PiEstimator.estimate(PiEstimator.java:297)
    at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:342)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:351)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

0 Answers:

No answers yet.