When I try to run my program against a remote Spark master, I get the following exception:
15/01/16 11:14:39 ERROR UserGroupInformation: PriviledgedActionException as:user1 (auth:SIMPLE) cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:115)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:163)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    ... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:127)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
I execute this driver class from one machine (say abc), while Spark is installed on another machine (xyz:7077). When I run the same driver class with the master set to "local", it works fine.
import java.io.IOException;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;

public class TestSpark implements Serializable {

    public static void main(String[] args) throws IOException {
        TestSpark test = new TestSpark();
        test.testReduce();
    }

    public void testReduce() throws IOException {
        // Connect to the remote standalone master on xyz.
        SparkConf conf = new SparkConf().setMaster("spark://xyz:7077").setAppName("Sample App");
        String[] pathToJar = {"/home/user1/Desktop/Jars/TestSpark.jar"};
        // Other variants I tried:
        //SparkConf conf = new SparkConf().setMaster("spark://abc:7077").setAppName("Sample App").setJars(pathToJar);
        //SparkConf conf = new SparkConf().setMaster("local").setAppName("Sample App");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Distribute the integers 1..499 and sum them with a reduce.
        List<Integer> data = new ArrayList<Integer>();
        for (int i = 1; i < 500; i++) {
            data.add(i);
        }
        System.out.println("Size : " + data.size());

        JavaRDD<Integer> distData = jsc.parallelize(data);
        Integer total = distData.reduce(new Function2<Integer, Integer, Integer>() {
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                return v1 + v2;
            }
        });
        System.out.println("-->" + total);

        jsc.stop();
    }
}
I also tried setting spark.driver.host to xyz and spark.driver.port to 7077 when creating the Spark context, but that did not help. A rough sketch of that attempt follows below.
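This is a minimal sketch of that attempt, not my exact code; the builder chain is reconstructed from what I described above, and "xyz" / "7077" are the values I tried:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: setting the driver host/port explicitly on the SparkConf.
// (This replaces the conf setup in testReduce() above.)
SparkConf conf = new SparkConf()
        .setMaster("spark://xyz:7077")
        .setAppName("Sample App")
        .set("spark.driver.host", "xyz")
        .set("spark.driver.port", "7077");
JavaSparkContext jsc = new JavaSparkContext(conf);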
Please advise.