MetaException (message: java.lang.IllegalArgumentException: java.net.UnknownHostException) when running a SQL query on PySpark

Date: 2015-01-23 15:42:15

Tags: java hive apache-spark apache-spark-sql pyspark

I am running pyspark in Docker. Basic operations with pyspark work fine, but when I try to execute a SQL query I get the exception below.

[IN] from pyspark.sql import SQLContext, HiveContext
     sqlContext = HiveContext(sc)
     rdd = sqlContext.parquetFile("/2014122x.parquet")
     rdd.count()
[OUT] 53855299L

However, when I execute the following query, I get this error:

[IN] rdd.registerAsTable("tweets")
     sqlContext.sql("drop table if exists tweets_filtered")
     %time sqlContext.sql("create table tweets_filtered as "\
            +" select floor(cast(timestamp_ms as decimal)/(900*1000)) as ts, source"\
            +", user.geo_enabled, user.followers_count, user.friends_count, user.id"\
            +", user.lang, user.location, user.verified"\
            +" from tweets a where timestamp_ms is not null")
     sqlContext.cacheTable("tweets_filtered")

    Py4JJavaError: An error occurred while calling o18.sql.
: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.lang.IllegalArgumentException: java.net.UnknownHostException: 3f8c07a0e645)
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:309)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
    at org.apache.spark.sql.hive.execution.DropTable.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.hive.execution.DropTable.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
    at org.apache.spark.sql.hive.execution.DropTable.execute(commands.scala:51)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
    at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:744)

Can anyone help me resolve this issue?

1 Answer:

Answer 0 (score: 0)

If you recently upgraded Docker (or perhaps migrated from boot2docker to docker-machine), the metastore_db in your working directory may have been created with the hostname of the old Docker VM (3f8c07a0e645?).
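A quick way to confirm the mismatch (not part of the original answer, just an illustrative check) is to compare the hostname of the container the driver is currently running in against the hostname from the UnknownHostException in the stack trace above:

    import socket

    # Hostname of the container the pyspark driver is currently running in
    current_host = socket.gethostname()

    # Hostname the stale metastore is still pointing at (copied from the stack trace)
    stale_host = "3f8c07a0e645"

    print("current container:", current_host)
    print("metastore expects:", stale_host)
    print("mismatch:", current_host != stale_host)

If the two differ, the metastore is referencing a host that no longer resolves, which is exactly what the exception reports.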

To fix this in my Docker setup, I deleted metastore_db entirely and let it be recreated; when I ran my commands again, everything worked. There may be a better way to handle it.
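For reference, a minimal sketch of that cleanup, assuming the default embedded Derby metastore that HiveContext creates in the driver's working directory (the paths are assumptions; adjust them to your setup). Run it before starting a fresh pyspark session:

    import os
    import shutil

    # Assumed locations: the embedded Derby metastore and its log file,
    # created in the working directory pyspark was started from.
    metastore_dir = os.path.join(os.getcwd(), "metastore_db")
    derby_log = os.path.join(os.getcwd(), "derby.log")

    if os.path.isdir(metastore_dir):
        shutil.rmtree(metastore_dir)   # remove the metastore tied to the old hostname
    if os.path.isfile(derby_log):
        os.remove(derby_log)

    # Restart pyspark afterwards; HiveContext will recreate metastore_db on first use,
    # this time with the current container's hostname.

Note that this throws away any table definitions stored in the old metastore, so it is only appropriate when the metastore contents are disposable.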