How do I connect to Hive, installed on an EC2 instance, from AWS Glue?

Date: 2019-03-27 14:22:51

Tags: amazon-web-services amazon-ec2 hive pyspark aws-glue

I want to access a Hive metastore by running a Spark job on AWS Glue. Doing so requires me to supply the IP of the Hive instance and reach it over the network. This works locally, but not from AWS Glue.

I tried to access Hive with the following code:

spark_session = (
    glueContext.spark_session
    .builder
    .appName('example-pyspark-read-and-write-from-hive')
    # a key/value pair and a conf= object must not be mixed in a single
    # .config() call; Spark ignores the key/value when conf= is given
    .config("hive.metastore.uris", "thrift://172.16.12.34:9083")
    .enableHiveSupport()
    .getOrCreate()
)

I have also gone through various documentation, but none of it explains how to connect to an EC2 instance on a specific port.
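(As far as I can tell, the port is not configured separately at all; it travels inside the thrift URI. A sketch of supplying that setting through a Glue job parameter, with hypothetical values and not verified against a real cluster:)

```
--conf spark.hadoop.hive.metastore.uris=thrift://172.16.12.34:9083
```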

The full job script is:

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

"""
SparkSession ss = SparkSession
.builder()
.appName(" Hive example")
.config("hive.metastore.uris", "thrift://localhost:9083")
.enableHiveSupport()
.getOrCreate();
"""
args = getResolvedOptions(sys.argv, ['JOB_NAME'])
sc = SparkContext()
glueContext = GlueContext(sc)
spark_session = (
    glueContext.spark_session
    .builder
    .appName('example-pyspark-read-and-write-from-hive')
    # set the metastore URI as a plain key/value; do not also pass conf=
    .config("hive.metastore.uris", "thrift://172.16.12.34:9083")
    .enableHiveSupport()
    .getOrCreate()
)
job = Job(glueContext)
job.init(args['JOB_NAME'], args)
data = [('First', 1), ('Second', 2), ('Third', 3), ('Fourth', 4), ('Fifth', 5)]
df = spark_session.createDataFrame(data)
df.write.saveAsTable('example_2')
job.commit()

I expect the table to be written to Hive, but instead I get the following error from Glue:

An error occurred while calling o239.saveAsTable. No Route to Host from ip-172-31-14-64/172.31.14.64 to ip-172-31-15-11.ap-south-1.compute.internal:8020 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; 
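The `NoRouteToHostException` is a network failure rather than a Spark or Hive one, and notably it occurs on port 8020 (the HDFS NameNode default), not only the metastore's 9083, so the Glue job's VPC subnet and security groups need to allow both. A minimal stdlib sketch for checking TCP reachability from a host in the same network (the helper name and target hosts are illustrative):

```python
import socket


def is_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Illustrative targets taken from the question and the error above:
# is_port_open("172.16.12.34", 9083)   # Hive metastore thrift port
# is_port_open("172.31.15.11", 8020)   # HDFS NameNode port from the error
```

If this check fails from a peer in the same subnet as the Glue connection, the fix lies in security groups or route tables, not in the Spark session configuration.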

0 Answers:

No answers