I am testing a data flow with Kafka, Druid and Superset.
I already have some data in Druid (see picture 1).
After that, I can generate the Druid datasources in Superset via the "Refresh Druid Metadata" option (see picture 2). The problem is that when I want to query the data, I get this error message:
URLError: <urlopen error [Errno -2] Name or service not known>
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/superset/viz.py", line 329, in get_df_payload
    df = self.get_df(query_obj)
  File "/usr/lib/python2.7/site-packages/superset/viz.py", line 142, in get_df
    self.results = self.datasource.query(query_obj)
  File "/usr/lib/python2.7/site-packages/superset/connectors/druid/models.py", line 1238, in query
    client=client, query_obj=query_obj, phase=2)
  File "/usr/lib/python2.7/site-packages/superset/connectors/druid/models.py", line 959, in get_query_str
    return self.run_query(client=client, phase=phase, **query_obj)
  File "/usr/lib/python2.7/site-packages/superset/connectors/druid/models.py", line 1126, in run_query
    client.timeseries(**qry)
  File "/usr/lib/python2.7/site-packages/pydruid/client.py", line 167, in timeseries
    return self._post(query)
  File "/usr/lib/python2.7/site-packages/pydruid/client.py", line 484, in _post
    res = urllib.request.urlopen(req)
  File "/usr/lib64/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib64/python2.7/urllib2.py", line 431, in open
    response = self._open(req, data)
  File "/usr/lib64/python2.7/urllib2.py", line 449, in _open
    '_open', req)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 1244, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib64/python2.7/urllib2.py", line 1214, in do_open
    raise URLError(err)
URLError: <urlopen error [Errno -2] Name or service not known>
Also see picture 3.
Any idea what the problem could be?
I am feeding Kafka via NiFi, and then I connect a Kafka source in SAM to a Druid sink in SAM.
Thanks!
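For completeness: the same URLError shows up with pydruid alone, outside Superset, whenever the broker host it is pointed at cannot be resolved. A rough sketch of what I mean (the host name, port and datasource below are placeholders, not my actual configuration):

# Rough reproduction of the error outside Superset.
# "no-such-broker", 8082 and "my_datasource" are placeholders.
from pydruid.client import PyDruid
from pydruid.utils.aggregators import longsum

client = PyDruid('http://no-such-broker:8082', 'druid/v2')

try:
    client.timeseries(
        datasource='my_datasource',
        granularity='all',
        intervals='2018-01-01/2019-01-01',
        aggregations={'count': longsum('count')},
    )
except Exception as exc:
    # When the host name does not resolve, this prints the same
    # "<urlopen error [Errno -2] Name or service not known>" as above.
    print(exc)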
Answer 0 (score: 1)
It seems that Superset has a problem connecting to your Broker node. Check the cluster health, in particular the Broker and Coordinator node logs.
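As a quick check from the host Superset runs on, you can also verify that the broker host name resolves at all and that the broker answers on its status endpoint. A small sketch, assuming the default broker port 8082 (adjust the host and port to your cluster):

# Reachability check for the Druid broker from the Superset host.
# "broker-host" is a placeholder; 8082 is Druid's default broker port.
import socket
import urllib.request

broker_host = 'broker-host'
broker_port = 8082

try:
    socket.getaddrinfo(broker_host, broker_port)
except socket.gaierror as exc:
    # "[Errno -2] Name or service not known" here means the same
    # DNS problem that Superset is hitting.
    print('Cannot resolve %s: %s' % (broker_host, exc))
else:
    url = 'http://%s:%d/status' % (broker_host, broker_port)
    with urllib.request.urlopen(url, timeout=5) as resp:
        print(resp.read())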
Answer 1 (score: 0)
Problem solved. The issue was that no broker host was defined in the cluster configuration in the Superset UI. I set it to the value localhost and now it is up and running.
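One way to confirm a configuration like this is to ask the broker directly which datasources it can serve; a small sketch, assuming the broker really is reachable on localhost:8082 (the default broker port):

# Ask the broker which datasources it serves.
# Assumes the broker runs on localhost:8082 (default broker port).
import urllib.request

url = 'http://localhost:8082/druid/v2/datasources'
with urllib.request.urlopen(url, timeout=5) as resp:
    # Should print a JSON list containing the datasource ingested from Kafka.
    print(resp.read().decode('utf-8'))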