Apache Spark - ImportError: No module named _winreg

Asked: 2015-11-02 12:32:13

Tags: python apache-spark pyspark

A script that had been working for me stopped working entirely about a week ago. The problem occurs when I compile a lambda function that I later use to create my RDD.

Consider the following code:

from types import FunctionType

from pyspark import SparkContext


class RDDUtils(object):
    @staticmethod
    def map_builder(*fields):
        """
        Creates a compiled lambda function for use in spark keyBy using the specified field names
        :param fields: The name of the fields to create the function with
        :return: A compiled python function
        """
        func = FunctionType(
            compile("lambda x: {" + ',\n'.join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}",
                    "<string>",
                    "eval"), {})
        return func()

    @staticmethod
    def rdd_creator(context, fields, source_file, delim='\t'):
        """
        Method which creates an RDD
        :param context: spark context
        :param fields: fields / columns in our csv file
        :param source_file: location of csv file
        :return: RDD
        """
        build = RDDUtils.map_builder(*fields)
        rdd = context.textFile(source_file).map(lambda x: x.split(delim)).map(build)
        return rdd


rdd = RDDUtils()

sc = SparkContext('local', 'demo1')
fields = ['username', 'full_name', 'src_id']
source_file = '/home/aaron/dim_operator.csv'

create_rdd = rdd.rdd_creator(sc, fields, source_file)

print create_rdd.first()
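The compile/FunctionType trick above can be exercised outside Spark; a minimal standalone sketch (the field names and sample row here are made up for illustration):

```python
from types import FunctionType


def map_builder(*fields):
    # Builds source such as: lambda x: {"username": x[0], "full_name": x[1]}
    src = "lambda x: {" + ", ".join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}"
    # compile(..., "eval") yields a code object for the expression; wrapping it
    # in FunctionType and calling it evaluates the expression, returning the lambda
    return FunctionType(compile(src, "<string>", "eval"), {})()


mapper = map_builder("username", "full_name", "src_id")
print(mapper(["dev", "main user", "1"]))
```

Applied to the sample row, the mapper yields a dict keyed by the field names in order.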

This produces the following traceback:

File "/usr/lib/python2.7/dist-packages/six.py", line 116, in __getattr__
    _module = self._resolve()
  File "/usr/lib/python2.7/dist-packages/six.py", line 105, in _resolve
    return _import_module(self.mod)
  File "/usr/lib/python2.7/dist-packages/six.py", line 76, in _import_module
    __import__(name)
ImportError: No module named _winreg

What could be causing this to suddenly stop working?

Running on Ubuntu 14.04.3.

I worked around this by calling the lambda explicitly and wrapping eval() around the string that I build dynamically, without the compiled lambda.

Updated code below:

from pyspark import SparkContext


class RDDUtils(object):

    @staticmethod
    def map_builder(*fields):
        lambda_dict = "{" + ','.join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}"
        return lambda_dict

    @staticmethod
    def rdd_creator(context, fields, source_file, delim='\t'):
        """
        Creates an RDD
        """
        build = RDDUtils.map_builder(*fields)
        rdd = context.textFile(source_file).map(lambda x: x.split(delim)).map(lambda x: eval(build))
        return rdd


if __name__ == "__main__":

    rdd = RDDUtils()

    sc = SparkContext('local', 'demo1')
    fields = ['username', 'full_name', 'src_id']
    source_file = '/home/aaron/dim_operator.csv'

    create_rdd = rdd.rdd_creator(sc, fields, source_file)

    print create_rdd.first()
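The eval-based workaround can likewise be checked in plain Python without a Spark context; a standalone sketch (the sample row is invented):

```python
def map_builder(*fields):
    # Produces a dict-literal string such as: {"username": x[0],"src_id": x[1]}
    return "{" + ",".join('"{}": x[{}]'.format(c, i) for i, c in enumerate(fields)) + "}"


build = map_builder("username", "full_name", "src_id")
row = "dev\tmain user\t1".split("\t")
# eval() resolves the name x from the lambda's local scope,
# exactly as it does inside .map(lambda x: eval(build))
record = (lambda x: eval(build))(row)
print(record)
```

Because the string is only evaluated with eval() at call time, no code object has to be compiled up front.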

The expected result follows:

{'username': u'dev', 'src_id': u'1', 'full_name': u'main user'}

Edit: full traceback below...

Traceback (most recent call last):
  File "/home/aaron/apps/pycharm-3.0.2/helpers/pydev/pydevd.py", line 1532, in <module>
    debugger.run(setup['file'], None, None)
  File "/home/aaron/apps/pycharm-3.0.2/helpers/pydev/pydevd.py", line 1143, in run
    pydev_imports.execfile(file, globals, locals) #execute the script
  File "/home/aaron/PycharmProjects/fetl/dim_operator.py", line 127, in <module>
    print create_rdd.first()
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 1242, in first
    rs = self.take(1)
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 1194, in take
    totalParts = self._jrdd.partitions().size()
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 2288, in _jrdd
    pickled_cmd, bvars, env, includes = _prepare_for_python_RDD(self.ctx, command, self)
  File "/home/aaron/apps/spark/python/pyspark/rdd.py", line 2206, in _prepare_for_python_RDD
    pickled_command = ser.dumps(command)
  File "/home/aaron/apps/spark/python/pyspark/serializers.py", line 411, in dumps
    return cloudpickle.dumps(obj, 2)
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 816, in dumps
    cp.dump(obj)
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 133, in dump
    return pickle.Pickler.dump(self, obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 562, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 254, in save_function
    self.save_function_tuple(obj, [themodule])
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 304, in save_function_tuple
    save((code, closure, base_globals))
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "/usr/lib/python2.7/pickle.py", line 633, in _batch_appends
    save(x)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 254, in save_function
    self.save_function_tuple(obj, [themodule])
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 304, in save_function_tuple
    save((code, closure, base_globals))
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "/usr/lib/python2.7/pickle.py", line 636, in _batch_appends
    save(tmp[0])
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/aaron/apps/spark/python/pyspark/cloudpickle.py", line 209, in save_function
    modname = pickle.whichmodule(obj, name)
  File "/usr/lib/python2.7/pickle.py", line 817, in whichmodule
    if name != '__main__' and getattr(module, funcname, None) is func:
  File "/usr/lib/python2.7/dist-packages/six.py", line 116, in __getattr__
    _module = self._resolve()
  File "/usr/lib/python2.7/dist-packages/six.py", line 105, in _resolve
    return _import_module(self.mod)
  File "/usr/lib/python2.7/dist-packages/six.py", line 76, in _import_module
    __import__(name)
ImportError: No module named _winreg

1 Answer:

Answer 0 (score: 0)

This error is caused by your operating system: _winreg does not work on Linux (Ubuntu). It is a Windows-only module.
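As the traceback shows, the crash fires while cloudpickle's whichmodule scan walks sys.modules looking for the home module of the dynamically built function, which touches six's lazy _winreg shim. One way to sidestep dynamic code objects entirely is to build the mapper as an ordinary closure; this is a hedged alternative sketch, not the fix from the answer above:

```python
def map_builder(*fields):
    # A plain closure over the field names: no compile()/FunctionType,
    # so cloudpickle serializes it like any other user-defined function
    def mapper(x):
        return {c: x[i] for i, c in enumerate(fields)}
    return mapper


build = map_builder("username", "full_name", "src_id")
print(build(["dev", "main user", "1"]))
```

Whether this avoids the six interaction on a given setup is environment-dependent, but it removes the dynamically generated lambda from the pickling path.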

https://docs.python.org/release/2.1.2/lib/module--winreg.html

    Availability: Windows.

    New in version 2.0.

    These functions expose the Windows registry API to Python. Instead of using an integer as the registry handle, a handle object is used to ensure that the handles are closed correctly, even if the programmer neglects to explicitly close them.

    This module exposes a very low-level interface to the Windows registry; it is expected that in the future a new winreg module will be created, offering a higher-level interface to the registry API.