Reading a file with hdfs3 fails

Date: 2016-10-19 13:55:52

Tags: python hadoop hdfs

I am trying to read a file on HDFS from Python using the hdfs3 module.

import hdfs3
hdfs = hdfs3.HDFileSystem(host='xxx.xxx.com', port=12345)
hdfs.ls('/projects/samplecsv/part-r-00000')

This produces

[{'block_size': 134345348,
  'group': 'supergroup',
  'kind': 'file',
  'last_access': 1473453452,
  'last_mod': 1473454723,
  'name': '/projects/samplecsv/part-r-00000/',
  'owner': 'dr',
  'permissions': 420,
  'replication': 3,
  'size': 98765631}]

So it seems able to access HDFS and read the directory structure. However, reading the file fails.

with hdfs.open('/projects/samplecsv/part-r-00000', 'rb') as f:
    print(f.read(100))

gives

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-94-46f0db8e87dd> in <module>()
      1 with hdfs.open('/projects/samplecsv/part-r-00000', 'rb') as f:
----> 2     print(f.read(100))

/anaconda3/lib/python3.5/site-packages/hdfs3/core.py in read(self, length)
    615                     length -= ret
    616                 else:
--> 617                     raise IOError('Read file %s Failed:' % self.path, -ret)
    618 
    619         return b''.join(buffers)
OSError: [Errno Read file /projects/samplecsv/part-r-00000 Failed:] 1

What could be the problem? I am using Python 3.5.

2 Answers:

Answer 0 (score: 2)

If you want to perform any operation on a file, you have to pass the full path to the file itself, not to its parent directory.

import hdfs3
hdfs = hdfs3.HDFileSystem(host='xxx.xxx.com', port=12345)
hdfs.ls('/projects/samplecsv/part-r-00000')

# upload a file to that location first
hdfs.put('local-file.txt', '/projects/samplecsv/part-r-00000')

with hdfs.open('/projects/samplecsv/part-r-00000/local-file.txt', 'rb') as f:
    print(f.read(100))
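A quick way to confirm whether a path points at a file or a directory is to inspect the `kind` field in the records that `hdfs.ls` returns, as shown in the question's output. A minimal sketch of such a check, using hard-coded sample records in place of a live cluster (the record contents here, and the `'directory'` kind value, are assumptions for illustration):

```python
# Sample records in the shape shown in the question's ls() output;
# on a live cluster these would come from hdfs.ls('/projects/samplecsv').
records = [
    {'name': '/projects/samplecsv/part-r-00000', 'kind': 'directory', 'size': 0},
    {'name': '/projects/samplecsv/part-r-00000/local-file.txt', 'kind': 'file', 'size': 98765631},
]

# Only entries whose kind is 'file' can be opened and read directly.
readable = [r['name'] for r in records if r['kind'] == 'file']
print(readable)
```

If the path you are opening shows up as a directory rather than a file, `hdfs.open` on it will fail just as in the question.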

Answer 1 (score: 1)

If you want to read multiple files from an HDFS directory, you can try the following example:

import os
import hdfs3

hdfs = hdfs3.HDFileSystem(host='xxx.xxx.com', port=12345)
hdfs.ls('/projects/samplecsv/part-r-00000')

# upload a file to the location first if it is not already present
hdfs.put('local-file.txt', '/projects/samplecsv/part-r-00000')

file_loc = '/projects/samplecsv/part-r-00000'
for file in hdfs.glob(os.path.join(file_loc, '*.txt')):
    with hdfs.open(file, 'rb') as f:
        print(f.read(100))
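The `hdfs.glob` call above matches a shell-style pattern against paths in the directory. The matching behaviour can be illustrated locally with `fnmatch` from the standard library (the file names below are made up for illustration; note that `posixpath.join` is used so the pattern keeps forward slashes even on Windows, where `os.path.join` would insert backslashes):

```python
import fnmatch
import posixpath

# Hypothetical directory listing; hdfs.glob performs this kind of
# pattern filtering over the actual HDFS entries.
names = [
    '/projects/samplecsv/part-r-00000/local-file.txt',
    '/projects/samplecsv/part-r-00000/notes.md',
    '/projects/samplecsv/part-r-00000/data.txt',
]

pattern = posixpath.join('/projects/samplecsv/part-r-00000', '*.txt')
matches = [n for n in names if fnmatch.fnmatch(n, pattern)]
print(matches)
```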