HDFS: Errno 22 when trying to edit an existing file on a mounted NFS volume

Asked: 2018-08-21 07:15:25

Tags: hadoop hdfs nfs jupyter-lab

Summary: I have an HDFS NFS volume mounted on OSX, and it does not let me edit existing files. I can append and create files with content, but I cannot "open them with the write flag".

Originally I asked about a specific problem with JupyterLab failing to save notebooks to the NFS-mounted volume, but while trying to dig down to the root cause I realized (hopefully correctly) that this is really about editing existing files.

I have the HDFS NFS gateway mounted on OSX and can access the files, both reading and writing. JupyterLab can do pretty much everything, except actually saving notebooks.

I was able to narrow down the pattern of what is actually going on, and the problem boils down to this: you cannot open an existing file on the NFS volume for writing.

This works the first time, while the file does not exist yet:

with open("rand.txt", 'w') as f:
    f.write("random text")

However, if you try to run it again (the file has already been created and the content is there), you get the following exception:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-15-94a46812fad4> in <module>()
----> 1 with open("rand.txt", 'w') as f:
      2     f.write("random text")

OSError: [Errno 22] Invalid argument: 'rand.txt'
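To narrow down which operation the gateway rejects, a small diagnostic helps: Python's 'w' mode maps to O_WRONLY|O_CREAT|O_TRUNC, so the sketch below (my own experiment, not part of the original setup) compares a truncating open against an appending one on the same existing file.

import errno
import os

PATH = "rand.txt"  # an existing file on the NFS mount

# 'w' in open() translates to O_WRONLY | O_CREAT | O_TRUNC; append skips the
# truncation, so comparing the two isolates which flag the gateway rejects.
for label, flags in [
    ("truncate", os.O_WRONLY | os.O_CREAT | os.O_TRUNC),
    ("append", os.O_WRONLY | os.O_CREAT | os.O_APPEND),
]:
    try:
        fd = os.open(PATH, flags)
        os.write(fd, b"random text\n")
        os.close(fd)
        print(label, "-> OK")
    except OSError as e:
        print(label, "-> failed:", errno.errorcode.get(e.errno, e.errno), e)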

I am pretty sure this is not about permissions; everything else works fine:

with open("seven.txt", 'w') as f:
    f.write("random text")
    f.writelines(["one","two","three"])

r = open("seven.txt", 'r')
print(r.read())
  

random textonetwothree

I can also append to the same file from the shell:

aleksandrs-mbp:direct sasha$ echo "Another line of text" >> seven.txt && cat seven.txt 
random textonetwothreeAnother line of text

I mount it with the following options:

aleksandrs-mbp:hadoop sasha$ mount -t nfs -o vers=3,proto=tcp,nolock,noacl,sync localhost:/ /srv/ti/jupyter-samples/~Hadoop

The Apache documentation says that the NFS gateway does not support random writes. I tried looking through the mount documentation for something specific that would force sequential writes, but could not find anything. I played with different options, but it does not seem to help much.
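Since creating new files and appending both work, one workaround I am considering (a sketch under the assumption that only in-place truncation is rejected; the helper name is mine) is to delete the old file and write a fresh one, instead of opening the existing file with 'w':

import os

def replace_file(path, data):
    """Overwrite a file on the HDFS NFS mount without truncating it in place.

    Assumption: the gateway allows deleting and creating files, it just
    rejects the size-changing SETATTR that a plain open(path, 'w') issues.
    """
    if os.path.exists(path):
        os.remove(path)            # drop the old copy instead of truncating it
    with open(path, "w") as f:     # 'w' now creates a brand-new file
        f.write(data)

replace_file("rand.txt", "random text")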

This is the exception I get from JupyterLab when it tries to save a notebook:

[I 03:03:33.969 LabApp] Saving file at /~Hadoop/direct/One.ipynb
[E 03:03:33.980 LabApp] Error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
    Traceback (most recent call last):
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/filemanager.py", line 471, in save
        self._save_notebook(os_path, nb)
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 293, in _save_notebook
        with self.atomic_writing(os_path, encoding='utf-8') as f:
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/contextlib.py", line 82, in __enter__
        return next(self.gen)
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 213, in atomic_writing
        with atomic_writing(os_path, *args, log=self.log, **kwargs) as f:
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/contextlib.py", line 82, in __enter__
        return next(self.gen)
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 103, in atomic_writing
        copy2_safe(path, tmp_path, log=log)
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 51, in copy2_safe
        shutil.copyfile(src, dst)
      File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/shutil.py", line 115, in copyfile
        with open(dst, 'wb') as fdst:
    OSError: [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
[W 03:03:33.981 LabApp] 500 PUT /api/contents/~Hadoop/direct/One.ipynb?1534835013966 (::1): Unexpected error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
[W 03:03:33.981 LabApp] Unexpected error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
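The traceback shows the failure inside the notebook server's atomic-save path, which first copies the notebook to a .~-prefixed temporary file and then replaces the original. If that copy step is what trips over the gateway, one thing that might be worth trying (my assumption, not something I have verified on this mount) is turning off atomic writes in the Jupyter configuration:

# jupyter_notebook_config.py
# Assumption: with atomic writing disabled, the contents manager writes the
# notebook directly instead of going through the temporary-copy-and-replace
# step that fails above.
c.FileContentsManager.use_atomic_writing = False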

And this is what I see in the NFS gateway logs at the same time:

2018-08-21 03:05:34,006 ERROR org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3: Setting file size is not supported when setattr, fileId: 16417
2018-08-21 03:05:34,006 ERROR org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3: Setting file size is not supported when setattr, fileId: 16417

Not sure exactly what this means, but if I understand the RFC correctly, this should be part of the implementation:

"Servers must support extending the file size via SETATTR."
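If the log message really is about size-changing SETATTR calls, then an explicit truncate on the mounted path should fail the same way; a quick check I could run (just a guess at the mechanism, the file name is from the earlier example):

import os

# If open(path, 'w') fails because it asks the gateway to shrink the file to
# zero bytes, an explicit truncate should hit the same error.
try:
    os.truncate("seven.txt", 0)
except OSError as e:
    print("truncate failed:", e)  # expecting [Errno 22] Invalid argument here too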

I understand the complexity behind mounting HDFS and letting clients write whatever they want, while keeping these files distributed and maintaining their integrity. Is there a compromise that would make writes over NFS possible?

0 Answers:

No answers yet.