Directory problem in Colab with a Jupyter notebook project uploaded to Google Drive

Date: 2019-07-29 20:38:04

Tags: jupyter-notebook google-colaboratory

I uploaded my Jupyter project folder to my Google Drive. The project folder contains the notebook file, a supporting tf_utils.py module, and various datasets in a subfolder that the notebook reads. At the top of the notebook I added a statement to mount the Drive folder and then appended my project folder to sys.path. The tf_utils.py module loads fine, but the load_dataset function in this module, which loads the datasets from the /datasets subfolder, then fails:

from google.colab import drive
drive.mount('/content/gdrive', force_remount=True)
#### Update ####
### The following two lines caused the issue:
import sys
sys.path.append('/content/gdrive/My Drive/Tensorflow_Tutorial/')
### Replacing them with these two lines solved it:
import os
os.chdir('/content/gdrive/My Drive/Tensorflow_Tutorial')  # change the working directory to the project folder
###
import math
import numpy as np
import h5py
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.python.framework import ops
from tf_utils import load_dataset, random_mini_batches, convert_to_one_hot, predict
from datetime import datetime

# Load the dataset
X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes = load_dataset()
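
To spell out what the update comments describe: sys.path.append only tells Python where to look for importable modules, which is why tf_utils imported without trouble. A relative file path such as 'datasets/train_signs.h5', on the other hand, is resolved against the current working directory, which in a fresh Colab runtime is /content rather than the project folder. A minimal check illustrating this, assuming a standard Colab runtime:

import os
print(os.getcwd())                                 # '/content' in a fresh Colab runtime
print(os.path.exists('datasets/train_signs.h5'))   # False until os.chdir() into the project folder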

But when I ran this cell (with the original sys.path.append lines), I got the following error:

OSError                                   Traceback (most recent call last)
<ipython-input-32-2058ac3cd5de> in <module>()
----> 1 X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes = load_dataset()

2 frames
/content/gdrive/My Drive/Tensorflow_Tutorial/tf_utils.py in load_dataset()
      5 
      6 def load_dataset():
----> 7     train_dataset = h5py.File('datasets/train_signs.h5', "r")
      8     train_set_x_orig = np.array(train_dataset["train_set_x"][:]) # your train set features
      9     train_set_y_orig = np.array(train_dataset["train_set_y"][:]) # your train set labels

/usr/local/lib/python3.6/dist-packages/h5py/_hl/files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, **kwds)
    310             with phil:
    311                 fapl = make_fapl(driver, libver, **kwds)
--> 312                 fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
    313 
    314                 if swmr_support:

/usr/local/lib/python3.6/dist-packages/h5py/_hl/files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    140         if swmr and swmr_support:
    141             flags |= h5f.ACC_SWMR_READ
--> 142         fid = h5f.open(name, flags, fapl=fapl)
    143     elif mode == 'r+':
    144         fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/h5f.pyx in h5py.h5f.open()

OSError: Unable to open file (unable to open file: name = 'datasets/train_signs.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

Yet when I list the /datasets subfolder, the data files are clearly there:

!ls /content/gdrive/My\ Drive/Tensorflow_Tutorial/datasets/*.*

'/content/gdrive/My Drive/Tensorflow_Tutorial/datasets/test_signs.h5'
'/content/gdrive/My Drive/Tensorflow_Tutorial/datasets/train_signs.h5'

Any guidance is appreciated, as I am not an expert yet. I wondered whether the space in 'My Drive' was causing the load_dataset() call to fail; I tried renaming 'My Drive' to 'My_Drive', but apparently that name cannot be changed.
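
For what it's worth, the space in 'My Drive' turned out not to be the problem: once the path is quoted, both Python and the shell handle it fine. A small sanity check, assuming the Drive mount above succeeded:

import os
project = '/content/gdrive/My Drive/Tensorflow_Tutorial'   # the space is harmless when quoted
print(os.path.isdir(project))                              # True if the mount worked
print(os.path.exists(os.path.join(project, 'datasets', 'train_signs.h5')))  # True: the file is there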

0 Answers
