Cannot serialize a bytes object larger than 4 GiB in deep learning

Time: 2019-03-10 10:26:27

Tags: tensorflow keras neural-network deep-learning google-colaboratory

I am building a Siamese network for image verification with Keras in a Google Colab environment, using this code from GitHub. But when I run the pickle.dump call, I get an error:

import os
import pickle
with open(os.path.join(save_path,"train.pickle"), "wb") as f:
    pickle.dump((X,c),f)

The error message is:

---------------------------------------------------------------------------
OverflowError                             Traceback (most recent call last)
<ipython-input-7-af9d0618d385> in <module>()
      3 
      4 with open(os.path.join(save_path,"train.pickle"), "wb") as f:
----> 5         pickle.dump((X,c),f)
      6 
      7 

OverflowError: cannot serialize a bytes object larger than 4 GiB

I found some related questions on this site, but none of the answers helped. How can I fix this error?

1 answer:

Answer 0 (score: 1)

Use pickle with protocol=4, e.g.

    pickle.dump((X, c), f, protocol=4)
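
For context: pickle protocols up to 3 store the length of a bytes object in a 4-byte field, hence the 4 GiB ceiling; protocol 4 (PEP 3154, available since Python 3.4 and the default since Python 3.8) uses 8-byte length fields and lifts that limit. Below is a minimal runnable sketch of the fix; the values of X, c, and save_path are small dummy stand-ins (assumptions), since the question's real variables come from the linked GitHub code.

    import os
    import pickle
    import numpy as np
    
    # Dummy stand-ins (assumption) for the question's variables: in the
    # original code, X would be the stacked training-image array and c
    # a mapping of class names to indices.
    X = np.zeros((2, 105, 105, 1), dtype=np.uint8)
    c = {"class_a": [0, 1]}
    save_path = "."  # assumption: any writable directory
    
    with open(os.path.join(save_path, "train.pickle"), "wb") as f:
        # protocol=4 writes 8-byte length prefixes, so objects larger
        # than 4 GiB serialize without the OverflowError.
        pickle.dump((X, c), f, protocol=4)
    
    with open(os.path.join(save_path, "train.pickle"), "rb") as f:
        # No flag is needed when loading; pickle reads the protocol
        # version from the stream header.
        X_loaded, c_loaded = pickle.load(f)

Note that a file written with protocol=4 can only be read on Python 3.4 or newer, so both the Colab session that writes train.pickle and any environment that later loads it must meet that version.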