git clone aborts due to possible repository corruption on the remote side, even though the memory settings are configured correctly

Date: 2015-03-02 13:19:51

Tags: git github


I am able to pull and push my code to this repository, but when I try to clone it on another machine it fails with the error below.

Here are my .gitconfig settings:

[pack]
    windowMemory = 1000m
    packSizeLimit = 1000m
    threads = 1
    window = 0

Error:

   Cloning into 'auto_shop'...
    stdin: is not a tty
    remote: Counting objects: 3043, done.
    remote: Compressing objects: 100% (2872/2872), done.
    error: pack-objects died of signal 94.62 MiB | 89.00 KiB/s
    error: git upload-pack: git-pack-objects died with error.
    fatal: git upload-pack: aborting due to possible repository corruption on the remote side.
    fatal: early EOF:  31% (966/3043), 5.68 MiB | 223.00 KiB/s
    remote: aborting due to possible repository corruption on the remote side.
    fatal: index-pack failed

Also, git fsck does not show any errors:

# git fsck
Checking object directories: 100% (256/256), done.
Checking objects: 100% (2218/2218), done.
dangling commit 7ae478bea3aa6c42cc8fe865c9fc26b35ea9e15d
dangling commit a657b57b65f63f4ffea1c25c77ff62c94471d41a
dangling commit 3c9ef0ff7818812f506fa1d18ef4af4a90a4938d
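
Since pack-objects is dying on the remote, I assume the same check would have to be run on the server copy as well. A sketch of what I have in mind, assuming SSH access to the server-side bare repository (host and path are only examples):

    ssh git@myserver                 # hypothetical host
    cd /srv/git/auto_shop.git        # hypothetical path to the bare repository
    git fsck --full                  # verify object integrity on the remote copy
    git count-objects -v             # check for unusually large packs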

Please help me resolve this issue.

2 answers:

Answer 0 (Score: 18):

It worked: I set the same configuration on the remote side as well, and the clone now succeeds.

git config --global pack.windowMemory "100m"
git config --global pack.packSizeLimit "100m"
git config --global pack.threads "1"
git config --global pack.window "0"
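
Those commands only touch the local ~/.gitconfig. For the remote side, here is a sketch of what I did, assuming the config commands are run inside the server's bare repository (the path is just an example):

    # on the remote server, inside the bare repository (path is an assumption)
    cd /srv/git/auto_shop.git
    git config pack.windowMemory "100m"
    git config pack.packSizeLimit "100m"
    git config pack.threads "1"
    git config pack.window "0"

    # confirm the values git will actually use
    git config --get-regexp '^pack\.'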

Answer 1 (Score: 0):

I ran into the same problem, and it persisted after trying all of the other solutions. After comparing the config with another repository, I found that this setting fixed it:

git config core.bigfilethreshold 200K

I think it was caused by a large SQL backup file, backup.sql (size: 305M): git was trying to treat it as a text file and compute deltas for it.
After setting git config core.bigfilethreshold 200K, git no longer delta-compresses it.
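
A minimal sketch of how I applied it in the affected repository (the repack step is an extra suggestion of mine, so that objects which are already packed get rewritten under the new threshold):

    git config core.bigFileThreshold 200k   # files above 200 KiB are stored without delta compression
    git repack -a -d                        # rewrite existing packs with the new setting
    git count-objects -v                    # inspect the resulting pack size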

So if settings like pack.windowMemory and pack.packSizeLimit do not work for you, try
git config core.bigfilethreshold 200K