I'm trying to upload a reasonably sized CSV file to Google App Engine using the bulkloader functionality, and it appears to fail partway through with the following output:
[INFO ] Logging to bulkloader-log-20110328.181531
[INFO ] Throttling transfers:
[INFO ] Bandwidth: 250000 bytes/second
[INFO ] HTTP connections: 8/second
[INFO ] Entities inserted/fetched/modified: 20/second
[INFO ] Batch Size: 10
[INFO ] Opening database: bulkloader-progress-20110328.181531.sql3
[INFO ] Connecting to notmyrealappname.appspot.com/_ah/remote_api
[INFO ] Starting import; maximum 10 entities per post
...............................................................[INFO ] Unexpected thread death: WorkerThread-7
[INFO ] An error occurred. Shutting down...
.........[ERROR ] Error in WorkerThread-7: <urlopen error [Errno -2] Name or service not known>
[INFO ] 1740 entites total, 0 previously transferred
[INFO ] 720 entities (472133 bytes) transferred in 32.3 seconds
[INFO ] Some entities not successfully transferred
It uploads roughly 700 of the 19k entries I'm trying to upload, and I'm wondering why it fails. I checked the CSV file for errors, such as extra commas that could throw off the Python csv reader, and non-ASCII characters have already been stripped out.
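The kind of pre-flight check described above can be scripted rather than done by eye. Below is a minimal sketch (the function name `find_bad_rows` and the fixed-column-count assumption are mine, not from the question) that flags rows whose field count differs from the expected schema or that still contain non-ASCII characters:

```python
import csv
import io

def find_bad_rows(text, expected_cols):
    """Return (line_number, reason) pairs for rows that could trip up a loader."""
    bad = []
    for i, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if len(row) != expected_cols:
            bad.append((i, "expected %d fields, got %d" % (expected_cols, len(row))))
        elif any(not field.isascii() for field in row):
            bad.append((i, "non-ascii character"))
    return bad

# Example: row 2 is short one field, row 3 contains a non-ASCII character.
sample = "a,b,c\n1,2\nx,y,\u00e9\n"
print(find_bad_rows(sample, 3))
```

Running this against the real file before invoking the bulkloader helps rule out malformed input as the cause of a mid-transfer failure.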
Answer (score: 6):
Raising the batch limit (batch_size) and the rps limit (rps_limit) worked for me; I used a batch size of 1000 and an rps limit of 500:
appcfg.py upload_data --url= --application= --filename= --email= --batch_size=1000 --rps_limit=500
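For readers unsure how to fill in the blank flags, a sketch of a complete invocation might look like the following. The app ID, URL, file name, and email address here are placeholders of my own, not values from the question:

```shell
# Hypothetical values -- substitute your own app ID, CSV path, and account.
appcfg.py upload_data \
  --url=https://your-app-id.appspot.com/_ah/remote_api \
  --application=your-app-id \
  --filename=data.csv \
  --email=you@example.com \
  --batch_size=1000 \
  --rps_limit=500
```

Larger batches mean fewer HTTP posts overall, and a higher rps limit loosens the default entity-per-second throttle shown in the log above, which is why this combination can get a stalled transfer through.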