Avoid reading out the blob

Time: 2013-04-25 15:38:49

Tags: python google-app-engine blob

I am currently trying to take the data stored in a blob, process it, and then send it out by email. Right now I am getting the following error; my guess is that it has to do with the size of the blob I am reading into memory, because it only happens with larger blobs:

    Traceback (most recent call last):
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1535, in __call__
        rv = self.handle_exception(request, response, e)
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1529, in __call__
        rv = self.router.dispatch(request, response)
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1278, in default_dispatcher
        return route.handler_adapter(request, response)
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1102, in __call__
        return handler.dispatch()
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 572, in dispatch
        return self.handle_exception(e, self.app.debug)
      File "/python27_runtime/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 570, in dispatch
        return method(*args, **kwargs)
      File "/base/data/home/apps/xxxx/main.py", line 154, in post
        db.run_in_transaction(filtering)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/api/datastore.py", line 2461, in RunInTransaction
        return RunInTransactionOptions(None, function, *args, **kwargs)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/api/datastore.py", line 2599, in RunInTransactionOptions
        ok, result = _DoOneTry(new_connection, function, args, kwargs)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/api/datastore.py", line 2621, in _DoOneTry
        result = function(*args, **kwargs)
      File "/base/data/home/apps/xxxx/main.py", line 128, in filtering
        for k in liwkws:
      File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 2326, in next
        return self.__model_class.from_entity(self.__iterator.next())
      File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_query.py", line 2892, in next
        next_batch = self.__batcher.next()
      File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_query.py", line 2754, in next
        return self.next_batch(self.AT_LEAST_ONE)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_query.py", line 2791, in next_batch
        batch = self.__next_batch.get_result()
      File "/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 604, in get_result
        return self.__get_result_hook(self)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_query.py", line 2528, in __query_result_hook
        self._batch_shared.conn.check_rpc_success(rpc)
      File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_rpc.py", line 1224, in check_rpc_success
        raise _ToDatastoreError(err)
    BadRequestError: invalid handle: 16023202376614806719

Here is the code:

    # Reads in all the variables from the former process
    filter_name = self.request.get('filter_name')
    user = self.request.get('user')
    lowkey = self.request.get('lowkey')

    def filtering():
        # This is where I read the blob into memory
        low = blobstore.BlobReader(lowkey).read()
        liwkws = db.GqlQuery("SELECT * FROM FilterList WHERE ANCESTOR IS :1",
                             filter_key(filter_name))

        # Preparing the data for processing
        low = unicode(low, 'utf8').encode('utf-8').replace('\r', '').split('\n')

        for j in range(len(low)):
            for k in liwkws:
                if k.newkey.encode('utf-8').lower() in low[j].lower():
                    low[j] = 'delete'

        cuent = low.count('delete')
        for i in range(cuent):
            low.remove('delete')

        output_buffer = StringIO.StringIO()
        csv_output = csv.writer(output_buffer, delimiter=",")
        for i in low:
            csv_output.writerow([i])
        result = output_buffer.getvalue()
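
For reference, this is roughly what I have in mind by not reading the blob out in one go: a rough, untested sketch that streams the blob line by line through BlobReader (which is file-like, so it can be iterated) instead of pulling everything in with `.read()`:

    # Untested sketch: stream the blob line by line instead of .read()-ing it all.
    reader = blobstore.BlobReader(lowkey)
    for line in reader:                          # BlobReader is file-like and iterable
        line = unicode(line, 'utf-8').rstrip('\r\n')
        # ... filter the line here instead of building the full 'low' list ...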

Any ideas? The blob in this case is not even that big (3 MB), and even if I don't read it in directly, I will still have to keep a list of the same size in memory.
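
One other thing I am unsure about: as far as I understand, iterating the GqlQuery object inside the outer loop runs the query again for every line of the blob. A rough, untested sketch of fetching the keywords once up front instead (assuming there are fewer than 1000 FilterList entities) would look like this:

    # Untested sketch: fetch the filter keywords once, before the per-line loop,
    # instead of iterating the GqlQuery object on every pass.
    liwkws = db.GqlQuery("SELECT * FROM FilterList WHERE ANCESTOR IS :1",
                         filter_key(filter_name))
    keywords = [k.newkey.encode('utf-8').lower()
                for k in liwkws.fetch(1000)]     # assumes < 1000 FilterList entities

    for j in range(len(low)):
        if any(kw in low[j].lower() for kw in keywords):
            low[j] = 'delete'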

0 answers:

No answers