Testing an asynchronous Tornado RequestHandler method in a complex environment

Time: 2014-07-22 20:37:34

Tags: python unit-testing testing asynchronous tornado

I am trying to write unit tests for a subclass of tornado.web.RequestHandler that runs an aggregation query against the database. I have burned days trying to get the tests to work. The tests use pytest and factoryboy, and there are test factories for a lot of the important Tornado classes.

This is the class being tested:

class AggregateRequestHandler(StreamlyneRequestHandler):
    '''
    '''

    SUPPORTED_METHODS = (
        "GET", "POST", "OPTIONS")

    def get(self):
        self.aggregate()

    @auth.hmac_auth
    #@tornado.web.asynchronous
    @tornado.web.removeslash
    @tornado.gen.coroutine
    def aggregate(self):
        '''
        '''
        self.logger.info('api aggregate')
        data = self.data
        print("Data: {0}".format(data))

        pipeline = data['pipeline']

        self.logger.debug('pipeline : {0}'.format(pipeline))
        self.logger.debug('utc tz : {0}'.format(tz_util.utc))

        # execute pipeline query
        print(self.collection)
        try:
            cursor_future = self.collection.aggregate(pipeline, cursor={})
            print(cursor_future)
            cursor = yield cursor_future
            print("Cursor: {0}".format(cursor))
        except Exception as e:
            print(e)
        documents = yield cursor.to_list(length=None)

        self.logger.debug('results : {0}'.format(documents))

        # process MongoDB JSON extended
        results = json.loads(json_util.dumps(documents))
        pipeline = json.loads(json_util.dumps(pipeline))

        response_data = {
            'pipeline': pipeline,
            'results': results
        }

        self.respond(response_data)
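For context on the failure described below: `yield cursor_future` suspends the coroutine until the event loop resolves the Future and resumes it, so if the loop that owns the Future never gets to run it, the coroutine stays suspended forever. Here is a minimal sketch of the same mechanics in plain asyncio (no Tornado or Motor required); `aggregate_like` and the string result are made up for illustration:

```python
import asyncio

async def aggregate_like():
    loop = asyncio.get_running_loop()
    cursor_future = loop.create_future()        # stands in for Motor's query Future
    loop.call_soon(cursor_future.set_result, "cursor")  # the loop resolves it later
    cursor = await cursor_future                # suspends here, like `yield cursor_future`
    return cursor

# asyncio.run drives the loop, so the Future resolves and the coroutine resumes
result = asyncio.run(aggregate_like())
print(result)
```

If no loop iteration ever runs the `set_result` callback, the `await` line is exactly where execution appears to stop.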

The method used to test it is as follows:

#@tornado.testing.gen_test
def test_time_inside(self):
    current_time = gen_time()
    past_time = gen_time() - datetime.timedelta(minutes=20)

    test_query = copy.deepcopy(QUERY)
    oid = ObjectId("53a72de12fb05c0788545ed6")
    test_query[0]['$match']['attribute'] = oid
    test_query[0]['$match']['date_created']['$gte'] = past_time
    test_query[0]['$match']['date_created']['$lte'] = current_time

    request = produce.HTTPRequest(
        method="GET",
        headers=produce.HTTPHeaders(
            kwargs = {
                "Content-Type": "application/json",
                "Accept": "application/json",
                "X-Sl-Organization": "test",
                "Hmac": "83275edec557e2a339e0ec624201db604645e1e1",
                "X-Sl-Username": "test@test.co",
                "X-Sl-Expires": 1602011725
            }
        ),
        uri="/api/v1/attribute-data/aggregate?{0}".format(json_util.dumps({
            "pipeline": test_query
        }))
    )

    self.ARH = produce.AggregateRequestHandler(request=request)

    #io_loop = tornado.ioloop.IOLoop.instance()
    self.io_loop.run_sync(self.ARH.get)

    #def stop_test():
        #self.stop()

    #self.ARH.test_get(stop_test)
    #self.wait()

    output = self.ARH.get_written_output()

    assert output == ""
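One caller-side detail worth noting here: `run_sync(self.ARH.get)` can only wait on whatever `get` returns, and `get` calls `self.aggregate()` without returning or yielding the resulting Future, so the driver has nothing to chain on. A rough sketch of that pitfall in plain asyncio (all names are illustrative; note that unlike Tornado's `gen.coroutine`, which eagerly runs up to the first `yield`, a native asyncio coroutine does not start at all until driven):

```python
import asyncio

log = []

async def aggregate():
    # stands in for the real coroutine under test
    log.append("aggregate ran")

def run_sync(func):
    # minimal stand-in for IOLoop.run_sync: it can only wait on what func returns
    result = func()
    if asyncio.iscoroutine(result):
        asyncio.run(result)

def get_without_return():
    # like the real get(): the coroutine is created but never handed back
    aggregate().close()   # close() suppresses the "never awaited" warning

def get_with_return():
    # returning the coroutine lets the driver run it to completion
    return aggregate()

run_sync(get_without_return)
after_first = list(log)    # the coroutine body never executed

run_sync(get_with_return)
after_second = list(log)   # now it ran to completion
print(after_first, after_second)
```

This is why `run_sync` can report the test "done" while the handler's work has barely started.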

This is how I set up the factory for the request handler:

class OutputTestAggregateRequestHandler(slapi.rest.AggregateRequestHandler, tornado.testing.AsyncTestCase):
    '''
    '''

    _written_output = []

    def write(self, chunk):
        print("Previously written: {0}".format(self._written_output))
        print("Len: {0}".format(len(self._written_output)))
        if self._finished:
            raise RuntimeError("Cannot write() after finish().  May be caused "
                               "by using async operations without the "
                               "@asynchronous decorator.")
        if isinstance(chunk, dict):
            print("Going to encode a chunk")
            chunk = escape.json_encode(chunk)
            self.set_header("Content-Type", "application/json; charset=UTF-8")
        chunk = escape.utf8(chunk)
        print("Writing")
        self._written_output = []
        self._written_output.append(chunk)
        print(chunk)

    def flush(self, include_footers=False, callback=None):
        pass

    def get_written_output(self):
        for_return = self._written_output
        self._written_output = []
        return for_return


class AggregateRequestHandler(StreamlyneRequestHandler):
    '''
    '''

    class Meta:
        model = OutputTestAggregateRequestHandler

    model = slapi.model.AttributeDatum

When running the test, the test just stops in def aggregate(self): between print(cursor_future) and print("Cursor: {0}".format(cursor)).

In stdout you see:

MotorCollection(Collection(Database(MongoClient([]), u'test'), u'attribute_datum'))
<tornado.concurrent.Future object at 0x7fbc737993d0>

and nothing else comes out until after the test fails:

>       assert output == ""
E       AssertionError: assert [] == ''

After a lot of time spent on documentation, examples, and Stack Overflow, I managed to get a test running properly by adding the following code to OutputTestAggregateRequestHandler:

def set_io_loop(self):
    self.io_loop = tornado.ioloop.IOLoop.instance()


def ioloop(f):
    @functools.wraps(f)
    def wrapper(self, *args, **kwargs):
        print(args)
        self.set_io_loop()
        return f(self, *args, **kwargs)
    return wrapper


def runTest(self):
    pass

and then copying all of the code from AggregateRequestHandler.aggregate into OutputTestAggregateRequestHandler, but with different decorators:

@ioloop
@tornado.testing.gen_test
def _aggregate(self):
    ...

I then received the output:

>       assert output == ""
E       AssertionError: assert ['{\n    "pipeline": [\n        {\n            "$match": {\n                "attribute": {\n                    "$oid"...                "$oid": "53cec0e72dc9832c4c4185f2"\n            }, \n            "quality": 9001\n        }\n    ]\n}'] == ''

which was actually successful; I only triggered the assertion error deliberately in order to see the output.
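A plausible reason the IOLoop.instance() workaround matters: Motor registered its futures and callbacks against the global IOLoop.instance(), while AsyncTestCase gives each test its own freshly created loop, and a future can only be driven by the loop it belongs to. asyncio enforces the same constraint explicitly, as this Tornado-free sketch shows (the two loops stand in for the global instance and the test's loop):

```python
import asyncio

async def wait_on(fut):
    return await fut

loop_a = asyncio.new_event_loop()
pending = loop_a.create_future()    # future bound to loop_a, never resolved

loop_b = asyncio.new_event_loop()   # a second loop, like AsyncTestCase's fresh io_loop
error_message = ""
try:
    # awaiting loop_a's future from loop_b is rejected outright
    loop_b.run_until_complete(wait_on(pending))
except RuntimeError as err:
    error_message = str(err)        # "... attached to a different loop"
finally:
    loop_a.close()
    loop_b.close()

print(error_message)
```

Tornado's IOLoop does not raise such an error; a future belonging to a loop that is not running simply never resolves, which matches the silent hang observed above.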

The biggest problem I am having is how to achieve the desired result, namely the output I obtained by adding extra code and copying the aggregate method. Obviously, once the code is copied out of the aggregate method, the test stops being useful as soon as the real method changes. How can I get the actual aggregate method to run properly in a test, instead of apparently stopping as soon as it hits asynchronous code?

Thanks for any help,

Cheers! -Liam

1 answer:

Answer 0 (score: 0)

In general, the expected way to test RequestHandlers is with AsyncHTTPTestCase, not AsyncTestCase. This will set up the HTTP client and server for you, and everything will go through the HTTP plumbing. Using RequestHandlers outside of an Application and HTTP server is not fully supported, although in Tornado 4.0 it may be possible to use a dummy HTTPConnection to avoid the whole server stack. That might be faster, although it is somewhat uncharted territory at this point.
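A sketch of the suggested approach, assuming Tornado is installed; `PingHandler` and the `/ping` route are made up for illustration, but `AsyncHTTPTestCase`, `get_app`, and `fetch` are the real Tornado testing API:

```python
import unittest

from tornado import testing, web

class PingHandler(web.RequestHandler):
    def get(self):
        # writing a dict makes Tornado encode it as JSON
        self.write({"ping": "pong"})

class PingTest(testing.AsyncHTTPTestCase):
    def get_app(self):
        # AsyncHTTPTestCase asks for the Application under test;
        # it binds it to an unused port and starts a real HTTP server
        return web.Application([(r"/ping", PingHandler)])

    def test_ping(self):
        # fetch() runs the IOLoop until the full HTTP response arrives,
        # so coroutine handlers finish before the assertions run
        response = self.fetch("/ping")
        assert response.code == 200
        assert b"pong" in response.body

suite = unittest.defaultTestLoader.loadTestsFromTestCase(PingTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("passed:", result.wasSuccessful())
```

Because the request travels through the real server stack and `fetch` blocks on the test's own IOLoop, there is no need to stub `write`/`flush` or to copy handler code into the test class.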