Finding the time taken by each line of code separately in Python

Date: 2017-06-26 16:57:31

Tags: python

I have a PySpark script as shown below.

#!/usr/bin/env python

from datetime import datetime
from pyspark import SparkContext, SparkConf
from pyspark.sql import HiveContext

conf = SparkConf()
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)

hivedb='MySql'
table='abc_123'

df = sqlContext.table("{}.{}".format(hivedb,table))

# Register the Data Frame as a TempTable
df.registerTempTable('mytempTable')

#Time:
date=datetime.now().strftime('%Y-%m-%d %H:%M:%S')

#Find min value ID:
min_id = sqlContext.sql("select nvl(min(id),0) as minval from mytempTable").collect()[0].asDict()['minval']

sc.stop()

Now I want to find the time taken by each line of code separately, something like the following:

df = sqlContext.table("{}.{}".format(hivedb,table))

Time taken for `df` to create was 10 seconds 

date=datetime.now().strftime('%Y-%m-%d %H:%M:%S')

Time taken for finding `date` was 1 second

min_id = sqlContext.sql("select nvl(min(id),0) as minval from mytempTable").collect()[0].asDict()['minval']

Time taken for `min_id` query to execute was 3 seconds

How can I achieve this?

If possible, I would also like to print these values.
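For reference, output in exactly the format described above can be produced by measuring elapsed wall-clock time around each statement with `time.perf_counter()`. A minimal sketch, where the lambdas are placeholders standing in for the Spark calls in the script (the `timed` helper is hypothetical, not part of any library):

```python
import time

def timed(label, func):
    # Run func, report its elapsed wall-clock time, and return its result
    start = time.perf_counter()
    result = func()
    elapsed = time.perf_counter() - start
    print("Time taken for {} was {:.2f} seconds".format(label, elapsed))
    return result

# Placeholders standing in for the Spark calls, e.g. sqlContext.table(...)
df = timed("`df` to create", lambda: [1, 2, 3])
min_id = timed("`min_id` query to execute", lambda: min([5, 2, 9]))
```

Each statement to be timed is wrapped in a zero-argument callable so the helper controls exactly when it runs.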

1 answer:

Answer 0: (score: 1)

You can use the built-in cProfile module. If you want to visualize the results, you can use Snakeviz.

TL;DR: run the script with `python -m cProfile [-o output_file] [-s sort_order] myscript.py`, install Snakeviz, and then run `snakeviz output_file`.
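cProfile can also be invoked programmatically, which keeps the profiling inside the script itself instead of on the command line. A sketch, where the two step functions are placeholders for the Spark calls in the question:

```python
import cProfile
import io
import pstats
import time

def load_table():
    time.sleep(0.05)  # placeholder for sqlContext.table(...)

def find_min_id():
    time.sleep(0.02)  # placeholder for the min(id) query

profiler = cProfile.Profile()
profiler.enable()
load_table()
find_min_id()
profiler.disable()

# Print the most expensive calls, sorted by cumulative time
buffer = io.StringIO()
stats = pstats.Stats(profiler, stream=buffer)
stats.sort_stats("cumulative").print_stats(10)
print(buffer.getvalue())
```

The per-function rows in the pstats output give the "time taken per step" the question asks for, as long as each logical step is factored into its own function.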