I finally figured it out, and I'd like to share the knowledge and save people some time, so see my answer below. However, I still need an answer for Linux, so if you know it, please answer, since the code in my answer below works only on Windows.
UPDATE: I've figured out Linux too, including pre-Python 3.3 (ex: for the Raspberry Pi), and I've posted my new module/code in my answer below.
My original question: How do I get millisecond- and microsecond-resolution timestamps in Python? I'd also like Arduino-like delay() and delayMicroseconds() functions.
A few months ago, this question was marked as a duplicate of this one. See here:
It says, "This question already has answers here." Unfortunately, that's not true. I read those answers years ago, before asking this question, and they don't answer my question or meet my needs. They are just as inapplicable to my question as the most downvoted answer here (the grayed-out one), because they all rely on the time module, which prior to Python 3.3 did not have any kind of guaranteed resolution whatsoever.
Please re-open my question. It is not a duplicate, and it was not already answered by another question. The question it was linked to as already containing an answer relies on the time module, and even states that its resolutions are all over the place. The most upvoted answer there quotes a Windows resolution of 16 ms, which is 32000 times worse than the answer I provide here (0.5 us resolution). Again, I needed 1 ms and 1 us (or similar) resolutions, NOT 16000 us resolution. Therefore, it is not a duplicate.
Thanks for your time. :)
Answer 0 (score: 14)
For Windows: here's a fully-functional module for both Linux (works with pre-Python 3.3 too) and Windows:
Functions and code samples. Functions include: millis(), micros(), delay(delay_ms), and delayMicroseconds(delay_us).
Python code module:
"""
GS_timing.py
-create some low-level Arduino-like millis() (milliseconds) and micros()
(microseconds) timing functions for Python
By Gabriel Staples
http://www.ElectricRCAircraftGuy.com
-click "Contact me" at the top of my website to find my email address
Started: 11 July 2016
Updated: 13 Aug 2016
History (newest on top):
20160813 - v0.2.0 created - added Linux compatibility, using ctypes, so that it's compatible with pre-Python 3.3 (for Python 3.3 or later just use the built-in time functions for Linux, shown here: https://docs.python.org/3/library/time.html)
-ex: time.clock_gettime(time.CLOCK_MONOTONIC_RAW)
20160711 - v0.1.0 created - functions work for Windows *only* (via the QPC timer)
References:
WINDOWS:
-personal (C++ code): GS_PCArduino.h
1) Acquiring high-resolution time stamps (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx
2) QueryPerformanceCounter function (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644904(v=vs.85).aspx
3) QueryPerformanceFrequency function (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644905(v=vs.85).aspx
4) LARGE_INTEGER union (Windows)
-https://msdn.microsoft.com/en-us/library/windows/desktop/aa383713(v=vs.85).aspx
-*****https://stackoverflow.com/questions/4430227/python-on-win32-how-to-get-absolute-timing-cpu-cycle-count
LINUX:
-https://stackoverflow.com/questions/1205722/how-do-i-get-monotonic-time-durations-in-python
"""
import ctypes, os
#Constants:
VERSION = '0.2.0'
#-------------------------------------------------------------------
#FUNCTIONS:
#-------------------------------------------------------------------
#OS-specific low-level timing functions:
if (os.name=='nt'): #for Windows:
    def micros():
        "return a timestamp in microseconds (us)"
        tics = ctypes.c_int64()
        freq = ctypes.c_int64()
        #get ticks on the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))
        #get the actual freq. of the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))
        t_us = tics.value*1e6/freq.value
        return t_us

    def millis():
        "return a timestamp in milliseconds (ms)"
        tics = ctypes.c_int64()
        freq = ctypes.c_int64()
        #get ticks on the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))
        #get the actual freq. of the internal ~2MHz QPC clock
        ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))
        t_ms = tics.value*1e3/freq.value
        return t_ms

elif (os.name=='posix'): #for Linux:
    #Constants:
    CLOCK_MONOTONIC_RAW = 4 # see <linux/time.h> here: https://github.com/torvalds/linux/blob/master/include/uapi/linux/time.h

    #prepare ctype timespec structure of {long, long}
    class timespec(ctypes.Structure):
        _fields_ =\
        [
            ('tv_sec', ctypes.c_long),
            ('tv_nsec', ctypes.c_long)
        ]

    #Configure Python access to the clock_gettime C library, via ctypes:
    #Documentation:
    #-ctypes.CDLL: https://docs.python.org/3.2/library/ctypes.html
    #-librt.so.1 with clock_gettime: https://docs.oracle.com/cd/E36784_01/html/E36873/librt-3lib.html
    #-Linux clock_gettime(): http://linux.die.net/man/3/clock_gettime
    librt = ctypes.CDLL('librt.so.1', use_errno=True)
    clock_gettime = librt.clock_gettime
    #specify input arguments and types to the C clock_gettime() function
    # (int clock_ID, timespec* t)
    clock_gettime.argtypes = [ctypes.c_int, ctypes.POINTER(timespec)]

    def monotonic_time():
        "return a timestamp in seconds (sec)"
        t = timespec()
        #(Note that clock_gettime() returns 0 for success, or -1 for failure, in
        # which case errno is set appropriately)
        #-see here: http://linux.die.net/man/3/clock_gettime
        if clock_gettime(CLOCK_MONOTONIC_RAW, ctypes.pointer(t)) != 0:
            #if clock_gettime() returns an error
            errno_ = ctypes.get_errno()
            raise OSError(errno_, os.strerror(errno_))
        return t.tv_sec + t.tv_nsec*1e-9 #sec

    def micros():
        "return a timestamp in microseconds (us)"
        return monotonic_time()*1e6 #us

    def millis():
        "return a timestamp in milliseconds (ms)"
        return monotonic_time()*1e3 #ms

#Other timing functions:
def delay(delay_ms):
    "delay for delay_ms milliseconds (ms)"
    t_start = millis()
    while (millis() - t_start < delay_ms):
        pass #do nothing
    return

def delayMicroseconds(delay_us):
    "delay for delay_us microseconds (us)"
    t_start = micros()
    while (micros() - t_start < delay_us):
        pass #do nothing
    return
#-------------------------------------------------------------------
#EXAMPLES:
#-------------------------------------------------------------------
#Only execute this block of code if running this module directly,
#*not* if importing it
#-see here: http://effbot.org/pyfaq/tutor-what-is-if-name-main-for.htm
if __name__ == "__main__": #if running this module as a stand-alone program
    #print loop execution time 100 times, using micros()
    tStart = micros() #us
    for x in range(0, 100):
        tNow = micros() #us
        dt = tNow - tStart #us; delta time
        tStart = tNow #us; update
        print("dt(us) = " + str(dt))

    #print loop execution time 100 times, using millis()
    print("\n")
    tStart = millis() #ms
    for x in range(0, 100):
        tNow = millis() #ms
        dt = tNow - tStart #ms; delta time
        tStart = tNow #ms; update
        print("dt(ms) = " + str(dt))

    #print a counter once per second, for 5 seconds, using delay
    print("\nstart")
    for i in range(1,6):
        delay(1000)
        print(i)

    #print a counter once per second, for 5 seconds, using delayMicroseconds
    print("\nstart")
    for i in range(1,6):
        delayMicroseconds(1000000)
        print(i)
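As a quick usage sketch of the module from another script (assuming GS_timing.py is saved somewhere on your Python path), the calls look like this:

import GS_timing as timing

t_start = timing.micros() #us; take a start timestamp
timing.delay(100)         #example workload: busy-wait for 100 ms
t_end = timing.micros()   #us; take an end timestamp
print('elapsed (us) = ' + str(t_end - t_start))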
If you know how to get the above millisecond- and microsecond-resolution timestamps in Linux, please post, as that would be very helpful too.
UPDATE FOR LINUX: this works for Linux too, including pre-Python 3.3, since I'm using C functions via the ctypes module to read the timestamps.
(Note: the code above was originally posted here: http://www.electricrcaircraftguy.com/2016/07/arduino-like-millisecond-and-microsecond-timestamps-in-python.html)
Special thanks to @ArminRonacher for his brilliant pre-Python 3.3 Linux answer here: https://stackoverflow.com/a/1205762/4561887
Update: prior to Python 3.3, the built-in Python time library (https://docs.python.org/3.5/library/time.html) didn't have any explicitly high-resolution functions. Now, however, it does provide other options, including some high-resolution functions.
My module above, however, provides high-resolution timestamps for Python code both before and after Python 3.3, and on both Linux and Windows.
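For reference, here is a minimal sketch of those built-in options on Python 3.3+ (standard-library functions only; note that time.clock_gettime() and time.CLOCK_MONOTONIC_RAW are Unix-only, so that line is commented out):

import time

#Python 3.3+ monotonic, high-resolution clocks from the standard library:
t_sec = time.perf_counter()  #highest-available-resolution clock, in floating-point seconds
t_us = t_sec*1e6             #us; convert to microseconds
t_ms = time.monotonic()*1e3  #ms; monotonic clock, immune to system clock adjustments
#Unix-only (ex: Linux/Raspberry Pi): read CLOCK_MONOTONIC_RAW directly:
#t_raw_sec = time.clock_gettime(time.CLOCK_MONOTONIC_RAW)
print("t_us = " + str(t_us) + ", t_ms = " + str(t_ms))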
Here's an example demonstrating that the time.sleep() function is NOT necessarily a high-resolution function: on my Windows machine, its resolution is perhaps 8 ms at best, whereas my module above has 0.5 us resolution (16000 times better!) on the same machine.
Code demonstration:
import time
import GS_timing as timing
def delayMicroseconds(n):
    time.sleep(n / 1000000.)

def delayMillisecond(n):
    time.sleep(n / 1000.)

t_start = 0
t_end = 0

#using time.sleep
print('using time.sleep')
print('delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() #us
    delayMicroseconds(1)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
print('delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() #us
    delayMicroseconds(2000)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))

#using GS_timing
print('\nusing GS_timing')
print('timing.delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() #us
    timing.delayMicroseconds(1)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
print('timing.delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() #us
    timing.delayMicroseconds(2000)
    t_end = timing.micros() #us
    print('dt (us) = ' + str(t_end - t_start))
SAMPLE RESULTS ON MY WINDOWS 8.1 MACHINE (notice how much worse time.sleep does):
using time.sleep
delayMicroseconds(1)
dt (us) = 2872.059814453125
dt (us) = 886.3939208984375
dt (us) = 770.4649658203125
dt (us) = 1138.7698974609375
dt (us) = 1426.027099609375
dt (us) = 734.557861328125
dt (us) = 10617.233642578125
dt (us) = 9594.90576171875
dt (us) = 9155.299560546875
dt (us) = 9520.526611328125
delayMicroseconds(2000)
dt (us) = 8799.3056640625
dt (us) = 9609.2685546875
dt (us) = 9679.5439453125
dt (us) = 9248.145263671875
dt (us) = 9389.721923828125
dt (us) = 9637.994262695312
dt (us) = 9616.450073242188
dt (us) = 9592.853881835938
dt (us) = 9465.639892578125
dt (us) = 7650.276611328125
using GS_timing
timing.delayMicroseconds(1)
dt (us) = 53.3477783203125
dt (us) = 36.93310546875
dt (us) = 36.9329833984375
dt (us) = 34.8812255859375
dt (us) = 35.3941650390625
dt (us) = 40.010986328125
dt (us) = 38.4720458984375
dt (us) = 56.425537109375
dt (us) = 35.9072265625
dt (us) = 36.420166015625
timing.delayMicroseconds(2000)
dt (us) = 2039.526611328125
dt (us) = 2046.195068359375
dt (us) = 2033.8841552734375
dt (us) = 2037.4747314453125
dt (us) = 2032.34521484375
dt (us) = 2086.2059326171875
dt (us) = 2035.4229736328125
dt (us) = 2051.32470703125
dt (us) = 2040.03955078125
dt (us) = 2027.215576171875
SAMPLE RESULTS ON MY RASPBERRY PI VERSION 1 B+ (notice that the results between using time.sleep and my module are basically the same...apparently the low-level functions in time are already accessing better-resolution timers here, since it's a Linux machine, running Raspbian...but in my GS_timing module I am explicitly calling the CLOCK_MONOTONIC_RAW timer. Who knows which clock is being used otherwise? On Python 3.3+ you can check directly; see the short introspection sketch after these results.):
using time.sleep
delayMicroseconds(1)
dt (us) = 1022.0
dt (us) = 417.0
dt (us) = 407.0
dt (us) = 450.0
dt (us) = 2078.0
dt (us) = 393.0
dt (us) = 1297.0
dt (us) = 878.0
dt (us) = 1135.0
dt (us) = 2896.0
delayMicroseconds(2000)
dt (us) = 2746.0
dt (us) = 2568.0
dt (us) = 2512.0
dt (us) = 2423.0
dt (us) = 2454.0
dt (us) = 2608.0
dt (us) = 2518.0
dt (us) = 2569.0
dt (us) = 2548.0
dt (us) = 2496.0
using GS_timing
timing.delayMicroseconds(1)
dt (us) = 572.0
dt (us) = 673.0
dt (us) = 1084.0
dt (us) = 561.0
dt (us) = 728.0
dt (us) = 576.0
dt (us) = 556.0
dt (us) = 584.0
dt (us) = 576.0
dt (us) = 578.0
timing.delayMicroseconds(2000)
dt (us) = 2741.0
dt (us) = 2466.0
dt (us) = 2522.0
dt (us) = 2810.0
dt (us) = 2589.0
dt (us) = 2681.0
dt (us) = 2546.0
dt (us) = 3090.0
dt (us) = 2600.0
dt (us) = 2400.0
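As a follow-up to the "who knows which clock is being used" question above: on Python 3.3+ the standard library can report which underlying clock implementation and resolution each of its timing functions uses. A minimal sketch (standard-library calls only; time.clock_getres() is Unix-only, hence commented out):

import time

#Python 3.3+: inspect the implementation and resolution of the standard clocks
for clock_name in ['time', 'monotonic', 'perf_counter']:
    info = time.get_clock_info(clock_name)
    print(clock_name + ' -> ' + info.implementation +
          ', resolution (s) = ' + str(info.resolution))
#Unix-only: query the resolution of CLOCK_MONOTONIC_RAW itself:
#print(time.clock_getres(time.CLOCK_MONOTONIC_RAW))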
Answer 1 (score: -4)
import time

def delayMicroseconds(n):
    "delay for n microseconds (us), via the built-in time.sleep()"
    time.sleep(n / 1000000.)

def delayMillisecond(n):
    "delay for n milliseconds (ms), via the built-in time.sleep()"
    time.sleep(n / 1000.)