I'm using glium as my OpenGL binding, but I can't get a reliable 60 FPS.
A minimal test case:
#[macro_use]
extern crate glium;
extern crate clock_ticks;

use glium::Surface;
use glium::glutin;

fn main() {
    use glium::DisplayBuild;
    let display = glutin::WindowBuilder::new()
        .build_glium()
        .unwrap();
    let frames = 60 * 5;
    let trials = 3;
    for _ in 0..trials {
        let start_ns = clock_ticks::precise_time_ns();
        for _ in 0..frames {
            display.draw().finish().unwrap();
        }
        let duration_ns = clock_ticks::precise_time_ns() - start_ns;
        let duration_s = (duration_ns as f64) / 1_000_000_000f64;
        let fps = (frames as f64) / duration_s;
        let dropped = (duration_s - (frames as f64 * (1f64 / 60f64))) / (1f64 / 60f64);
        println!("{} frames in {:.6} seconds = {:.3} fps (estimated {:.1} frames dropped)",
                 frames, duration_s, fps, dropped);
    }
}
I expect 60 FPS, but when I run it, it often reports 59 FPS (on OS X). The project is available on GitHub for easy compiling and running.
Is there some way to adjust glium so that it doesn't drop frames? OS X overrides the vsync
setting, so it is not possible to wait on vsync between each frame.
Answer 0 (score: 1)
Yes, like @8bitree, I suspect this is a measurement artifact rather than an actual problem. On my system (Debian):
steve@warmachine:~/tmp/guess$ cargo run
Running `target/debug/guess`
300 frames in 4.427656 seconds = 67.756 fps (estimated -34.3 frames dropped)
300 frames in 0.006892 seconds = 43529.834 fps (estimated -299.6 frames dropped)
300 frames in 0.006522 seconds = 45997.412 fps (estimated -299.6 frames dropped)
steve@warmachine:~/tmp/guess$ cargo run
Running `target/debug/guess`
300 frames in 4.953447 seconds = 60.564 fps (estimated -2.8 frames dropped)
300 frames in 4.999410 seconds = 60.007 fps (estimated -0.0 frames dropped)
300 frames in 1.608712 seconds = 186.485 fps (estimated -203.5 frames dropped)
So, yes, that is a little... strange.
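One way to make the measurement less noisy is to time each frame individually instead of averaging over 300 frames: a single late buffer swap then shows up as one long interval rather than shifting the aggregate FPS. Below is a minimal sketch of that idea using only `std::time::Instant`; the `render` closure is a hypothetical stand-in for the `display.draw().finish()` call in the question (here simulated with a short sleep), not glium itself.

```rust
use std::thread;
use std::time::{Duration, Instant};

// Count frames whose frame-to-frame interval exceeds ~1.5x the vsync budget.
// `render` stands in for the per-frame work (e.g. display.draw().finish()).
fn count_dropped(frames: u32, budget: Duration, render: impl Fn()) -> u32 {
    let mut dropped = 0;
    let mut last = Instant::now();
    for _ in 0..frames {
        render();
        let elapsed = last.elapsed();
        last = Instant::now();
        // More than ~1.5 refresh intervals means a 60 Hz deadline was likely missed.
        if elapsed > budget + budget / 2 {
            dropped += 1;
        }
    }
    dropped
}

fn main() {
    let budget = Duration::from_micros(16_667); // one 60 Hz refresh
    // Simulated render; a real vsynced swap would block until the next refresh.
    let dropped = count_dropped(30, budget, || thread::sleep(Duration::from_millis(2)));
    println!("{} of 30 frames exceeded the 60 Hz budget", dropped);
}
```

Printing (or histogramming) the individual intervals would also distinguish a genuinely dropped frame from vsync simply not being enforced at all, which is what the near-instant 43000+ FPS trials above suggest.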