top | item 33802007


clintonwoo | 3 years ago

The article mentions that you can't use FPS as a yardstick for performance if you're already getting "max", but there is a way around this.

You can measure each frame's start and end time to get its duration, average those durations over one second, and compute 1000 / (average duration in ms). This gives you the render speed even if it isn't the actual number of frames displayed.
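A minimal sketch of that measurement in Python. The `render_frame` callback and the one-second window are assumptions for illustration; a real engine would time the GPU submission or swap instead:

```python
import time

def measure_render_speed(render_frame, window_s=1.0):
    """Average frame duration over ~window_s seconds, then derive
    an effective frame rate as 1000 / average duration in ms.
    Works even when vsync caps the number of displayed frames."""
    durations_ms = []
    start = time.perf_counter()
    while time.perf_counter() - start < window_s:
        t0 = time.perf_counter()
        render_frame()  # the work being measured
        t1 = time.perf_counter()
        durations_ms.append((t1 - t0) * 1000.0)
    avg_ms = sum(durations_ms) / len(durations_ms)
    return 1000.0 / avg_ms  # effective frames per second

# Hypothetical renderer taking ~2 ms per frame: the effective rate
# (up to ~500 FPS) is visible even if the display caps at 60 FPS.
fps = measure_render_speed(lambda: time.sleep(0.002), window_s=0.2)
```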


speps | 3 years ago

You're right, it's usually called the frame time, and some engines even measure it per frame using GPU events/markers. However, the main reason not to use FPS is that it's not a linear scale: losing 5 FPS when running at 120 FPS isn't a big deal, but losing 5 when running at 10 FPS is a disaster. Comparing frame rates is useless; comparing frame times is the right way.
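A quick sketch of the non-linearity, using the numbers above (the conversion is just the reciprocal, so these figures are exact):

```python
def frame_time_ms(fps):
    # Frame time in milliseconds for a given frame rate.
    return 1000.0 / fps

# Losing 5 FPS costs a very different amount of frame time
# depending on where you start.
high_cost = frame_time_ms(115) - frame_time_ms(120)  # ~0.36 ms per frame
low_cost = frame_time_ms(5) - frame_time_ms(10)      # 100 ms per frame
```

The same 5 FPS drop costs under half a millisecond per frame at 120 FPS, but a full 100 ms per frame at 10 FPS, which is why frame-time deltas are comparable and FPS deltas are not.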