Sunday, August 21, 2005

Some people build their frame rate meter like this (at least this is how I did it the first time):

this.frames = 0;
this.startTime = getTimer();
this.onEnterFrame = function() {
    this.frames++;
    var now = getTimer();
    // time elapsed since the app started, in seconds
    var timeElapsed = (now - this.startTime) * 0.001;
    // find fps and round the result to 2 decimal places
    var fps = Math.round(100 * this.frames / timeElapsed) * 0.01;
    txtFPS.text = "fps: " + fps;
};
stop();

Can you see any problem with the code?

The problem is that it computes the frame rate as the number of frames rendered since the app started, divided by the time that has elapsed since the app started. That only gives you the average fps over the whole run, not the current fps. So if your app has been running for a few hours with a good frame rate and the frame rate suddenly drops for a few seconds, you won't see it, because the sudden drop is "diluted" by the huge elapsed time.
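For example (the numbers here are made up just to illustrate the point): suppose the app has been running for two hours at a steady 30 fps, and then the frame rate drops to 5 fps for 3 seconds.

frames      = 2 * 3600 * 30 + 3 * 5 = 216015
timeElapsed = 2 * 3600 + 3          = 7203
fps         = 216015 / 7203         ≈ 29.99

The meter still reads essentially 30 fps, so the 3-second drop never shows up.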

What you need to do instead is count the frames rendered over roughly the last second:

this.frames = 0;
this.startTime = getTimer();
this.onEnterFrame = function()
{
    this.frames++;
    var now = getTimer();
    var timeElapsed = (now - this.startTime) * 0.001;
    // once a full second has passed, report the fps and reset the counters
    if (timeElapsed >= 1.0)
    {
        var fps = Math.round(this.frames / timeElapsed);
        txtFPS.text = "fps: " + fps;
        this.startTime = getTimer();
        this.frames = 0;
    }
};
