
javascript - setInterval delays not accurate

I'm currently creating a countdown using setInterval, but at the moment it runs slower than it should. According to MDN, the delay parameter is in milliseconds, yet it doesn't appear to be accurate.

I compared my countdown to the one on my phone and the phone runs nearly 5 times faster.

    var count = setInterval( function() {
            if (iMil == 0) {
                if (iS == 0) {
                    if (iMin == 0) {
                        if (iH == 0) {
                            // DONE
                        } else {
                            iH--;
                            iMin = 59;
                            iS = 59;
                            iMil = 999;
                        }
                    } else {
                        iMin--;
                        iS = 59;
                        iMil = 999;
                    }
                } else {
                    iS--;
                    iMil = 999;
                }
            } else {
                iMil--;
            }
            hours.text(iH);
            minutes.text(iMin);
            seconds.text(iS);
            milliseconds.text(iMil);
        }, 1 );

This is the main part of my script. The variables hours, minutes, seconds and milliseconds are jQuery objects for the corresponding display elements.

What I'm getting at is: is there a reason it runs slower than it is supposed to?

1 Reply


setInterval() is not guaranteed to run perfectly on time in JavaScript. That's partly because JavaScript is single-threaded, so your callback has to wait for whatever else the event loop is doing, and partly because browsers clamp and defer timers. If you want to display a time with setInterval(), then get the current time on each timer tick and display that. setInterval() won't be your timer, just a recurring screen-update mechanism. Your time display will always be accurate if you do it that way.
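
Here is a minimal sketch of that approach (not the asker's actual code): endTime is a hypothetical target timestamp, and hours, minutes, seconds and milliseconds are assumed to be the same jQuery objects as in the question. The remaining time is recomputed from the real clock on every tick, so the display stays correct no matter how irregularly the callback fires.

    // Hypothetical target: 2 hours from now
    var endTime = Date.now() + 2 * 60 * 60 * 1000;

    var count = setInterval(function () {
        // Real time left, in milliseconds, regardless of how late this tick is
        var remaining = Math.max(0, endTime - Date.now());

        var iH   = Math.floor(remaining / 3600000);
        var iMin = Math.floor(remaining / 60000) % 60;
        var iS   = Math.floor(remaining / 1000) % 60;
        var iMil = remaining % 1000;

        hours.text(iH);
        minutes.text(iMin);
        seconds.text(iS);
        milliseconds.text(iMil);

        if (remaining === 0) {
            clearInterval(count); // DONE
        }
    }, 20); // the delay only controls how often the display repaints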

In addition, no browser will guarantee that your interval callback runs every 1 ms. In fact, many browsers will never call setInterval() more often than every 5 ms, and some enforce even longer minimums. Plus, if other events are firing in the browser and other code is responding to them, the setInterval() call can be delayed even longer. The HTML5 spec proposes 4 ms as the shortest interval for setTimeout() and 10 ms as the shortest interval for setInterval(), but allows the implementor to use longer minimum times if desired.
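
You can see the clamping for yourself with a quick probe (an illustrative snippet, not code from the question): ask for a 1 ms interval and log how much real time actually passes between ticks.

    var last = performance.now();
    var ticks = 0;
    var probe = setInterval(function () {
        var now = performance.now();
        // Typically several milliseconds per tick, even though 1 ms was requested
        console.log('tick after ' + (now - last).toFixed(2) + ' ms');
        last = now;
        if (++ticks >= 10) {
            clearInterval(probe); // stop after a handful of samples
        }
    }, 1);

Expect gaps of several milliseconds even on an idle page, and much larger ones once the tab is in the background, where browsers throttle timers further.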

In fact, if you look at this draft spec for timers, step 5 of the algorithm says:

If timeout is less than 10, then increase timeout to 10.

And, step 8 says this:

Optionally, wait a further user-agent defined length of time.

And, it includes this note:

This is intended to allow user agents to pad timeouts as needed to optimise the power usage of the device. For example, some processors have a low-power mode where the granularity of timers is reduced; on such platforms, user agents can slow timers down to fit this schedule instead of requiring the processor to use the more accurate mode with its associated higher power usage.

