High-resolution timers with 1-microsecond resolution are system-specific, so you will have to use different methods on different OS platforms. You may be interested in the following article, which implements a cross-platform C++ timer class based on the functions described below:
- [Song Ho Ahn - High Resolution Timer][1]
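If you can use C++11 or later, the standard `<chrono>` header covers the same ground portably, without any platform-specific calls. This is my own minimal sketch, not taken from the article above:

    #include <chrono>
    #include <cstdio>

    int main()
    {
        // steady_clock is monotonic, so it is not affected by system clock adjustments
        auto t1 = std::chrono::steady_clock::now();

        // do something
        // ...

        auto t2 = std::chrono::steady_clock::now();

        // duration<double, std::milli> converts the elapsed ticks to milliseconds
        std::chrono::duration<double, std::milli> elapsedTime = t2 - t1;
        printf("%f ms.\n", elapsedTime.count());
        return 0;
    }

If you need the raw platform APIs (for example, to interoperate with existing code), the sections below show the Windows and Unix approaches.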
Windows
The Windows API provides extremely high-resolution timer functions: QueryPerformanceCounter(), which returns the current elapsed ticks, and QueryPerformanceFrequency(), which returns the number of ticks per second.
Example:
    #include <stdio.h>
    #include <windows.h>                // for Windows APIs

    int main(void)
    {
        LARGE_INTEGER frequency;        // ticks per second
        LARGE_INTEGER t1, t2;           // ticks
        double elapsedTime;

        // get ticks per second
        QueryPerformanceFrequency(&frequency);

        // start timer
        QueryPerformanceCounter(&t1);

        // do something
        // ...

        // stop timer
        QueryPerformanceCounter(&t2);

        // compute and print the elapsed time in millisec
        elapsedTime = (t2.QuadPart - t1.QuadPart) * 1000.0 / frequency.QuadPart;
        printf("%f ms.\n", elapsedTime);

        return 0;
    }
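Since the counter frequency is fixed at system boot, it only needs to be queried once. A small helper like the following can make repeated measurements less verbose; this is my own sketch, and counterMicroseconds() is not a Windows API function:

    #include <windows.h>

    // Hypothetical helper, not part of the Windows API: returns the current
    // performance-counter reading converted to microseconds.
    double counterMicroseconds()
    {
        static LARGE_INTEGER frequency;     // queried once; the frequency is fixed at boot
        if (frequency.QuadPart == 0)
            QueryPerformanceFrequency(&frequency);

        LARGE_INTEGER counter;
        QueryPerformanceCounter(&counter);
        return counter.QuadPart * 1000000.0 / frequency.QuadPart;
    }

Calling it before and after the measured work and subtracting the two values gives the elapsed time in microseconds.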
Linux, Unix, and Mac
On Unix- or Linux-based systems (including macOS), you can use gettimeofday(), which reports time with microsecond resolution. It is declared in <sys/time.h>.
Example:
    #include <stdio.h>
    #include <sys/time.h>               // for gettimeofday()

    int main(void)
    {
        struct timeval t1, t2;
        double elapsedTime;

        // start timer
        gettimeofday(&t1, NULL);

        // do something
        // ...

        // stop timer
        gettimeofday(&t2, NULL);

        // compute and print the elapsed time in millisec
        elapsedTime = (t2.tv_sec - t1.tv_sec) * 1000.0;       // sec to ms
        elapsedTime += (t2.tv_usec - t1.tv_usec) / 1000.0;    // us to ms
        printf("%f ms.\n", elapsedTime);

        return 0;
    }
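One caveat: gettimeofday() follows the wall clock, so the measurement can jump if the system time is adjusted (for example by NTP). On systems that provide the POSIX clock_gettime() function, CLOCK_MONOTONIC avoids this. Here is a minimal sketch, assuming POSIX.1-2001 support (older glibc versions may require linking with -lrt):

    #include <stdio.h>
    #include <time.h>                   // for clock_gettime() and CLOCK_MONOTONIC

    int main(void)
    {
        struct timespec t1, t2;
        double elapsedTime;

        // start timer
        clock_gettime(CLOCK_MONOTONIC, &t1);

        // do something
        // ...

        // stop timer
        clock_gettime(CLOCK_MONOTONIC, &t2);

        // compute and print the elapsed time in millisec
        elapsedTime = (t2.tv_sec - t1.tv_sec) * 1000.0;         // sec to ms
        elapsedTime += (t2.tv_nsec - t1.tv_nsec) / 1000000.0;   // ns to ms
        printf("%f ms.\n", elapsedTime);

        return 0;
    }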