Note: I've seen https://stackoverflow.com/a/64315882/21728 and understand that time is not necessarily that precise. However, I'm seeing a roughly 4× difference between the reported time and the time the command actually took, and I'd like to understand what's causing it on macOS; that's the point of this question.
I'm trying to compare two ways of running a binary, and they report very similar time info:
$ time ../../../node_modules/.bin/quicktype --version
quicktype version 15.0.214
Visit quicktype.io for more info.
../../../node_modules/.bin/quicktype --version 0.46s user 0.06s system 110% cpu 0.474 total
$ time $(yarn bin quicktype) --version
quicktype version 15.0.214
Visit quicktype.io for more info.
$(yarn bin quicktype) --version 0.44s user 0.06s system 110% cpu 0.449 total
However, the latter feels much slower, so I added timestamps before and after:
$ date +"%T.%3N" && time $(yarn bin quicktype) --version && date +"%T.%3N"
15:11:09.667
quicktype version 15.0.214
Visit quicktype.io for more info.
$(yarn bin quicktype) --version 0.49s user 0.06s system 108% cpu 0.513 total
15:11:11.400
Indeed, the difference between 15:11:09.667 and 15:11:11.400 is almost two seconds, yet time reports only about 0.5 seconds. What explains this rather vast difference?
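My working hypothesis (an assumption on my part, not confirmed): the command substitution $(yarn bin quicktype) is expanded before the timed command starts, so the cost of running yarn bin lands in wall-clock time but not in the shell's time report. A minimal sketch of that pattern, using sleep 1; echo true as a stand-in for the slow lookup:

```shell
#!/bin/bash
# Sketch (assumption, not confirmed): the substitution runs during word
# expansion; `sleep 1; echo true` stands in for the slow `yarn bin` call.
start=$(date +%s)
time $(sleep 1; echo true)   # under zsh, `time` would cover only `true` here
end=$(date +%s)
echo "wall-clock: $((end - start))s"   # ~1s elapsed either way
```

If the hypothesis holds, zsh's time line would show near-zero totals while the bracketing timestamps show about a second, matching the pattern in my transcript above.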
question from:
https://stackoverflow.com/questions/66064977/what-explains-the-inaccuracy-of-time-on-macos