Why would DateTime.Now be made less precise than what most CPU clocks could handle?
A good clock should be both precise and accurate; those are different things. As the old joke goes, a stopped clock is exactly accurate twice a day, but a clock a minute slow is never accurate at any time. Yet the clock a minute slow is always precise to the nearest minute, whereas a stopped clock has no useful precision at all.
Why should DateTime be precise to, say, a microsecond when it cannot possibly be accurate to the microsecond? Most people do not have any source of official time signals that is accurate to the microsecond. Therefore, giving six digits of precision after the decimal place, when the last five of them are garbage, would be lying.
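You can see this quantization directly by sampling the clock in a tight loop. Here's a minimal sketch; the jump sizes it prints are an observation, not a guarantee, since the actual granularity varies by operating system and runtime version (historically around 15.6 ms on Windows, finer on modern .NET):

    using System;

    class ClockResolution
    {
        static void Main()
        {
            long last = DateTime.UtcNow.Ticks;   // 1 tick = 100 ns
            for (int changes = 0; changes < 5; )
            {
                long now = DateTime.UtcNow.Ticks;
                if (now != last)
                {
                    // Print the size of each observed jump in milliseconds.
                    Console.WriteLine($"clock advanced by {(now - last) / 10_000.0} ms");
                    last = now;
                    changes++;
                }
            }
        }
    }

Even though Ticks is expressed in 100-nanosecond units, the value advances in much larger steps; the extra digits of apparent precision are exactly the "garbage" described above.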
Remember, the purpose of DateTime is to represent a date and time. High-precision timing is not at all the purpose of DateTime; as you note, that's the purpose of Stopwatch. The purpose of DateTime is to represent a date and time for purposes like displaying the current time to the user, computing the number of days until next Tuesday, and so on.
In short, "what time is it?" and "how long did that take?" are completely different questions; don't use a tool designed to answer one question to answer the other.
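Concretely, that looks something like the sketch below. DoWork is a hypothetical placeholder for whatever operation you want to measure:

    using System;
    using System.Diagnostics;

    class RightTool
    {
        static void DoWork()
        {
            System.Threading.Thread.Sleep(100);  // stand-in workload
        }

        static void Main()
        {
            // "What time is it?" -- DateTime is the right tool.
            Console.WriteLine($"Started at {DateTime.Now:T}");

            // Date arithmetic is also DateTime's job,
            // e.g. days until next Tuesday.
            int days = ((int)DayOfWeek.Tuesday
                        - (int)DateTime.Now.DayOfWeek + 7) % 7;
            if (days == 0) days = 7;  // "next" Tuesday, not today
            Console.WriteLine($"Next Tuesday is {days} day(s) away");

            // "How long did that take?" -- Stopwatch is the right tool.
            var sw = Stopwatch.StartNew();
            DoWork();
            sw.Stop();
            Console.WriteLine($"DoWork took {sw.ElapsedMilliseconds} ms");
        }
    }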
Thanks for the question; this will make a good blog article! :-)