I have a DateTime value that I generate like this:
DateTime myDateTime = DateTime.Now;
I then store it in the database (in a datetime-typed column) with Entity Framework, and later retrieve it with OData (WCF Data Services).
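For context, here is roughly how the save looks, as a minimal sketch assuming EF6; the LogEntry and LogContext types are hypothetical stand-ins for my actual model:

using System;
using System.Data.Entity;

// Hypothetical entity and context, only to illustrate the save path.
public class LogEntry
{
    public int Id { get; set; }
    public DateTime Timestamp { get; set; }   // mapped to a datetime column
}

public class LogContext : DbContext
{
    public DbSet<LogEntry> LogEntries { get; set; }
}

class Program
{
    static void Main()
    {
        DateTime myDateTime = DateTime.Now;

        using (var db = new LogContext())
        {
            db.LogEntries.Add(new LogEntry { Timestamp = myDateTime });
            db.SaveChanges();   // the value is handed off to the database here
        }
    }
}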
When it goes in the TimeOfDay value is: 09:30:03.0196095
When it comes out the TimeOfDay value is: 09:30:03.0200000
The net effect is that the Milliseconds value reads 19 before the save and 20 after the reload.
So when I compare the two values later in my code, the equality check fails where it should pass.
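To make the failing comparison concrete, here is a minimal standalone sketch that just reproduces the two observed values (the date part is arbitrary):

using System;
using System.Globalization;

class CompareDemo
{
    static void Main()
    {
        // The value as generated in .NET (ticks give 100 ns resolution).
        DateTime original = DateTime.Parse(
            "2021-06-01 09:30:03.0196095", CultureInfo.InvariantCulture);

        // The value as it comes back from the round-trip.
        DateTime roundTripped = DateTime.Parse(
            "2021-06-01 09:30:03.0200000", CultureInfo.InvariantCulture);

        // Prints False: the underlying Ticks differ, so equality fails.
        Console.WriteLine(original == roundTripped);
    }
}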
Does SQL Server not have as much precision as .NET? Or is it Entity Framework or OData that is messing this up?
I will just truncate the milliseconds (I don't really need them), but I would like to know why this is happening.
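For the truncation, here is a minimal sketch of the kind of thing I mean, dropping everything below whole seconds via Ticks:

using System;

class TruncateDemo
{
    static void Main()
    {
        DateTime myDateTime = DateTime.Now;

        // Zero out the sub-second ticks so the value survives the
        // store/reload round-trip unchanged.
        DateTime truncated = new DateTime(
            myDateTime.Ticks - (myDateTime.Ticks % TimeSpan.TicksPerSecond),
            myDateTime.Kind);

        Console.WriteLine(truncated.TimeOfDay);
    }
}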