Google's documentation says:
The local time of a given location is the sum of the timestamp parameter, and the dstOffset and rawOffset fields from the result.
timestamp + dstOffset + rawOffset = local time in seconds in UTC
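As a minimal sketch of that formula in JavaScript (toShiftedLocalDate is my own name for it; dstOffset and rawOffset are the fields from the API response):

    // Shift the UTC epoch by both offsets so the numeric fields read as local time.
    // The result must be read with the getUTC*/toGMTString accessors, because the
    // shift is already baked into the value.
    function toShiftedLocalDate(timestampSeconds, dstOffset, rawOffset) {
      return new Date((timestampSeconds + dstOffset + rawOffset) * 1000);
    }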
Problem
I am trying to get the local time in the correct time zone.
I thought that new Date((timestamp + dstOffset + rawOffset) * 1000) would return the local time, but instead I find that the numeric values are indeed the local time, only labeled with the wrong time zone (UTC).
Example:
timestamp: 1456349190
readable timestamp: new Date(1456349190*1000).toGMTString()
(Wed, 24 Feb 2016 21:26:30 GMT)
https://maps.googleapis.com/maps/api/timezone/json?location=39.2079206,-84.5274616&timestamp=1456349190&key=${googleApiKey}
Returns:
- dstOffset: 0
- rawOffset: -18000
- timeZoneId: America/New_York
Data:
Sum in seconds = 1456349190 + 0 + (-18000) = 1456331190
Sum in milliseconds = 1456331190*1000 = 1456331190000
Supposed Local Date = new Date(1456331190000).toGMTString()
(Wed, 24 Feb 2016 16:26:30 GMT)
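The same arithmetic in code, for reference (values copied from the response above):

    const timestamp = 1456349190;  // request parameter
    const dstOffset = 0;           // from the API response
    const rawOffset = -18000;      // from the API response

    const shifted = new Date((timestamp + dstOffset + rawOffset) * 1000);
    console.log(shifted.toGMTString()); // "Wed, 24 Feb 2016 16:26:30 GMT"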
Question
Shouldn't readable timestamp and Supposed Local Date be the same since both are in UTC?
Seems like Supposed Local Date should really be Wed, 24 Feb 2016 16:26:30 EST
Is this correct?
If so, it seems I just need to extract the values I need (such as the local hour) from Supposed Local Date and attach the time zone the Google API returned in timeZoneId (America/New_York), since 16:26:30 is the local time I'm after.
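If that reading is right, the extraction could look roughly like this; the getUTC* accessors are used because the offsets are already folded into the value, and the last lines show an alternative (not from the question) where Intl applies the returned timeZoneId to the original timestamp directly:

    const shifted = new Date((1456349190 + 0 + (-18000)) * 1000);

    // Read the components with the UTC accessors: the shift is already applied.
    const localHour = shifted.getUTCHours();     // 16
    const localMinute = shifted.getUTCMinutes(); // 26

    // Alternative: skip the manual shift and let the runtime apply the zone
    // returned in timeZoneId to the original timestamp.
    const formatted = new Date(1456349190 * 1000).toLocaleString('en-US', {
      timeZone: 'America/New_York',
    });
    console.log(formatted); // e.g. "2/24/2016, 4:26:30 PM"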
Helpful Tips
These are some guidelines that helped me understand timestamps better:
"The unix timestamp isn't affected by a timezone setting. Setting the timezone only affects the interpretation of the timestamp value."