So I'm not sure which one to use, since my understanding of both is limited. The problem is that I want to display the time according to the time zone the user has set on their machine. For example, if the user has set their machine to EST but is physically in California, I want the time displayed in EST. Should I use ToUniversalTime() or ToLocalTime()?
ChangeDate = DBNulls.DBNullToDateTime(reader[ChangeDateOrdinal]).ToUniversalTime()?
ChangeDate = DBNulls.DBNullToDateTime(reader[ChangeDateOrdinal]).ToLocalTime()?
In that case, you would use ToLocalTime() ... but that's only going to work if the value from the database is already in UTC. If it's in some other time zone, that's a whole different matter :)
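As a sketch of that caveat (the date value here is made up, standing in for whatever `DBNulls.DBNullToDateTime` returns): a `DateTime` carries a `Kind`, and `ToLocalTime()` behaves differently depending on it, so marking the value as UTC explicitly before converting makes the intent unambiguous.

```csharp
using System;

class LocalTimeExample
{
    static void Main()
    {
        // Pretend this came out of the database and is known to be UTC.
        DateTime fromDb = new DateTime(2013, 6, 1, 15, 30, 0);

        // Tell the framework the value is UTC before converting;
        // otherwise ToLocalTime() has to guess from the Kind.
        DateTime utc = DateTime.SpecifyKind(fromDb, DateTimeKind.Utc);
        DateTime local = utc.ToLocalTime();

        // Printed in the system time zone, whatever that happens to be.
        Console.WriteLine(local);
    }
}
```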
In many - but not all - cases, storing UTC in the database is the right approach. So when you accept user input, you may want to convert that to UTC and store the UTC value... then when you use ToLocalTime() to present the value back to a user (possibly a different user in a different time zone), it will use that viewing user's system time zone.
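A minimal round trip under that store-UTC approach might look like the following (the method names are illustrative, not from the question, and it assumes the input was typed in the server's local zone):

```csharp
using System;

class UtcRoundTrip
{
    // On the way in: normalize whatever the user entered to UTC.
    // Assumes the input represents the server's local time zone.
    static DateTime ToStorageValue(DateTime userInput)
    {
        return userInput.ToUniversalTime();
    }

    // On the way out: convert the stored UTC value to the system
    // time zone of the machine doing the displaying.
    static DateTime ToDisplayValue(DateTime stored)
    {
        return DateTime.SpecifyKind(stored, DateTimeKind.Utc).ToLocalTime();
    }

    static void Main()
    {
        DateTime entered = new DateTime(2013, 6, 1, 12, 0, 0);
        DateTime stored = ToStorageValue(entered);
        Console.WriteLine(ToDisplayValue(stored));
    }
}
```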
The physical location of the machine is irrelevant - it's the system time zone that's important.
Now if you're writing a web app, the user's time zone isn't the time zone of the machine running the .NET code, which is a different problem again. You could either ship the UTC value back to the client and get JavaScript to convert it to local time, or detect the user's time zone (again with JavaScript) and perform the conversion on the server, probably using TimeZoneInfo.
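The server-side option is straightforward once you know the user's zone ID - the "Eastern Standard Time" ID below is just an example (it's a Windows zone ID; in a real app you'd plug in whatever you detected for the client):

```csharp
using System;

class ServerSideConversion
{
    static void Main()
    {
        DateTime utc = new DateTime(2013, 6, 1, 15, 30, 0, DateTimeKind.Utc);

        // Look up the user's zone and convert the stored UTC value into it.
        TimeZoneInfo zone = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
        DateTime userLocal = TimeZoneInfo.ConvertTimeFromUtc(utc, zone);

        // 11:30 in the Eastern zone (EDT, UTC-4, on this June date).
        Console.WriteLine(userLocal);
    }
}
```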
Plug: if you're willing to move away from the BCL classes, you may find that my Noda Time library helps to keep things clearer. It has different types for all the different kinds of value you might want to work with, as well as support for the most common time zone database maintained by IANA, which would probably help if you're detecting a client time zone.
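A rough sketch of what that looks like with Noda Time (assuming the NodaTime package is referenced; the zone ID is again just an example, using the IANA/TZDB provider mentioned above):

```csharp
using NodaTime;

class NodaTimeSketch
{
    static void Main()
    {
        // An Instant is a point on the global timeline,
        // independent of any time zone or calendar.
        Instant instant = Instant.FromUtc(2013, 6, 1, 15, 30);

        // Convert to a specific IANA zone for display.
        DateTimeZone zone = DateTimeZoneProviders.Tzdb["America/New_York"];
        ZonedDateTime inNewYork = instant.InZone(zone);

        System.Console.WriteLine(inNewYork);
    }
}
```

The point of the separate types is that "instant in time" and "local date/time in some zone" can't be accidentally mixed up, the way a bare `DateTime` allows.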
See more on this question at Stack Overflow.