I'm writing a 16-bit application. When I use the time function I get a value that is about 18000 seconds off, giving me a time that is hours away from the actual time. When I use the same function in a 32-bit application, it returns the correct value. What's wrong here?
The problem is with time zones. The 32-bit version of the function obtains the time zone information for your computer from the operating system. The 16-bit version must have time zone information provided in an environment variable named TZ. If you do not provide TZ, the function defaults to Eastern time, which is 5 hours (18000 seconds) behind Greenwich Mean Time. Unless the program is running on the east coast, this will not be the correct time.
To solve this problem, you must add an environment variable to your operating system in this format:
TZ=zzz[+/-]d[d][lll]
Where zzz is a three-character string representing the name of the current time zone. All three characters are required. For example, the string "PST" could be used to represent Pacific Standard Time.
[+/-]d[d] is a required field containing an optionally signed number with one or more digits. This number is the local time zone's difference from Greenwich Mean Time (GMT) in hours. Positive numbers adjust westward from GMT; negative numbers adjust eastward from GMT. For example, 5 corresponds to EST, +8 to PST, and -1 to Continental Europe.
lll is an optional three-character field that represents the local daylight saving time zone. For example, the string "PDT" could be used to represent Pacific Daylight Time.
To set environment variables on Windows NT 4, click Start/ Settings/ Control Panel/ System/ Environment. To set environment variables on Windows 95/98, add a line to your AUTOEXEC.BAT file in this format (this example is for Pacific Daylight Time):

SET TZ=PST8PDT
Last Modified: 27-OCT-99