Everyone understands the familiar years, days, hours, minutes, and seconds used to measure
the passage of time. These terms are deceptively simple. Every moment in time has exactly one
set of year-day-hour-minute-second numbers that distinguishes it from every other moment. At
this level of understanding, the familiar terms are both accurate and sufficient.
But what if you need to know how many seconds there will be between now and a year from now?
If you think you can simply multiply out the result, you'd be wrong. A year isn't a
fixed number of seconds; it's the time it takes the Earth to make one full revolution around
the sun. And, depending on how you look at it, a day doesn't have 24 hours either. If a
day is the amount of time it takes the Earth to make one full rotation, then a "day" is
almost exactly 23 hours and 56 minutes. But since the Earth revolves around the sun while
it rotates about its axis, the sun or a star will appear to cross the meridian after
approximately 24 hours. But the Earth's orbit isn't circular, and the equatorial plane
is inclined to the orbital plane. Thus a "day," measured by the sun crossing the meridian,
varies in length throughout the year. To make things worse, the sun isn't standing still
either, so the length of a year changes over time.
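The mismatch is easy to see with a little arithmetic. The sketch below (Python) compares a naive 365-day count of standard seconds against a tropical year of roughly 365.2422 mean solar days; that figure is a standard astronomical approximation assumed here, not something defined above.

```python
# Naive count: 365 days of 86,400 standard seconds each.
naive_seconds = 365 * 86_400

# A tropical year is approximately 365.2422 mean solar days -- a standard
# astronomical approximation, assumed here rather than taken from the text.
tropical_year_seconds = 365.2422 * 86_400

# The naive multiplication comes up short by almost six hours.
print(naive_seconds)                                 # 31536000
print(round(tropical_year_seconds - naive_seconds))  # 20926
```

That leftover quarter-day is, of course, why calendars need leap years.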
Let's Get Technical
Sidereal time is the hour angle of the vernal equinox, the ascending node of the
ecliptic on the celestial equator. The daily motion of this point provides a
measure of the rotation of the Earth with respect to the stars, rather than
the Sun. Local mean sidereal time is computed from the current Greenwich Mean
Sidereal Time plus an input offset in longitude (converted to a sidereal offset
by the ratio 1.00273790935 of the mean solar day to the mean sidereal day).
Applying the equation of equinoxes, or nutation of the mean pole of the Earth
from mean to true position, yields local apparent sidereal time. Astronomers
use local sidereal time because it corresponds to the coordinate right
ascension of a celestial body that is presently on the local meridian.
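The conversion described above can be sketched in a few lines of Python. The GMST value is an input placeholder here (computing it requires an astronomical ephemeris, which is beyond this sketch), and the ratio is the one quoted in the text.

```python
# Ratio of the mean solar day to the mean sidereal day (quoted above).
SOLAR_TO_SIDEREAL = 1.00273790935

def local_mean_sidereal_time(gmst_hours: float, longitude_deg_east: float) -> float:
    """Local mean sidereal time, in hours, from GMST plus a longitude offset.

    The longitude is first expressed as a time offset (15 degrees per hour of
    mean solar time), then scaled to sidereal units by the solar/sidereal
    ratio, as the text describes.  GMST itself is assumed as an input.
    """
    offset_hours = longitude_deg_east / 15.0        # degrees -> solar hours
    sidereal_offset = offset_hours * SOLAR_TO_SIDEREAL
    return (gmst_hours + sidereal_offset) % 24.0    # wrap into [0, 24)

print(local_mean_sidereal_time(12.0, 0.0))   # at Greenwich, LMST == GMST
```

At zero longitude the offset vanishes, so local and Greenwich mean sidereal time coincide.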
International Standard Time Terms
Mean Solar Day
24 hours. An average value used for convenience. Usually corresponds to 86,400 standard
seconds.
Standard Second
A relatively invariant amount of time. There are usually 60 standard seconds to the standard minute,
60 standard minutes to the standard hour, and 24 standard hours to the standard day.
The length of a standard second is derived from the decay time of cesium atoms. It
is close to, but not identical with, the length of time measured by a solar second.
By international agreement, a standard second is "the duration of 9,192,631,770 periods
of the radiation corresponding to the transition between the two hyperfine levels of
the ground state of the cesium atom." (Did you really want to know that?)
Because the standard second is stable, while the motion of the Earth is not, the
standard minute is occasionally adjusted by adding a leap second. Some
minutes, therefore, have 61 seconds. Standard hours always have 60 minutes, even if
one of the minutes is a second longer than usual. Standard days always have 24 hours.
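Leap-second timestamps look odd but are well-formed. Python's `time.strptime`, for example, accepts a seconds field of 60 precisely so that the last second of a 61-second minute can be parsed (a general illustration, not part of this document's subject software):

```python
import time

# A leap-second timestamp: the final minute of this day has 61 seconds,
# numbered 0 through 60.  (2016-12-31 really did carry a leap second.)
parsed = time.strptime("2016-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")

print(parsed.tm_sec)  # 60
```

Many other APIs instead smear or repeat a second around the event, which is why leap seconds matter to timekeeping software even though humans never notice them.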
Greenwich Mean Time (GMT)
GMT is a mathematical mean, defined in terms of the solar second, measured at the former location
of the Royal Observatory in England, located on the Prime Meridian (zero degrees longitude).
GMT is useful for navigation (when converted to UT1, which is outside the scope of this
document), but not for time-keeping programs.
When speaking loosely, GMT and Coordinated Universal Time (UTC) are roughly the same,
but careful speakers will note that GMT is based on the solar second, while UTC is based
on the standard second. The delta between the two time systems is too small for humans
to notice, but very important for computer programs and universal synchronization.
Coordinated Universal Time (UTC)
There are seven "universal" times, all currently within one second of each other, used for various
purposes. The Coordinated Universal Time is the one used for timesetting. The abbreviation UTC
is a language-independent international abbreviation. It means both
"Coordinated Universal Time" and "Temps Universel Coordonné." (If it strikes you as odd that
neither word order actually abbreviates to UTC, you're not alone: the abbreviation was chosen
as a compromise between the English and French orders.)
UTC is the time-zone-independent reference standard used by time-keeping programs. UTC is based on the
standard second, so varies occasionally from GMT, but is corrected by the periodic application
of a leap second.
Domain Time uses UTC internally when serving and obtaining the time. Conversion to local
time (i.e., the time as defined by your time zone and current daylight-savings corrections)
is performed for display purposes only. Thus, any Domain Time Server anywhere in the
world can provide the time to any Domain Time Client. Each machine will display local time,
but communicate using UTC. Time zones and daylight-savings time do not affect Domain Time's
internal operation.
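The same keep-UTC-internally pattern is easy to demonstrate in Python (a general illustration of the idea, not Domain Time's actual code):

```python
from datetime import datetime, timezone

# Store and exchange timestamps in UTC...
moment_utc = datetime.now(timezone.utc)

# ...and convert to the machine's local zone only for display.
moment_local = moment_utc.astimezone()

# Both values name the same instant; only the presentation differs.
assert moment_utc == moment_local
print(moment_utc.isoformat(), "->", moment_local.isoformat())
```

Because aware datetimes compare by instant rather than by wall-clock digits, the equality holds no matter which time zone the machine displays.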
PC Timekeeping Terms
Accuracy
The degree of conformity of a measured or calculated value to its definition,
or with respect to a standard reference. Also the closeness of a measurement
to the true value as fixed by a universally accepted standard. For PC timekeeping,
accuracy usually means "how closely this clock matches the reference clock" within
the ability to measure the difference.
Precision
The degree of mutual agreement among a series of individual measurements. Precision
is often, but not necessarily, expressed by the standard deviation of the measurements.
Also, random uncertainty of a measured value, expressed by the standard deviation or
by a multiple of a standard deviation. For PC timekeeping, precision refers to
how well a particular machine keeps the time, determined by the short-term fluctuations
in frequency of the computer clock's oscillator, and by measurement errors introduced
by the overall clock resolution, interrupt latencies, preemption latencies, and
processor load. Although precision has a precise technical definition, the word
is often used in casual speech as if it were synonymous with accuracy.
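The distinction can be made concrete with a handful of clock-offset samples. The numbers below are invented for illustration; they stand for offsets of a PC clock from a reference clock, in milliseconds:

```python
from statistics import mean, stdev

# Hypothetical offsets of a PC clock from a reference clock, in ms.
offsets_ms = [4.8, 5.1, 5.0, 4.9, 5.2]

# Accuracy: how far the clock sits from the reference on average.
accuracy_ms = mean(offsets_ms)

# Precision: how tightly the individual readings agree with one another,
# expressed here as the standard deviation.
precision_ms = stdev(offsets_ms)

print(f"mean offset {accuracy_ms:.2f} ms, spread {precision_ms:.3f} ms")
```

A clock could be very precise (tiny spread) yet inaccurate (large mean offset), or vice versa; the two measures are independent.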
Resolution
The degree to which a measurement can be determined is called the resolution of the
measurement, i.e., the smallest significant difference that can be measured with a given
instrument. For example, a measurement made with a time interval counter might have
a resolution of 10 nanoseconds; periods of time smaller than the resolution are imperceptible
to the instrument doing the measuring.
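One way to probe a software clock's effective resolution is to read it in a tight loop and record the smallest nonzero step. This is a sketch only; the result depends on the platform, the load, and which clock is queried:

```python
import time

def observed_resolution(samples: int = 10_000) -> float:
    """Smallest nonzero step seen between successive time.perf_counter() reads."""
    smallest = float("inf")
    last = time.perf_counter()
    for _ in range(samples):
        now = time.perf_counter()
        step = now - last
        if 0 < step < smallest:
            smallest = step
        last = now
    return smallest

print(observed_resolution())
```

Note that this measures the combination of clock granularity and call overhead, which is exactly the point made above: measurement errors and latencies are part of the observed resolution.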
Monotonicity
If a clock has monotonicity, then each successive time reading from that clock
will yield a time no earlier than the previous reading. Rapid successive
checks may yield the same time as the last reading if the check interval is smaller
than the resolution of the clock or the test instrument. Monotonicity is very important
for many tasks, such as elapsed time measurement, time-date stamps on documents or
events, system logs, and so forth. Precision clocks and hardware oscillators almost always
display monotonicity due to their physical nature of incrementing a counter at a fixed
interval, but software clocks are under no such restriction. Good PC timekeeping
software restricts backward-travelling clocks as much as possible within the constraints
set by the machine's administrator.
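A software clock can be wrapped so that it refuses to travel backward, much as the text describes. The class below is a minimal sketch of the idea, not Domain Time's implementation:

```python
import time

class MonotonicizedClock:
    """Wrap a clock source so that reads never travel backward."""

    def __init__(self, source=time.time):
        self._source = source
        self._last = source()

    def read(self) -> float:
        now = self._source()
        if now < self._last:
            # The underlying clock stepped backward (e.g. it was reset);
            # hold the last value rather than report a time in the past.
            now = self._last
        self._last = now
        return now

# Simulate a clock that steps backward between the 2nd and 3rd reads.
readings = iter([100.0, 101.0, 99.0, 102.0])
clock = MonotonicizedClock(source=lambda: next(readings))
print(clock.read(), clock.read(), clock.read())  # 101.0 101.0 102.0
```

Holding the last value is only one policy; real timekeeping software may instead slew the clock gradually, subject to the administrator's constraints mentioned above.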
Time Measurement Terms
Millisecond
One-thousandth of a second (0.001). Typically the best accuracy that can be achieved on
a standard PC.
Microsecond
One millionth of a second (0.000001). Used primarily by lab equipment, GPS receivers, or
other specialized timing hardware.
Hectonanosecond
100 nanoseconds (0.0000001), or 1/10th of a microsecond. The base internal unit used by
Windows NT and Windows 2000 machines to track the passage of time. Windows calculates
elapsed time as x hectonanoseconds per clock interrupt. The value of x varies depending
on the machine's hardware, operating system, and configuration.
Nanosecond
One billionth of a second (0.000000001). Used primarily by physicists and electronics
engineers in measurements of very small intervals, such as RAM access times, or the time
it takes light to cross a room.
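The 100-nanosecond unit appears directly in the Windows FILETIME format, which counts hectonanosecond ticks since January 1, 1601 (UTC). Converting from a Unix timestamp is simple arithmetic; the epoch-difference constant below is the widely published value, stated here as background rather than taken from this document:

```python
# Hectonanosecond ticks per second: 10 million (1 s / 100 ns).
TICKS_PER_SECOND = 10_000_000

# Seconds between the Windows epoch (1601-01-01) and the Unix epoch
# (1970-01-01) -- a widely published constant, assumed here.
EPOCH_DELTA_SECONDS = 11_644_473_600

def unix_to_filetime(unix_seconds: float) -> int:
    """Unix timestamp (seconds) -> Windows FILETIME (100-ns ticks)."""
    return int((unix_seconds + EPOCH_DELTA_SECONDS) * TICKS_PER_SECOND)

print(unix_to_filetime(0))  # 116444736000000000
```

Ten million ticks per second is why a 64-bit tick counter comfortably spans tens of thousands of years.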
PC Timer Resolution
Approximately 0.055 seconds, or about 1/18th of a second. The interval between hardware interrupts
on a standard PC. This value represents the best resolution that can be obtained when setting
or reading the clock under Windows 95, Windows 98, or Windows ME.
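The roughly-1/18th-second figure comes from the original PC's programmable interval timer: a 1.193182 MHz input clock divided by the maximum 16-bit divisor of 65,536. These are standard PC-architecture values, stated here as background rather than taken from this document:

```python
PIT_INPUT_HZ = 1_193_182   # 8253/8254 timer input clock, approximately
MAX_DIVISOR = 65_536       # a 16-bit counter rolls over at 65,536

tick_hz = PIT_INPUT_HZ / MAX_DIVISOR   # about 18.2 interrupts per second
tick_seconds = 1 / tick_hz             # about 0.0549 s between interrupts

print(f"{tick_hz:.4f} Hz -> {tick_seconds * 1000:.2f} ms per tick")
```

The exact quotient is about 18.2 Hz, or 54.9 ms per tick, which is why elapsed-time counts on these systems advance in such coarse steps.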
Kilosecond
1000 seconds, or approximately 16 minutes. The amount of time it takes the average person
to read this page.