Time and Frequency from A to Z: Ti
Time Base

An oscillator found inside an electronic instrument that serves as a reference for all of the time and frequency functions performed by that instrument. The time base oscillator in most instruments is a quartz oscillator, often an OCXO. However, some instruments now use rubidium oscillators as their time base.
Time Code

A code (usually digital) that contains enough information to synchronize a clock to the correct time-of-day. Most time codes contain the UTC hour, minute, and second; the month, day, and year; and advance warning of daylight saving time and leap seconds. NIST time codes can be obtained from the WWV, WWVH, WWVB, ACTS, and Internet Time Services. The format of the WWV/WWVH time code is shown in the graphic below.
Time Deviation (TDEV)

A statistic used to estimate time stability, based on the Modified Allan deviation. The time deviation is particularly useful for analyzing time transfer data, such as the results of a GPS common-view measurement.
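As a concrete illustration of the relationship between the two statistics, TDEV can be computed from phase (time-error) data via TDEV(τ) = (τ/√3)·MDEV(τ). The sketch below is a minimal Python implementation assuming evenly spaced phase samples; the function name and interface are illustrative, not taken from any particular library.

```python
import math

def tdev(phase, tau0, m):
    """Time deviation TDEV(tau) at tau = m * tau0, computed from
    evenly spaced phase (time-error) samples, in seconds.

    Uses TDEV = (tau / sqrt(3)) * MDEV, where MDEV is the modified
    Allan deviation built from second differences of m-sample
    averages of the phase data."""
    n = len(phase)
    if n < 3 * m:
        raise ValueError("need at least 3*m phase samples")
    tau = m * tau0
    terms = []
    for j in range(n - 3 * m + 1):
        # Second difference of three m-sample phase averages,
        # written as a sum over the window starting at j.
        s = sum(phase[i + 2 * m] - 2 * phase[i + m] + phase[i]
                for i in range(j, j + m))
        terms.append(s * s)
    mdev_sq = sum(terms) / (2.0 * m * m * tau * tau * len(terms))
    return (tau / math.sqrt(3.0)) * math.sqrt(mdev_sq)
```

Note that a pure frequency offset (a linear phase ramp) yields TDEV = 0, since the second differences cancel exactly; only noise and drift contribute.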
Time Interval

The elapsed time between two events. In time and frequency metrology, time interval is usually measured in small fractions of a second, such as milliseconds, microseconds, or nanoseconds. Coarse time interval measurements can be made with a stopwatch. Higher resolution time interval measurements are often made with a time interval counter.
Time Interval Counter (TIC)

An instrument used to measure the time interval between two signals. A time interval counter (TIC) has inputs for two electrical signals. One signal starts the counter and the other signal stops it.
TICs differ in specification and design, but they all contain several basic parts: the time base, the main gate, and the counting assembly. The time base provides evenly spaced pulses used to measure time interval. It is usually an internal quartz oscillator that can often be phase locked to an external reference, and it must be stable because time base errors directly affect the measurements. The main gate controls the time at which the count begins and ends. Pulses passing through the gate are routed to the counting assembly, where they are displayed on the TIC's front panel or read by a computer. The counter can then be reset (or armed) to begin another measurement.

The stop and start inputs are usually provided with level controls that set the amplitude limit (or trigger level) at which the counter responds to input signals. If the trigger levels are set improperly, a TIC might stop or start when it detects noise or other unwanted signals and produce invalid measurements.
The graphic below illustrates how a TIC measures the interval between two signals. The TIC begins measuring a time interval when the start signal reaches its trigger level and stops measuring when the stop signal reaches its trigger level. The time interval between the start and stop signals is measured by counting cycles from the time base. The measurements produced by a TIC are in time units such as microseconds or nanoseconds. These measurements assign a time value to the phase difference between the reference and the test signal.
The most important specification of a TIC is resolution. In traditional TIC designs, the resolution is limited to the period of the TIC's time base frequency. For example, a TIC with a 10 MHz time base would be limited to a resolution of 100 ns. This is because traditional TIC designs count whole time base cycles to measure time interval and cannot resolve time intervals smaller than the period of one cycle. To improve this situation, some TIC designers have multiplied the time base frequency to get more cycles and thus more resolution. For example, multiplying the time base frequency to 100 MHz makes 10 ns resolution possible, and 1 ns counters have even been built using a 1 GHz time base. However, a more common way to increase resolution is to detect parts of a time base cycle through interpolation and not be limited by the number of whole cycles. Interpolation has made 1 ns TICs commonplace, and even 20 picosecond TICs are available.
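The resolution argument can be sketched numerically. In this idealized Python model (the function names are hypothetical, and real counters also contend with trigger noise and time base error), a traditional counter truncates the interval to whole time-base periods, while an interpolating counter resolves fractions of a period:

```python
import math

def tic_whole_cycles(t_start, t_stop, timebase_hz):
    """Traditional design: count whole time-base cycles between the
    start and stop triggers, so resolution is one full period."""
    period = 1.0 / timebase_hz
    return math.floor((t_stop - t_start) / period) * period

def tic_interpolated(t_start, t_stop, timebase_hz, interp_bins=100):
    """Interpolating design: each period is effectively subdivided
    into interp_bins parts, improving resolution to
    period / interp_bins."""
    step = (1.0 / timebase_hz) / interp_bins
    return math.floor((t_stop - t_start) / step) * step
```

With a 10 MHz time base, a 250 ns interval reads as 200 ns on the whole-cycle counter, but as 250 ns (to within 1 ns) when each period is interpolated 100-fold.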
Time-of-Day

The information displayed by a clock or calendar, usually including the hour, minute, second, month, day, and year. Time codes derived from a reference source such as UTC(NIST) are often used to synchronize clocks to the correct time of day.
Time Offset

The difference between a measured on-time pulse or signal, and a reference on-time pulse or signal, such as UTC(NIST). Time offset measurements are usually made with a time interval counter. The measurement result is usually reported in fractions of a second, such as milliseconds, microseconds, or nanoseconds.
Time Protocol

An Internet time code protocol defined by the RFC-868 document and supported by the NIST Internet Time Service. The time code is sent as a 32-bit unformatted binary number that represents the time in UTC seconds since January 1, 1900. The server listens for Time Protocol requests on port 37, and responds in either TCP/IP or UDP/IP formats. Conversion from UTC to local time (if necessary) is the responsibility of the client program. The 32-bit binary format can represent times over a span of about 136 years with a resolution of 1 second. There is no provision for increasing the resolution or extending the range of years.
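The epoch arithmetic involved can be shown in a few lines of Python. The 2,208,988,800-second constant is the span between the 1900 epoch used by RFC-868 and the 1970 Unix epoch; the function names are illustrative. The TCP query is included as a sketch and requires a live server (such as a NIST ITS server) to actually run.

```python
import socket
import struct
import datetime

# Span in seconds between the RFC-868 epoch (1900-01-01 UTC)
# and the Unix epoch (1970-01-01 UTC).
RFC868_UNIX_OFFSET = 2208988800

def rfc868_to_datetime(raw):
    """Decode the 4-byte big-endian Time Protocol payload into a
    timezone-aware UTC datetime."""
    (seconds_since_1900,) = struct.unpack("!I", raw)
    return datetime.datetime.fromtimestamp(
        seconds_since_1900 - RFC868_UNIX_OFFSET, datetime.timezone.utc)

def query_time_server(host, timeout=5.0):
    """Query a Time Protocol server over TCP on port 37; the server
    sends the 32-bit timestamp and closes the connection."""
    with socket.create_connection((host, 37), timeout=timeout) as s:
        raw = s.recv(4)
    return rfc868_to_datetime(raw)
```

As the entry notes, conversion to local time is left entirely to the client; the decoded value above is UTC.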
Time Scale

An agreed upon system for keeping time. All time scales use a frequency source to define the length of the second, which is the standard unit of time interval. Seconds are then counted to measure longer units of time interval, such as minutes, hours, and days. Modern time scales such as UTC define the second based on an atomic property of the cesium atom, and thus standard seconds are produced by cesium oscillators. Earlier time scales (including earlier versions of Universal Time) were based on astronomical observations that measured the frequency of the Earth's rotation.
Time Transfer

A measurement technique used to send a reference time or frequency from a source to a remote location. Time transfer involves the transmission of an on-time marker or a time code. The most common time transfer techniques are one-way, common-view, and two-way time transfer.
Time Zone

A geographical region that maintains a local time that usually differs by an integral number of hours from UTC. Time zones were initially instituted by the railroads in the United States and Canada during the 1880s to standardize timekeeping. Within several years the use of time zones had expanded internationally.
Ideally, the world would be divided into 24 time zones of equal width. Each zone would have an east-west dimension of 15° of longitude centered upon a central meridian. This central meridian for a zone is defined in terms of its position relative to a universal reference, the prime meridian (often called the zero meridian) located at 0° longitude. In other words, the central meridian of each zone has a longitude divisible by 15°. When the sun is directly above this central meridian, local time at all points within that time zone would be noon. In practice, the boundaries between time zones are often modified to accommodate political boundaries in the various countries. A few countries use a local time that differs by one half hour from that of the central meridian.
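The 15-degrees-per-hour geometry described above reduces to one line of arithmetic. The sketch below uses a hypothetical function name and gives only the idealized offset; as noted, real zone boundaries follow political adjustments.

```python
def nominal_zone_offset(longitude_deg):
    """Idealized UTC offset, in hours, for a location at the given
    longitude (degrees, east positive): the nearest central meridian
    is a multiple of 15 degrees, and each 15 degrees equals one hour.
    Longitudes exactly halfway between meridians sit on an ideal
    zone boundary, so either neighboring offset could apply."""
    return round(longitude_deg / 15.0)
```

For example, a longitude of -75° (the central meridian of the U.S. Eastern zone) gives an idealized offset of -5 hours.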
Converting UTC to local time, or vice versa, requires knowing the number of time zones between the prime meridian and your local time zone. It is also necessary to know whether Daylight Saving Time (DST) is in effect, since UTC does not observe DST. The table below shows the difference between UTC and local time for the major United States time zones.
| Time Zone | Difference from UTC During Standard Time | Difference from UTC During Daylight Time |
|-----------|------------------------------------------|------------------------------------------|
| Pacific   | -8 hours                                 | -7 hours                                 |
| Mountain  | -7 hours                                 | -6 hours                                 |
| Central   | -6 hours                                 | -5 hours                                 |
| Eastern   | -5 hours                                 | -4 hours                                 |
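Using the offsets in the table above, UTC-to-local conversion is simple modular arithmetic. A minimal Python sketch (the function name is illustrative, and the date change when the result wraps past midnight is ignored):

```python
# Standard-time offsets from the table above (hours added to UTC).
STANDARD_OFFSET = {"Pacific": -8, "Mountain": -7, "Central": -6, "Eastern": -5}

def utc_to_local_hour(utc_hour, zone, dst=False):
    """Convert a UTC hour (0-23) to the local hour in a major U.S.
    time zone. During daylight time each offset moves one hour
    closer to UTC, since UTC itself never observes DST."""
    offset = STANDARD_OFFSET[zone] + (1 if dst else 0)
    return (utc_hour + offset) % 24
```

For example, 18:00 UTC corresponds to 13:00 Eastern Standard Time, or 14:00 Eastern Daylight Time.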