Time standard

Astronomical observations of several kinds, including eclipse records, studied in the 19th century raised suspicions that the rate at which Earth rotates is gradually slowing and also shows small-scale irregularities; this was confirmed in the early twentieth century.

Time standards based on Earth rotation were replaced (or initially supplemented) for astronomical use from 1952 onwards by an ephemeris time standard based on the Earth's orbital period and in practice on the motion of the Moon.

Other intervals of time (minutes, hours, and years) are usually defined in terms of these two units, the second and the day.

In the late 1940s, quartz crystal oscillator clocks could measure time more accurately than the rotation of the Earth.

Since 1967, the SI base unit for time has been the SI second, defined as exactly "the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom" (at a temperature of 0 K and at mean sea level).
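Because the caesium transition frequency is exact by definition, converting between a count of radiation periods and elapsed SI seconds is simple arithmetic, as this minimal sketch illustrates (the constant is the defined value; the function name is illustrative):

```python
# The SI second is fixed by the caesium-133 hyperfine transition frequency.
CS133_HYPERFINE_HZ = 9_192_631_770  # periods per SI second, exact by definition

def periods_to_seconds(n_periods: int) -> float:
    """Elapsed time represented by n periods of the caesium radiation."""
    return n_periods / CS133_HYPERFINE_HZ

# One period lasts about 1.088e-10 s; the defined count is exactly one second.
assert periods_to_seconds(CS133_HYPERFINE_HZ) == 1.0
```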

TAI is produced by the International Bureau of Weights and Measures (BIPM), and is based on the combined input of many atomic clocks around the world,[13] each corrected for environmental and relativistic effects (both gravitational and velocity-related, as in GNSS).

At high precision, Earth's rotation is irregular and is determined from the positions of distant quasars using very long baseline interferometry, laser ranging of the Moon and artificial satellites, and GPS satellite orbits.

The offset is chosen such that a new day starts approximately while the Sun is crossing the nadir meridian.
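Since the Sun advances 15° of longitude per hour, the nominal offset for a location follows from its longitude; the sketch below illustrates this rule of thumb (the function name is illustrative, and real legal time zones follow political borders rather than meridians):

```python
def nominal_utc_offset_hours(longitude_deg: float) -> int:
    """Nominal whole-hour UTC offset for a longitude (east positive).

    Illustrative only: actual time-zone boundaries are political,
    so legal offsets often differ from this meridian-based value.
    """
    return round(longitude_deg / 15.0)  # 360 deg / 24 h = 15 deg per hour

# Nominal examples (ignoring legal zone boundaries):
# Greenwich (0 deg)  -> 0
# New York (-74 deg) -> -5
# Tokyo (139.7 deg)  -> +9
```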

Alternatively, the difference is not fixed: it changes twice a year by a round amount, usually one hour (see Daylight saving time).

Conversions between atomic time systems (TAI, GPST, and UTC) are for the most part exact.[15]

Conversions for UT1 and TT rely on published difference tables, which as of 2022 are specified to 10 microseconds and 0.1 nanoseconds respectively.
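The exact conversions among the atomic scales are fixed offsets: GPST lags TAI by exactly 19 s, TT (formerly TDT) leads TAI by exactly 32.184 s, and UTC lags TAI by an integer number of leap seconds (37 s since 2017, a value that future leap seconds would change). A minimal sketch, with illustrative function names:

```python
# Fixed offsets among atomic time scales, in SI seconds.
# TAI - UTC is a table lookup in practice; 37 s has applied since
# 2017-01-01, but future leap seconds change it, so keep it current.
TAI_MINUS_GPST = 19.0   # exact, fixed when GPS time began in 1980
TT_MINUS_TAI = 32.184   # exact by definition of TT (formerly TDT)
TAI_MINUS_UTC = 37.0    # leap-second count valid from 2017-01-01

def tai_from_utc(utc_seconds: float) -> float:
    return utc_seconds + TAI_MINUS_UTC

def gpst_from_tai(tai_seconds: float) -> float:
    return tai_seconds - TAI_MINUS_GPST

def tt_from_tai(tai_seconds: float) -> float:
    return tai_seconds + TT_MINUS_TAI

# At a given instant, GPST leads UTC by 18 s and TT leads UTC by 69.184 s.
```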

In astronomy, sidereal time is used to predict when a star will reach its highest point in the sky.
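A star culminates when the local sidereal time equals its right ascension, so predicting the event reduces to computing sidereal time. The sketch below uses a well-known low-precision approximation for Greenwich Mean Sidereal Time; precise work uses the full IAU expressions with UT1, not this shortcut:

```python
# Approximate Greenwich Mean Sidereal Time from a Julian Date.
# Low-precision formula; precise work uses the full IAU series and UT1.
J2000_JD = 2451545.0

def gmst_hours(jd_ut: float) -> float:
    d = jd_ut - J2000_JD                      # days since J2000.0
    gmst = 18.697374558 + 24.06570982441908 * d
    return gmst % 24.0                        # wrap into [0, 24) hours

# A star reaches its highest point when local sidereal time (GMST
# adjusted for the observer's longitude) equals its right ascension.
```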

For accurate astronomical work on land, it was usual to observe sidereal time rather than solar time in order to derive mean solar time, because observations of the 'fixed' stars could be measured and reduced more accurately than observations of the Sun (despite the need for various small corrections, for refraction, aberration, precession, nutation and proper motion).

Ephemeris time (ET) was in use for the official almanacs and planetary ephemerides from 1960 to 1983, and was replaced in official almanacs for 1984 and after by the numerically integrated Jet Propulsion Laboratory Development Ephemeris DE200 (based on the JPL relativistic coordinate time scale Teph).

For applications at the Earth's surface, ET's official replacement was Terrestrial Dynamical Time (TDT), which maintained continuity with it.

TDT is a uniform atomic time scale, whose unit is the SI second.

For the calculation of ephemerides, Barycentric Dynamical Time (TDB) was officially recommended to replace ET.

As defined, TCB (as observed from the Earth's surface) runs at a divergent rate relative to all of ET, Teph and TDT/TT;[30] the same is true, to a lesser extent, of TCG.
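The divergence of TCB is linear and its rate is a defining constant: the IAU 2006 relation is TDB = TCB − L_B × (JD_TCB − T0) × 86400 + TDB0, so TCB gains on TDB (and hence, very nearly, on TT) at about half a second per year. A sketch using the defining constants (the function name is illustrative):

```python
# Defining linear relation between TCB and TDB (IAU 2006 Resolution B3):
# TDB = TCB - L_B * (JD_TCB - T0) * 86400 + TDB0
L_B = 1.550519768e-8    # defining rate constant
TDB0 = -6.55e-5         # seconds, offset at the epoch T0
T0 = 2443144.5003725    # Julian Date (TCB) of the 1977 origin epoch

def tcb_minus_tdb(jd_tcb: float) -> float:
    """Seconds by which TCB is ahead of TDB at a given TCB Julian Date."""
    return L_B * (jd_tcb - T0) * 86400.0 - TDB0

seconds_per_julian_year = 365.25 * 86400.0
drift_per_year = L_B * seconds_per_julian_year  # about 0.49 s per year
```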

The ephemerides of the Sun, Moon and planets in current widespread and official use continue to be those calculated at the Jet Propulsion Laboratory (updated in 2003 to DE405) using Teph as the argument.