Unix time

Unix time measures time as the number of non-leap seconds that have elapsed since 00:00:00 UTC on 1 January 1970, the Unix epoch.
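For illustration, a minimal Python sketch (the date chosen is arbitrary) of how a UTC calendar date maps onto a Unix time number, using the standard library's calendar.timegm, which counts exactly 86400 seconds per day:

```python
import calendar
import time

# Unix time for 2004-09-16 00:00:00 UTC, computed from the calendar date.
# calendar.timegm interprets the tuple as UTC and counts 86400 seconds
# per day, matching the "non-leap seconds" definition above.
ts = calendar.timegm((2004, 9, 16, 0, 0, 0, 0, 0, 0))
print(ts)             # 1095292800

# The current Unix time, as reported by the system clock.
print(int(time.time()))
```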

In modern computing, values are sometimes stored with higher granularity, such as microseconds or nanoseconds.
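For example, Python's standard library (used here purely as an illustration) exposes both resolutions directly:

```python
import time

print(time.time())     # seconds since the epoch, as a float with sub-second precision
print(time.time_ns())  # nanoseconds since the epoch, as an integer
```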

International Atomic Time (TAI), in which every day is precisely 86400 seconds long, ignores solar time and gradually loses synchronization with the Earth's rotation at a rate of roughly one second per year.

[3] On a normal UTC day, which has a duration of 86400 seconds, the Unix time number changes in a continuous manner across midnight.
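A small Python check (assuming the day in question contains no leap second) makes the continuity concrete: midnight UTC lands on an exact multiple of 86400, and the count simply carries on through it:

```python
import calendar

# Unix time at 2004-09-17 00:00:00 UTC, an ordinary midnight.
midnight = calendar.timegm((2004, 9, 17, 0, 0, 0, 0, 0, 0))
print(midnight, midnight % 86400)  # 1095379200 0
print(midnight - 1)                # last second of 2004-09-16 (23:59:59 UTC)
print(midnight + 1)                # first full second of 2004-09-17 (00:00:01 UTC)
```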

A Unix clock is often implemented with a different type of positive leap second handling, associated with the Network Time Protocol (NTP), rather than simply repeating the time number during the leap second.

Very briefly, the system shows a nominally impossible time number, but this can be detected by the TIME_OOP state and corrected.

In these systems it is necessary to consult a table of leap seconds to correctly convert between UTC and the pseudo-Unix-time representation.
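One way to sketch this in Python: given a standard (leap-second-ignoring) Unix time, the table yields the corresponding count that does include the inserted leap seconds. The table here is deliberately abridged to two real entries, and the function names are illustrative rather than any standard API:

```python
import calendar

# Abridged leap second table: (first Unix second after the leap second,
# cumulative leap seconds inserted since 1972).  A real converter needs
# the complete published list.
LEAP_TABLE = [
    (calendar.timegm((2015, 7, 1, 0, 0, 0, 0, 0, 0)), 26),
    (calendar.timegm((2017, 1, 1, 0, 0, 0, 0, 0, 0)), 27),
]

def leap_seconds_at(unix_time):
    """Cumulative leap seconds inserted at or before the given Unix time."""
    count = 0
    for threshold, total in LEAP_TABLE:
        if unix_time >= threshold:
            count = total
    return count

def unix_to_elapsed_si_seconds(unix_time):
    """Approximate count of actual SI seconds elapsed since the epoch
    (ignoring the pre-1972 era), i.e. Unix time corrected by the table."""
    return unix_time + leap_seconds_at(unix_time)
```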

In some applications the number is simply represented textually as a string of decimal digits, raising only trivial additional problems.

The software development platform for version 6 of the QNX operating system has an unsigned 32-bit time_t, though older releases used a signed type.

[10][11] These structures, such as the POSIX struct timeval (seconds and microseconds) and struct timespec (seconds and nanoseconds), provide a decimal-based fixed-point data format, which is useful for some applications, and trivial to convert for others.
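For instance, a seconds-plus-nanoseconds pair of the kind held in struct timespec combines into a single decimal value trivially; a Python sketch (the helper name is ours):

```python
from decimal import Decimal

def timespec_to_decimal(tv_sec, tv_nsec):
    """Combine a (seconds, nanoseconds) pair into one decimal number of seconds."""
    return Decimal(tv_sec) + Decimal(tv_nsec) / Decimal(10**9)

print(timespec_to_decimal(1095292800, 250000000))  # 1095292800.25
```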

The meaning of Unix time values below +63072000 (i.e., prior to 1 January 1972) is not precisely defined.

Computers of that era rarely had clocks set sufficiently accurately to provide meaningful sub-second timestamps in any case.

[12] A likely means of executing this change, the proposed abolition of leap seconds from UTC, is to define a new time scale, called International Time, that initially matches UTC but thereafter has no leap seconds, thus remaining at a constant offset from TAI.

Uncertainty about whether this will occur makes prospective Unix time no less predictable than it already is: if UTC were simply to have no further leap seconds the result would be the same.

[13][14] The current epoch of 1 January 1970 00:00:00 UTC was selected arbitrarily by Unix engineers because it was considered a convenient date to work with.

Computer clocks of the era were not set precisely enough to establish a precedent one way or the other.

Almost all modern programming languages provide APIs for working with Unix time or converting it to other data structures.
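In Python, for example, the round trip between a Unix timestamp and a richer date-time object is a one-liner in each direction:

```python
from datetime import datetime, timezone

ts = 1095292800
dt = datetime.fromtimestamp(ts, tz=timezone.utc)  # Unix time -> datetime
print(dt)                                         # 2004-09-16 00:00:00+00:00
print(int(dt.timestamp()))                        # datetime -> Unix time, 1095292800
```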

[16][17][18] iOS provides a Swift API which defaults to using an epoch of 1 January 2001 but can also be used with Unix timestamps.
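The two epochs differ by a fixed number of seconds, so converting between them is a single subtraction; a hedged Python illustration (the offset is derived rather than hard-coded, and the helper name is ours):

```python
import calendar

# Seconds from the Unix epoch (1970-01-01) to the 2001-01-01 reference date.
REFERENCE_DATE_OFFSET = calendar.timegm((2001, 1, 1, 0, 0, 0, 0, 0, 0))
print(REFERENCE_DATE_OFFSET)  # 978307200

def unix_to_reference_seconds(unix_time):
    """Convert a Unix timestamp to seconds since the 2001 reference date."""
    return unix_time - REFERENCE_DATE_OFFSET
```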

Java provides an Instant object which holds a Unix timestamp in both seconds and nanoseconds.

[24] Filesystems designed for use with Unix-based operating systems tend to use Unix time.

[25][26] Several archive file formats can store timestamps in Unix time, including RAR and tar.

[27][28] Unix time is also commonly used to store timestamps in databases, including in MySQL and PostgreSQL.

The early cutoff can have an impact on databases that store historical information; in some databases where 32-bit Unix time is used for timestamps, it may be necessary to store time in a different type of field, such as a string, to represent dates before December 1901.

[31]: 60 Date range cutoffs are not an issue with 64-bit representations of Unix time, as the effective range of dates representable with Unix time stored in a signed 64-bit integer is over 584 billion years, or 292 billion years in either direction of the 1970 epoch.
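Both cutoffs follow directly from the integer widths, as a quick Python check shows (the 1901 and 2038 endpoints are exact; the year figure is rounded):

```python
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Signed 32-bit limits: roughly 13 December 1901 to 19 January 2038.
print(epoch + timedelta(seconds=-2**31))     # 1901-12-13 20:45:52+00:00
print(epoch + timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07+00:00

# Signed 64-bit range, in Julian years in each direction of the epoch.
print(2**63 / (365.25 * 86400))              # about 2.92e11, i.e. ~292 billion years
```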

On Windows, the FILETIME type stores time as a count of 100-nanosecond intervals that have elapsed since 0:00 GMT on 1 January 1601.
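A sketch of the conversion in Python (the 1601-to-1970 offset is derived in the code rather than assumed, and the helper name is ours):

```python
from datetime import datetime, timezone

# Seconds between the FILETIME epoch (1601-01-01) and the Unix epoch (1970-01-01).
_FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)
_UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
EPOCH_DIFFERENCE = int((_UNIX_EPOCH - _FILETIME_EPOCH).total_seconds())  # 11644473600

def filetime_to_unix(filetime):
    """Convert a FILETIME value (100-nanosecond ticks since 1601) to Unix seconds."""
    return filetime // 10_000_000 - EPOCH_DIFFERENCE
```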

[36] Many applications and programming languages provide methods for storing time with an explicit timezone.

[37] There are also a number of time format standards designed to be readable by both humans and computers, such as ISO 8601.
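Both points can be illustrated together in Python (assuming the zoneinfo module and an IANA time zone database are available; the zone chosen is arbitrary):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1095292800
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
local = utc.astimezone(ZoneInfo("Europe/Copenhagen"))

print(utc.isoformat())    # 2004-09-16T00:00:00+00:00  (ISO 8601, UTC)
print(local.isoformat())  # 2004-09-16T02:00:00+02:00  (explicit timezone offset)
```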

Among some groups, round binary numbers are also celebrated, such as 2³⁰ (1,073,741,824 seconds), which occurred at 13:37:04 UTC on Saturday, 10 January 2004.

Vernor Vinge's novel A Deepness in the Sky describes a spacefaring trading civilization thousands of years in the future that still uses the Unix epoch.

The "programmer-archaeologist" responsible for finding and maintaining usable code in mature computer systems first believes that the epoch refers to the time when man first walked on the Moon, but then realizes that it is "the 0-second of one of humankind's first computer operating systems".

Unix time passed 1,000,000,000 seconds on 2001-09-09T01:46:40Z.[1] It was celebrated in Copenhagen, Denmark, at a party held by the Danish UNIX User Group at 03:46:40 local time.