Gigabyte

The gigabyte is a multiple of the unit byte for digital information. The prefix giga- means 10⁹ in the International System of Units (SI); therefore, one gigabyte is one billion bytes, and the unit symbol is GB. This definition is used in all contexts of science (especially data science), engineering, business, and many areas of computing, including storage capacities of hard drives, solid-state drives, and tapes, as well as data transmission speeds.

However, the term is also used in some fields of computer science and information technology to denote 1073741824 (1024³ or 2³⁰) bytes, particularly for sizes of RAM.
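
To make the difference between the two values concrete, here is a minimal Python sketch of the arithmetic; the constant names are illustrative only, not standard identifiers:

```python
# Decimal (SI) gigabyte: the prefix giga- means 10**9.
GB_DECIMAL = 10**9             # 1,000,000,000 bytes

# Binary "gigabyte" (formally the gibibyte, GiB): 2**30 bytes.
GB_BINARY = 2**30              # 1,073,741,824 bytes
assert GB_BINARY == 1024**3    # 1024**3 and 2**30 are the same number

# The binary unit is about 7.4% larger than the decimal one.
print(f"difference: {(GB_BINARY - GB_DECIMAL) / GB_DECIMAL:.1%}")  # -> 7.4%
```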

The latter binary usage originated as compromise technical jargon for byte multiples that needed to be expressed as powers of 2 but lacked a convenient name. As 1024 (2¹⁰) approximates 1000 (10³), roughly corresponding to SI multiples, it was used for binary multiples as well.

In 1998 the International Electrotechnical Commission (IEC) published standards for binary prefixes, requiring that the gigabyte strictly denote 1000³ bytes and the gibibyte denote 1024³ bytes. By the end of 2007, the IEC standard had been adopted by the IEEE, EU, and NIST, and in 2009 it was incorporated in the International System of Quantities.

Nevertheless, the term gigabyte continues to be widely used with the following two different meanings:

Base 10 (decimal): 1 GB = 1,000,000,000 bytes (10⁹ B). Based on powers of 10, this definition uses the prefix giga- as defined in the International System of Units (SI).

Base 2 (binary): 1 GB = 1,073,741,824 bytes (1024³ or 2³⁰ B). This usage is widely promulgated by some operating systems, such as Microsoft Windows in reference to computer memory (e.g., RAM). This definition is synonymous with the unambiguous unit gibibyte.

Some operating systems, such as Mac OS X,[8] Ubuntu,[9] and Debian,[10] express hard drive capacity or file size using decimal multipliers, while others, such as Microsoft Windows, report size using binary multipliers.
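
This is why the same drive shows different numbers on different systems. The following Python sketch renders one byte count both ways; format_size is a hypothetical helper written for this illustration, not an API of any actual operating system:

```python
def format_size(n_bytes: int, binary: bool = False) -> str:
    """Hypothetical helper: render a byte count with decimal (kB/MB/GB)
    or binary (KiB/MiB/GiB) multipliers."""
    base = 1024 if binary else 1000
    units = ["B", "KiB", "MiB", "GiB", "TiB"] if binary else ["B", "kB", "MB", "GB", "TB"]
    size = float(n_bytes)
    for unit in units:
        if size < base or unit == units[-1]:
            return f"{size:.2f} {unit}"
        size /= base

advertised = 500 * 10**9  # a "500 GB" drive: 500 billion bytes
print(format_size(advertised))               # 500.00 GB  (decimal, as on the label)
print(format_size(advertised, binary=True))  # 465.66 GiB (shown by a binary-reporting OS)
```

The discrepancy in the second line is the source of the confusion: an advertised 500 GB drive appears as roughly 465 "GB" on a system that divides by 1024³ but keeps the GB label.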

For RAM, the JEDEC memory standards use the IEEE 100 nomenclature, which quotes the gigabyte as 1073741824 bytes (2³⁰ bytes).

The difference between the advertised decimal capacity and the binary capacity reported by some operating systems led to class-action lawsuits against hard drive manufacturers. Specifically, the courts held that "the U.S. Congress has deemed the decimal definition of gigabyte to be the 'preferred' one for the purposes of 'U.S. trade and commerce'".[12][14]

Because of their physical design, the capacity of modern computer random-access memory devices, such as DIMM modules, is always a multiple of a power of 1024.
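
As a concrete illustration, typical module capacities are exact powers of two when expressed in bytes. The sizes below are common examples rather than an exhaustive list:

```python
# Typical DIMM capacities, in gibibytes; each is an exact power of two in bytes.
for gib in (4, 8, 16, 32):
    n_bytes = gib * 1024**3
    # A positive integer is a power of two exactly when it has a single set bit.
    assert n_bytes & (n_bytes - 1) == 0
    print(f"{gib} GiB module = {n_bytes} bytes = 2**{n_bytes.bit_length() - 1}")
```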

[Image caption: This 2.5-inch hard drive has a capacity of 500 gigabytes (GB) of data, i.e., 500 billion bytes.]