Point (typography)

In digital type, letters of a computer font are designed around an imaginary space called an em square.[1][2]

The point was first established by the Milanese typographer Francesco Torniella da Novara (c. 1490 – 1589) in his 1517 alphabet, L'Alfabeto.

The construction of this alphabet was the first based on a logical unit of measurement, the "Punto," which corresponds to one ninth of the height of the letters, or the thickness of the principal stroke.

During the metrication of France amid its revolution, a 1799 law declared the meter to be exactly 443.296 French lines long.

It was still a standard in Belgium, in parts of Austria, and in Northern France at the beginning of the 20th century.

Didot did not change the subdivisions (1 inch = 12 lines = 72 points), but defined the point strictly in terms of the royal foot, a legal length measure in France: the Didot point is exactly 1⁄864 of a French foot, or 1⁄72 of a French inch, that is (by the 1799 law) 15625⁄41559 mm or about 0.375972 mm.
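This conversion can be checked with exact rational arithmetic. The sketch below (an illustration, not from the source) derives the fraction from the two facts above: the 1799 law fixed 1 metre = 443.296 French lines, and 1 French line = 6 Didot points (72 points to the French inch, 12 lines to the inch):

```python
from fractions import Fraction

# 1799 legal definition: 1 metre = 443.296 French lines.
LINES_PER_METRE = Fraction("443.296")
# 72 points per French inch / 12 lines per inch = 6 points per line.
POINTS_PER_LINE = 6

# Millimetres per Didot point, as an exact fraction.
didot_point_mm = Fraction(1000) / LINES_PER_METRE / POINTS_PER_LINE

print(didot_point_mm)         # 15625/41559
print(float(didot_point_mm))  # ≈ 0.375972 mm
```

The fraction reduces to exactly 15625⁄41559 mm, matching the value quoted above.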

The basic unit of measurement in American typography was the pica,[12][27][28] usually approximated as one sixth of an inch, but its exact size was not standardized, and various type foundries used their own.[12]

During and after the American Revolutionary War, Benjamin Franklin served as commissioner (ambassador) of the United States to France from December 1776 to 1785.

Franklin then imported French typefounding equipment to Philadelphia to help his grandson Benjamin Franklin Bache set up a type foundry.[30][31]

After Franklin's death, the matrices and the Fournier mould were acquired by Binny & Ronaldson, the first permanent type foundry in America.

As a consequence, the tables of measurements in German, Dutch, French, Polish and other continental European manuals for the composition caster and the super-caster differ in quite a few details.

The Monotype wedges used on the European continent are marked with an extra 'E' after the set size: for instance, 5-12E, 1331-15E, etc.

This specification appeared in the Xerox Interpress language, used by Xerox for its early digital printers, and was further developed by John Warnock and Charles Geschke when they created Adobe PostScript.[citation needed]

Since the advent of high-density "Retina" screens, with a much higher resolution than the original 72 dots per inch, Apple's programming environment Xcode sizes GUI elements in points that are scaled automatically to a whole number of physical pixels, accommodating screen size, pixel density and typical viewing distance.

In lead typecasting, most font sizes commonly used in printing have conventional names that differ by country, language and the type of points used.

Desktop publishing software and word processors intended for office and personal use often offer a list of suggested font sizes in their user interface, but the sizes are not named, and an arbitrary value can usually be entered manually.

While most software nowadays defaults to DTP points, many allow specifying font size in other units of measure (e.g., inches, millimeters, pixels), especially code-based systems such as TeX and CSS.
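The relationships between these units are fixed by definition: the DTP point is 1⁄72 inch, CSS defines the pixel unit as 1⁄96 inch, and an inch is exactly 25.4 mm. A minimal sketch of the resulting conversions (the function names are this example's own):

```python
from fractions import Fraction

# Definitional relationships: 1 inch = 72 DTP points = 96 CSS px = 25.4 mm.
MM_PER_INCH = Fraction(254, 10)

def pt_to_px(pt):
    # 96 px per inch / 72 pt per inch = 4/3 px per pt.
    return pt * Fraction(96, 72)

def pt_to_mm(pt):
    return pt * MM_PER_INCH / 72

print(pt_to_px(12))         # a 12 pt font is 16 CSS px
print(float(pt_to_mm(1)))   # 1 DTP point ≈ 0.3528 mm
```

So a nominal 12 pt size in a CSS stylesheet renders identically whether written as `12pt` or `16px`.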

The Fournier scale: two inches in total, divided into four half-inches; the medium intervals are one line (1⁄12 inch) and the smallest intervals are 1⁄36 inch; no interval for the point is given, though.