In computers, a serial decimal numeric representation is one in which ten bits are reserved for each digit, with exactly one bit turned on to indicate which of the ten possible digit values is intended.
ENIAC and CALDIC used this representation.[1]
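The ten-bits-per-digit scheme described above is what is now commonly called a one-hot (or 1-of-10) code. A minimal sketch in Python, assuming nothing about any particular machine's wiring (the function names are illustrative):

```python
def encode_digit(d: int) -> list[int]:
    """Encode a decimal digit as ten bits, with only the bit
    at position d turned on (a 1-of-10 code)."""
    if not 0 <= d <= 9:
        raise ValueError("digit must be 0-9")
    return [1 if i == d else 0 for i in range(10)]


def decode_digit(bits: list[int]) -> int:
    """Recover the digit: exactly one of the ten bits may be set."""
    if len(bits) != 10 or sum(bits) != 1:
        raise ValueError("exactly one of ten bits must be on")
    return bits.index(1)
```

For example, `encode_digit(3)` yields `[0, 0, 0, 1, 0, 0, 0, 0, 0, 0]`, and `decode_digit` maps that list back to `3`. Storing a multi-digit number this way takes ten bits per digit, which is why the representation is far less compact than binary or binary-coded decimal.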