When written to the disk, the data is re-coded using differential Manchester encoding so that a clock signal can be recovered on playback, addressing the timing effects known as "jitter" seen on disk media.
Main memory systems in modern computers store binary information using two different electrical signals, typically voltages.
The letter "A" in ASCII is represented as 01000001 in binary, which might be stored in a typical late-1970s DRAM like the Mostek MK4116 as a series of 0 V and 5 V levels on the individual capacitors making up the memory.
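As a rough illustration (the program below is invented for this article, not code from any historical system), this C sketch prints the bit pattern that those eight cells would hold for the ASCII letter "A":

#include <stdio.h>

int main(void)
{
    unsigned char a = 'A';                 /* ASCII code 0x41 */
    /* Print the eight bits most-significant first, one per storage cell. */
    for (int bit = 7; bit >= 0; bit--)
        putchar(((a >> bit) & 1) ? '1' : '0');
    putchar('\n');                         /* prints 01000001 */
    return 0;
}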
During reading, the disk rotates so that its surface moves rapidly past the read/write head, a small electromagnet.
In particular, disks suffer from an effect known as jitter due to small changes in timing as the media speeds up and slows down during rotation.
Since each bit of data requires two of these minimum transition times, FM encoding stores about half the amount that is theoretically possible on that media.[3]
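For illustration, media that can reliably record 250,000 transition windows per second would therefore deliver only 250,000 / 2 = 125,000 bits of data per second under FM; the 250,000 figure is a nominal single-density floppy rate, used here purely as an example.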
Encoding these transitions requires the system to accept digital data from the host computer and then re-code it into the underlying FM format.
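The re-coding itself is simple enough to sketch in a few lines of C. The function below is an illustrative model only: the byte-at-a-time interface and the '0'/'1' characters standing in for transition windows are assumptions made for clarity, not any controller's actual interface.

#include <stdio.h>

/* Sketch of FM encoding: every data bit occupies two transition
   windows, the first always holding a clock pulse and the second
   holding a pulse only when the data bit is 1. */
static void fm_encode_byte(unsigned char data, char *out /* 17 bytes */)
{
    int pos = 0;
    for (int bit = 7; bit >= 0; bit--) {
        out[pos++] = '1';                             /* clock window: always a pulse */
        out[pos++] = ((data >> bit) & 1) ? '1' : '0'; /* data window */
    }
    out[pos] = '\0';
}

int main(void)
{
    char cells[17];
    fm_encode_byte('A', cells);   /* ASCII 0x41 = 01000001 */
    printf("%s\n", cells);        /* prints 1011101010101011 */
    return 0;
}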
Because the FM system is so simple, it could be implemented in single-chip form using late-1970s semiconductor fabrication techniques.
In the IBM format, this sync mark consists of a series of twelve zero bytes followed by three hexadecimal A1 bytes in front of the header and data areas.[3]
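A C sketch of assembling that sync field follows; the byte counts match the text above, but note that on real media the A1 bytes are also written with a deliberately missing clock pulse so they cannot occur in ordinary data, a flux-level detail this byte-level sketch omits.

#include <stdio.h>
#include <string.h>

#define SYNC_ZEROS 12

int main(void)
{
    unsigned char sync[SYNC_ZEROS + 3];
    memset(sync, 0x00, SYNC_ZEROS);       /* run of zeros lets the loop lock */
    memset(sync + SYNC_ZEROS, 0xA1, 3);   /* three A1 marker bytes */

    for (size_t i = 0; i < sizeof sync; i++)
        printf("%02X ", sync[i]);
    putchar('\n');                        /* prints 00 x12 then A1 A1 A1 */
    return 0;
}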
As each bit of data requires two transition periods in the FM system, it makes use of only half the potential storage capacity of the disk.
Generally, clock recovery takes the form of a phase-locked loop (PLL) or similar system that produces a steady output clock signal from a noisy input.
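As a toy model of what such a loop does (the interval values, the 4-microsecond nominal cell, and the loop gain below are all invented for illustration, not taken from any real data separator), this C sketch nudges its estimate of the bit-cell period toward each observed transition interval, so the recovered clock tracks jitter in the incoming signal:

#include <stdio.h>

int main(void)
{
    /* Observed spacings between flux transitions, in microseconds;
       the nominal cell is 4 us but jitter makes each one slightly off. */
    double intervals[] = { 4.1, 3.9, 4.2, 3.8, 4.05, 3.95 };
    double period = 4.0;   /* current estimate of the cell period */
    double gain = 0.1;     /* loop gain: how quickly we chase the input */

    for (int i = 0; i < 6; i++) {
        double error = intervals[i] - period;  /* timing error vs. estimate */
        period += gain * error;                /* nudge estimate toward input */
        printf("interval %.2f -> recovered period %.3f\n",
               intervals[i], period);
    }
    return 0;
}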
MFM ICs were available, and were used on more expensive platforms like the IBM PC, but using them required clock recovery to be performed by external hardware, the "data separator".