Fragmentation (computing)

In many cases, fragmentation leads to storage space being "wasted", and programs tend to run inefficiently due to the resulting shortage of usable memory.

Eventually, it may become impossible for the program to obtain large contiguous chunks of memory.

In internal fragmentation, the unusable memory, known as slack space, is contained within an allocated region.
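A minimal sketch of how slack space arises, assuming a hypothetical allocator that rounds every request up to a fixed block granularity (the block size and functions here are illustrative, not from any real allocator):

```python
# Illustrative sketch: an allocator that rounds each request up to a
# whole number of fixed-size blocks wastes the remainder inside the
# allocated region -- the slack space described above.

BLOCK_SIZE = 64  # hypothetical allocation granularity in bytes

def allocated_size(requested: int) -> int:
    """Round a request up to a whole number of blocks."""
    blocks = -(-requested // BLOCK_SIZE)  # ceiling division
    return blocks * BLOCK_SIZE

def slack(requested: int) -> int:
    """Bytes wasted inside the allocation (internal fragmentation)."""
    return allocated_size(requested) - requested

print(allocated_size(100))  # 128
print(slack(100))           # 28
```

A 100-byte request occupies 128 bytes, so 28 bytes of slack space are unusable until the block is freed.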

External fragmentation is a weakness of certain storage allocation algorithms that fail to order memory used by programs efficiently.

The result is that, although free storage is available, it is effectively unusable because it is divided into pieces that are too small individually to satisfy the demands of the application.
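The situation can be illustrated with a toy free-list model (the hole sizes below are invented for illustration): the total free space comfortably exceeds the request, yet no single hole can satisfy it.

```python
# Toy model of external fragmentation: plenty of total free space,
# but no single hole is large enough for the request.

free_holes = [8, 12, 6, 10, 9]  # sizes of free gaps between allocations

request = 32
total_free = sum(free_holes)    # 45 units free in total
largest_hole = max(free_holes)  # but the largest hole is only 12

can_allocate = any(h >= request for h in free_holes)
print(total_free, largest_hole, can_allocate)  # 45 12 False
```

Forty-five units are nominally free, but a 32-unit request fails because the free space is split into pieces no larger than 12.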

Data fragmentation is typically the result of attempting to insert a large object into storage that has already suffered external fragmentation.

In a file system, free space is often contiguous at first; however, as files are added, removed, and changed in size, the free space becomes externally fragmented, leaving only small holes in which to place new data.

There are a variety of algorithms for selecting which of those potential holes to place the file into; each of them is a heuristic approximate solution to the bin packing problem.
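Two classic hole-selection heuristics are first fit (take the first hole that is large enough) and best fit (take the smallest hole that still fits). The sketch below shows both over an assumed list of hole sizes; it is an illustration of the general heuristics, not the selection logic of any particular file system:

```python
# First-fit and best-fit hole selection, two heuristic approximations
# to the bin packing problem mentioned above.

from typing import Optional

def first_fit(holes: list[int], size: int) -> Optional[int]:
    """Index of the first hole large enough, or None."""
    for i, h in enumerate(holes):
        if h >= size:
            return i
    return None

def best_fit(holes: list[int], size: int) -> Optional[int]:
    """Index of the smallest hole that still fits, or None."""
    candidates = [(h, i) for i, h in enumerate(holes) if h >= size]
    return min(candidates)[1] if candidates else None

holes = [20, 7, 35, 12]
print(first_fit(holes, 10))  # 0  (the hole of size 20 comes first)
print(best_fit(holes, 10))   # 3  (the hole of size 12 wastes the least)
```

Best fit leaves the smallest leftover sliver per allocation, but over time those slivers can themselves become unusably small holes, which is why neither heuristic dominates in practice.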

Most defragmenting utilities also attempt to reduce or eliminate free space fragmentation.

External fragmentation tends to be less of a problem in file systems than in primary memory (RAM) storage systems, because programs usually require their RAM storage requests to be fulfilled with contiguous blocks, but file systems typically are designed to be able to use any collection of available blocks (fragments) to assemble a file which logically appears contiguous.

Most basically, fragmentation increases the work required to allocate and access a resource.

For example, on a hard drive or tape drive, sequential data reads are very fast, but seeking to a different address is slow, so reading or writing a fragmented file requires numerous seeks and is thus much slower, in addition to causing greater wear on the device.
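A back-of-envelope model makes the cost concrete. The seek time and throughput figures below are assumed round numbers for a spinning disk, not measurements; the point is only that a fragmented read pays one seek per fragment:

```python
# Assumed figures for illustration only: ~10 ms per seek and
# ~100 MB/s sequential throughput, typical orders of magnitude
# for a consumer hard drive.

SEEK_MS = 10.0            # assumed average seek time, milliseconds
TRANSFER_MB_PER_S = 100   # assumed sequential throughput

def read_time_ms(size_mb: float, fragments: int) -> float:
    """Estimated read time: one seek per fragment plus transfer time."""
    transfer = size_mb * 1000.0 / TRANSFER_MB_PER_S
    return fragments * SEEK_MS + transfer

print(read_time_ms(10, 1))    # 110.0  (contiguous: one seek)
print(read_time_ms(10, 200))  # 2100.0 (fragmented: 200 seeks dominate)
```

Under these assumptions, splitting a 10 MB file into 200 fragments makes the read roughly twenty times slower, with seek time dwarfing the transfer time.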

Further, if a resource is not fragmented, allocation requests can simply be satisfied by returning a single block from the start of the free area.

However, if it is fragmented, the request requires either searching for a large enough free block, which may take a long time, or fulfilling it with several smaller blocks (where possible), which leaves the allocation itself fragmented and requires additional overhead to manage the several pieces.
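The two paths can be sketched as follows. This is a simplified model with an invented `allocate` helper, not a real allocator: the fast path hands back one contiguous piece, while the slow path stitches the request together from several blocks that must then be tracked separately.

```python
# Sketch of the two allocation paths described above: a single-block
# fast path, and a multi-piece slow path when no single block fits.

def allocate(free_blocks: list[int], size: int) -> list[int]:
    """Return the sizes of the pieces used to satisfy the request;
    raise MemoryError if total free space is insufficient."""
    # Fast path: one contiguous free block suffices.
    for i, b in enumerate(free_blocks):
        if b >= size:
            free_blocks[i] -= size
            return [size]
    # Slow path: assemble the request from several pieces,
    # largest-first; each piece adds bookkeeping overhead.
    pieces, remaining = [], size
    for i in sorted(range(len(free_blocks)), key=lambda i: -free_blocks[i]):
        if remaining == 0:
            break
        take = min(free_blocks[i], remaining)
        if take:
            free_blocks[i] -= take
            pieces.append(take)
            remaining -= take
    if remaining:
        raise MemoryError("insufficient free space")
    return pieces

blocks = [8, 4, 6]
print(allocate(blocks, 12))  # [8, 4] -- two pieces to manage, not one
```

A 12-unit request against blocks of 8, 4, and 6 cannot use the fast path, so the allocation itself ends up fragmented across two pieces.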

Thus cache sizing in system design must include margin to account for fragmentation.

In real-time computing applications, fragmentation levels can reach as high as 99% and may lead to system crashes or other instabilities.

This type of system crash can be difficult to avoid, as it is impossible to anticipate the critical rise in levels of memory fragmentation.

While fragmentation is best known as a problem in memory allocation, analogous phenomena occur for other resources, notably processors.