The Banker's algorithm is a resource allocation and deadlock avoidance algorithm developed by Edsger Dijkstra. It tests for safety by simulating the allocation of the predetermined maximum possible amounts of all resources, and then makes an "s-state" check to test for possible deadlock conditions for all other pending activities, before deciding whether the allocation should be allowed to continue.
Resources tracked in real systems include memory, semaphores, and interface access.
Requiring only that each process eventually return the resources it holds is a reasonable assumption in most cases, since the system is not particularly concerned with how long each process runs (at least not from a deadlock avoidance perspective).
For an example of an unsafe state, consider what would happen if process 2 were holding 1 unit of resource B at the beginning.
The algorithm is fairly straightforward once the distinction between safe and unsafe states is understood.
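The safety check can be sketched as follows. This is a minimal illustration, not a production implementation; the particular matrices at the bottom (3 processes, 2 resource types A and B) are hypothetical numbers chosen to show one safe state and one unsafe variant, not figures taken from any specific example.

```python
from typing import List

def is_safe(available: List[int], allocation: List[List[int]],
            maximum: List[List[int]]) -> bool:
    """Return True if some ordering exists in which every process can
    acquire its declared maximum and then terminate, releasing all of
    the resources it holds."""
    n, m = len(allocation), len(available)
    # Need[i][j] = Max[i][j] - Allocation[i][j]
    need = [[maximum[i][j] - allocation[i][j] for j in range(m)]
            for i in range(n)]
    work = list(available)      # resources currently free
    finished = [False] * n
    progress = True
    while progress:
        progress = False
        for i in range(n):
            if not finished[i] and all(need[i][j] <= work[j]
                                       for j in range(m)):
                # Process i can run to completion; it then returns
                # everything it was holding to the free pool.
                for j in range(m):
                    work[j] += allocation[i][j]
                finished[i] = True
                progress = True
    return all(finished)

# Hypothetical state: 3 processes, 2 resource types (A, B).
maximum    = [[3, 1], [6, 2], [3, 1]]
allocation = [[2, 0], [1, 1], [2, 0]]
print(is_safe([1, 1], allocation, maximum))                 # True (safe)
# If process 2 held one more unit of B, only [1, 0] would remain free,
# and no process could reach its maximum:
print(is_safe([1, 0], [[2, 0], [1, 2], [2, 0]], maximum))   # False (unsafe)
```

In the safe state, process 1 can finish with the free resources, then process 3, and finally process 2 once the others have returned what they held. In the unsafe variant, every process needs at least one unit of A or B that is no longer free, so no completion order exists.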
In most systems, this information (the maximum amount of each resource that a process could possibly request) is unavailable, making it impossible to implement the Banker's algorithm.