In garbage collection algorithms, reference counts may be used to deallocate objects that are no longer needed.
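In its simplest form (a minimal hypothetical sketch in C++, not taken from any particular system discussed below), each object carries a count of the references to it: creating a reference increments the count, dropping one decrements it, and the object is deallocated as soon as the count reaches zero.

```cpp
#include <cstdio>

// Hypothetical object with an intrusive reference count.
struct RcObject {
    int refcount = 1;   // the creator holds the first reference
    int payload  = 0;
};

// Taking an additional reference increments the count.
void acquire(RcObject* obj) { ++obj->refcount; }

// Dropping a reference decrements the count; at zero the object
// is no longer reachable and can be deallocated immediately.
void release(RcObject* obj) {
    if (--obj->refcount == 0) {
        std::puts("freeing object");
        delete obj;
    }
}

int main() {
    RcObject* a = new RcObject{};  // count = 1
    acquire(a);                    // a second reference, count = 2
    release(a);                    // count = 1, object survives
    release(a);                    // count = 0, object is freed
}
```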
Weighted reference counts are a good solution for garbage collecting a distributed system.
Tracing garbage collection cycles are triggered too often if the set of live objects fills most of the available memory;[citation needed] tracing collection also requires extra space to be efficient.[citation needed] Reference counting performance does not deteriorate as the total amount of free space decreases.
For example, systems that depend heavily on immutable objects, such as many functional programming languages, can suffer an efficiency penalty due to frequent copies.
Reference counting in naive form has three main disadvantages over tracing garbage collection, each of which requires additional mechanisms to ameliorate. In addition to these, if the memory is allocated from a free list, reference counting suffers from poor locality.
It is also critical to perform the deferred increment before the object's count drops to zero, to avoid a premature free.
Therefore, update coalescing solves the third problem of naive reference counting (i.e., a costly overhead in a concurrent setting).
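The ordering constraint can be seen even in a plain, non-deferred write barrier. The sketch below (a hypothetical intrusively counted Object type, not the coalescing algorithm itself) overwrites a reference slot; incrementing the new target before decrementing the old one avoids a premature free when the two happen to be the same object.

```cpp
// Hypothetical intrusively counted object; destroy() stands in for
// whatever deallocation the runtime actually performs.
struct Object { int refcount = 1; };
void destroy(Object* obj) { delete obj; }

// Overwrite the reference stored in *slot with new_target.
// If the old and new targets are the same object and the decrement ran
// first, the count could momentarily reach zero and the object would be
// freed prematurely; incrementing first rules that out.
void assign(Object** slot, Object* new_target) {
    Object* old_target = *slot;
    if (new_target != nullptr)
        ++new_target->refcount;                  // increment the new referent first
    if (old_target != nullptr && --old_target->refcount == 0)
        destroy(old_target);                     // then decrement, freeing at zero
    *slot = new_target;
}

int main() {
    Object* a = new Object{};    // count = 1 (the creator's reference)
    Object* slot = nullptr;
    assign(&slot, a);            // count = 2
    assign(&slot, a);            // same object: increment-then-decrement keeps it alive
    assign(&slot, nullptr);      // count = 1
    --a->refcount;               // drop the creator's own reference
    if (a->refcount == 0) destroy(a);
}
```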
Levanoni and Petrank presented an enhanced algorithm that may run concurrently with multithreaded applications using only fine-grained synchronization. This algorithm achieves throughput comparable to that of the fastest generational copying collectors while retaining the low, bounded pause times of reference counting.
Perhaps the most obvious way to handle reference cycles is to design the system to avoid creating them.
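A common design of this kind makes one direction of a would-be cycle a non-owning (weak) reference that does not contribute to the count. A minimal C++ sketch using std::weak_ptr, with a hypothetical parent/child node type:

```cpp
#include <memory>
#include <vector>

// Hypothetical tree node: children are held through owning (counted)
// references, while the back-pointer to the parent is weak, i.e. it
// does not keep the parent alive and cannot form an owning cycle.
struct Node {
    std::vector<std::shared_ptr<Node>> children;  // owning edges
    std::weak_ptr<Node> parent;                   // non-owning back edge
};

int main() {
    auto root  = std::make_shared<Node>();
    auto child = std::make_shared<Node>();
    root->children.push_back(child);  // child's count becomes 2
    child->parent = root;             // root's count stays 1
    // When root and child go out of scope, both nodes are destroyed;
    // with a strong (owning) parent pointer they would keep each other
    // alive forever under pure reference counting.
}
```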
Computer scientists have also discovered ways to detect and collect reference cycles automatically, without requiring changes in the data structure design.
Bacon describes a cycle-collection algorithm for reference counting with similarities to tracing collectors, including the same theoretical time bounds.
It is based on the observation that a cycle can only be isolated when a reference count is decremented to a nonzero value.[10] An enhanced version of this algorithm by Paz et al.[11] is able to run concurrently with other operations and improve its efficiency by using the update coalescing method of Levanoni and Petrank.
Because of this, removing a single reference can potentially lead to a large number of objects being freed.[14]
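The cascade is easy to reproduce with an ordinary reference-counted linked list (a hypothetical example using std::shared_ptr): dropping the single reference to the head frees every node in the chain before the call returns.

```cpp
#include <memory>

// Hypothetical singly linked list in which each node owns the next
// through a reference-counted pointer.
struct ListNode {
    int value = 0;
    std::shared_ptr<ListNode> next;
};

int main() {
    // Build a chain of 1000 nodes, each referenced only by its predecessor.
    std::shared_ptr<ListNode> head;
    for (int i = 0; i < 1000; ++i) {
        auto node = std::make_shared<ListNode>();
        node->value = i;
        node->next = head;
        head = node;
    }
    // Dropping the single reference to the head cascades: each node's
    // count reaches zero in turn, so all 1000 nodes are freed here,
    // in one potentially long pause.
    head.reset();
}
```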
Microsoft's Component Object Model (COM) and WinRT make pervasive use of reference counting.
In fact, two of the three methods that all COM objects must provide (in the IUnknown interface) increment or decrement the reference count.[citation needed] One primary motivation for reference counting in COM is to enable interoperability across different programming languages and runtime systems.
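The pattern can be sketched as follows (a simplified illustration only, omitting the Windows headers, QueryInterface, and the COM calling conventions): the object keeps its own count, clients call AddRef when they copy an interface pointer and Release when they are finished, and the object destroys itself when the count reaches zero.

```cpp
#include <atomic>

// Simplified stand-in for a COM interface: the real IUnknown also
// declares QueryInterface and uses Windows calling conventions.
struct IRefCounted {
    virtual unsigned long AddRef() = 0;
    virtual unsigned long Release() = 0;
protected:
    virtual ~IRefCounted() = default;     // destroyed only via Release()
};

class Widget : public IRefCounted {
    std::atomic<unsigned long> refs{1};   // the creator holds the first reference
public:
    unsigned long AddRef() override { return ++refs; }
    unsigned long Release() override {
        unsigned long remaining = --refs;
        if (remaining == 0)
            delete this;                  // the object frees itself at zero
        return remaining;
    }
};

int main() {
    IRefCounted* w = new Widget();  // count = 1
    w->AddRef();                    // another client copies the pointer: count = 2
    w->Release();                   // count = 1
    w->Release();                   // count = 0, the Widget deletes itself
}
```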
C++ does not perform reference-counting by default, fulfilling its philosophy of not adding functionality that might incur overheads where the user has not explicitly requested it.
Objects that are dynamically allocated but not intended to be shared can have their lifetime automatically managed using a std::unique_ptr.
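Opt-in reference counting is available through std::shared_ptr, which destroys the managed object when the last shared_ptr owning it goes away, whereas std::unique_ptr maintains no count at all. A brief sketch:

```cpp
#include <memory>

int main() {
    // shared_ptr: its control block holds a reference count, and the
    // object is destroyed when the last owning shared_ptr goes away.
    std::shared_ptr<int> a = std::make_shared<int>(42);
    std::shared_ptr<int> b = a;   // count is now 2
    a.reset();                    // count drops to 1; the int survives
    // ...b is destroyed at the end of main: count reaches 0, the int is freed.

    // unique_ptr: sole ownership, no reference count is maintained;
    // the object is freed exactly when its single owner is destroyed.
    std::unique_ptr<int> u = std::make_unique<int>(7);
}
```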
Apple's Cocoa and Cocoa Touch frameworks manage Objective-C objects with reference counting. Traditionally this was accomplished by the programmer manually sending retain and release messages to objects, but Automatic Reference Counting (ARC), a Clang compiler feature that automatically inserts these messages as needed, was added in iOS 5[15] and Mac OS X 10.7.
Delphi is mostly not a garbage collected language, in that user-defined types must still be manually allocated and deallocated; however, it does provide automatic collection using reference counting for a few built-in types, such as strings, dynamic arrays, and interfaces, for ease of use and to simplify the generic database functionality.
The Vala programming language uses GObject reference counting as its primary garbage collection system, along with copy-heavy string handling.
In Rust, reference counting functionality is provided by the Rc and Arc types, which maintain non-atomic and atomic reference counts respectively. Because shared references cannot mutate the data they hold, Rc often comes bundled with Cell, and Arc with Mutex, in contexts where interior mutability is necessary.
This tiny language is relatively unknown outside the video game industry; however, it is a concrete example of how reference counting can be practical and efficient (especially in realtime environments).
The references are counted at a data structure level, so the problems with very frequent updates discussed above do not arise.