Nucleic acid quantitation

In molecular biology, quantitation of nucleic acids is commonly performed to determine the average concentrations of DNA or RNA present in a mixture, as well as their purity.

To date, there are two main approaches used by scientists to quantitate, or establish the concentration of, nucleic acids (such as DNA or RNA) in a solution.

A spectrophotometer can determine the average concentration of the nucleic acids present in a mixture, as well as their purity.[1]

Spectrophotometric analysis is based on the principle that nucleic acids absorb ultraviolet light in a specific pattern, with an absorbance maximum near 260 nm.

The optical density (OD)[3] is generated from the equation:[4]

OD = log10(intensity of incident light / intensity of transmitted light)

In practical terms, a sample that contains no DNA or RNA should not absorb any of the ultraviolet light and therefore produces an OD of 0:

OD = log10(100/100) = 0

When using spectrophotometric analysis to determine the concentration of DNA or RNA, the Beer–Lambert law is used to determine unknown concentrations without the need for standard curves.
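The two calculations above can be sketched in a few lines of Python. This is a minimal illustration, not a laboratory tool; the function names and the extinction coefficient value are assumptions introduced here for the example.

```python
import math

def optical_density(incident_intensity, transmitted_intensity):
    """OD = log10(incident intensity / transmitted intensity)."""
    return math.log10(incident_intensity / transmitted_intensity)

def concentration_beer_lambert(absorbance, extinction_coefficient, path_length_cm=1.0):
    """Beer-Lambert law, A = epsilon * c * l, solved for the concentration c."""
    return absorbance / (extinction_coefficient * path_length_cm)

# A sample that transmits all of the incident light absorbs nothing:
print(optical_density(100, 100))  # 0.0

# Assuming an extinction coefficient of 0.020 (ug/mL)^-1 cm^-1 for dsDNA
# (equivalent to an A260 of 1.0 = 50 ug/mL), an absorbance of 0.5 gives:
print(concentration_beer_lambert(0.5, 0.020))  # 25.0 ug/mL
```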

In such contexts, the same standard conversion factors apply: for a 1 cm path length, an A260 reading of 1.0 corresponds to approximately 50 µg/mL of double-stranded DNA, 40 µg/mL of RNA, or 33 µg/mL of single-stranded DNA.

It is common for nucleic acid samples to be contaminated with other molecules (e.g. proteins, organic compounds, or others).

A secondary benefit of using spectrophotometric analysis for nucleic acid quantitation is the ability to assess sample purity using the 260 nm:280 nm absorbance ratio.

Because nucleic acids absorb strongly at 260 nm, even a small amount of nucleic acid contamination noticeably shifts the 260:280 ratio of a protein solution.[2][7] The reverse, however, is not true: it takes a relatively large amount of protein contamination to significantly affect the 260:280 ratio of a nucleic acid solution.

While this means that protein contamination cannot be reliably assessed with the 260:280 ratio, it also means that such contamination contributes little error to the estimate of DNA quantity.
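Computing the purity ratio itself is trivial; the point of the discussion above is how to interpret it. A minimal sketch, with hypothetical absorbance readings chosen only for illustration:

```python
def purity_ratio(a260, a280):
    """260:280 absorbance ratio. By convention, values near ~1.8 (DNA)
    or ~2.0 (RNA) are read as indicating a 'pure' preparation."""
    return a260 / a280

# Hypothetical readings for a DNA sample:
print(round(purity_ratio(1.0, 0.55), 2))  # 1.82
```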

To date, there is no fluorescence-based method for determining the protein contamination of a DNA sample that is analogous to the 260 nm:280 nm spectrophotometric ratio.[10]

Figure: Optical density of a ribosome sample, with the important wavelengths of 260 nm and 280 nm labeled.