[1] A standard solution ideally has a high degree of purity and is stable enough that its concentration remains accurately known after a long shelf life.
[2] Making a standard solution requires great attention to detail to avoid contamination that could compromise the accuracy of its concentration.
The concentrations of standard solutions are normally expressed in units of moles per litre (mol/L, often abbreviated to M for molarity), moles per cubic decimetre (mol/dm³), kilomoles per cubic metre (kmol/m³), grams per millilitre (g/mL), or in terms related to those used in particular titrations (such as titres).
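As a minimal sketch of how these units relate, the following converts a hypothetical 0.100 mol/L sodium chloride standard (the concentration and compound are assumptions chosen for illustration) into the other common units:

```python
# Unit conversions for a hypothetical 0.100 mol/L NaCl standard solution.
molarity_mol_per_L = 0.100          # mol/L (identical to mol/dm3)
molar_mass_g_per_mol = 58.44        # g/mol for NaCl

# 1 mol/L = 1 kmol/m3, since 1 m3 = 1000 L and 1 kmol = 1000 mol.
conc_kmol_per_m3 = molarity_mol_per_L

# Mass concentration: mol/L * g/mol = g/L, then divide by 1000 mL/L to get g/mL.
conc_g_per_mL = molarity_mol_per_L * molar_mass_g_per_mol / 1000.0

print(f"{conc_kmol_per_m3} kmol/m3, {conc_g_per_mL:.6f} g/mL")
```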
The compound must not be hygroscopic, so that its weighed mass accurately represents the number of moles present.
An example of a secondary standard is sodium hydroxide, a hygroscopic compound that is highly reactive with its surroundings.
Standard solutions are commonly used to determine the concentration of an analyte species via a calibration curve. Multiple samples with unknown concentrations can then be analyzed using this calibration curve, which makes it a useful tool.
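As a rough sketch of how such a calibration curve might be used in practice (the standard concentrations and instrument responses below are illustrative values, not measured data):

```python
import numpy as np

# Illustrative calibration standards: known concentrations (mol/L) and the
# corresponding instrument responses (e.g., absorbance) -- assumed values.
standard_conc = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
standard_resp = np.array([0.002, 0.101, 0.198, 0.305, 0.402])

# Fit a straight-line calibration: response = slope * concentration + intercept.
slope, intercept = np.polyfit(standard_conc, standard_resp, 1)

# Invert the calibration to estimate the concentrations of unknown samples.
unknown_resp = np.array([0.150, 0.350])
unknown_conc = (unknown_resp - intercept) / slope

print(f"slope={slope:.3f}, intercept={intercept:.4f}")
print("estimated concentrations (mol/L):", np.round(unknown_conc, 4))
```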
[11] Internal standards are used in GC/MS and LC/MS to control for variability introduced by injection, sample preparation, and matrix effects.
[12] A common type of internal standard is an isotopically labeled analogue of the analyte, which incorporates one or more atoms of 2H, 13C, 15N and 18O into its structure.
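To make the internal-standard idea concrete, here is a minimal sketch of quantification against an isotopically labeled internal standard; the peak areas, spike concentration, and single-point response factor are all hypothetical:

```python
# Hypothetical LC/MS peak areas for an analyte and its 13C-labeled internal
# standard, spiked at a known concentration into every sample and calibrator.
internal_std_conc = 0.50            # ug/mL of labeled standard added to each sample

# Single-point calibrator: known analyte concentration and measured peak areas.
cal_analyte_conc = 1.00             # ug/mL
cal_analyte_area = 20500.0
cal_istd_area = 10200.0

# Response factor relates the area ratio to the concentration ratio.
response_factor = (cal_analyte_area / cal_istd_area) / (cal_analyte_conc / internal_std_conc)

# Unknown sample: the area ratio corrects for injection and matrix variability.
sample_analyte_area = 14800.0
sample_istd_area = 9800.0
area_ratio = sample_analyte_area / sample_istd_area

sample_conc = (area_ratio / response_factor) * internal_std_conc
print(f"Estimated analyte concentration: {sample_conc:.3f} ug/mL")
```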
A table summarizing a method for creating these solutions is shown below.

[Table: calibration standards prepared in 25 mL volumetric flasks]

Here, a stock solution of glutamine is added in increasing increments with a high-accuracy instrument and diluted to the same volume in volumetric flasks.
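The following is a minimal sketch of how the stock volumes for such a dilution series might be computed; the 25 mL final volume and glutamine stock follow the description above, but the stock and target concentrations are assumptions chosen for illustration:

```python
# Dilution series for glutamine calibration standards in 25 mL volumetric flasks.
# Stock and target concentrations are assumed values for illustration.
stock_conc_mM = 10.0                         # mM glutamine stock
final_volume_mL = 25.0                       # volume of each volumetric flask
target_concs_mM = [0.2, 0.4, 0.6, 0.8, 1.0]  # increasing increments

# C1 * V1 = C2 * V2  ->  V1 = C2 * V2 / C1
for target in target_concs_mM:
    stock_volume_mL = target * final_volume_mL / stock_conc_mM
    print(f"{target:.1f} mM standard: pipette {stock_volume_mL:.2f} mL of stock, "
          f"then dilute to {final_volume_mL:.0f} mL")
```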