One of the most common applications of spectrophotometry is to determine the concentration of an analyte in a solution. The experimental approach exploits Beer's Law, which predicts a linear relationship between the absorbance of the solution and the concentration of the analyte (assuming all other experimental parameters do not vary).
In practice, a series of standard solutions is prepared. A standard solution is a solution in which the analyte concentration is accurately known. The absorbances of the standard solutions are measured and used to prepare a calibration curve, which in this case is a plot of absorbance vs. concentration. The points on the calibration curve should yield a straight line.
The unknown solution is then analyzed. The absorbance of the unknown solution is used in conjunction with the calibration curve to determine the concentration of the analyte. The data obtained from the standards are used to plot a straight line. The slope and intercept of that line provide a relationship between absorbance and concentration.
The absorbance of the unknown solution, Au, is then used with the slope and intercept to calculate the concentration of the unknown solution, cu.
cu = (Au - intercept) / slope
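The procedure above can be sketched in a few lines of Python. The standard concentrations and absorbances below are hypothetical illustrative values, not data from this exercise; the fit itself is an ordinary least-squares line, as the text describes.

```python
# Hypothetical standards: concentrations (mol/L) and measured absorbances.
concs = [0.10, 0.20, 0.30, 0.40, 0.50]
absorbances = [0.12, 0.23, 0.35, 0.46, 0.57]

def linear_fit(x, y):
    """Least-squares slope and intercept for y = slope*x + intercept."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = linear_fit(concs, absorbances)

# Concentration of the unknown from its measured absorbance:
A_u = 0.30                            # hypothetical unknown absorbance
c_u = (A_u - intercept) / slope       # cu = (Au - intercept) / slope
```

With these example standards the fit gives a slope near 1.13 and an intercept near 0.007, so an unknown with Au = 0.30 corresponds to roughly 0.26 mol/L.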
Run each simulation sufficiently long to detect at least 1000 photons. (Not all photons are shown on the screen.) Because the intensity for the blank is used to calculate all absorbances, it is especially important that the intensity for the blank be known accurately. If possible, wait until at least 4000 photons are detected for the blank.
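The blank's role can be seen from how absorbance is computed. Assuming the simulation estimates intensity from the number of detected photons, the absorbance of each solution follows from Beer's Law as A = log10(I0/I), where I0 is the blank's intensity; a sketch:

```python
import math

def absorbance(n_blank, n_sample):
    """Absorbance from detected photon counts, A = log10(I0 / I).

    n_blank: photons detected for the blank (proportional to I0).
    n_sample: photons detected for the solution (proportional to I).
    """
    return math.log10(n_blank / n_sample)

# Example: 4000 photons detected for the blank, 1000 for a sample.
A = absorbance(4000, 1000)  # log10(4)
```

Because n_blank appears in every absorbance calculation, any statistical error in the blank count propagates into all of the results, which is why the text asks for a larger photon count for the blank.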