Accuracy, the ability of a methodology to give results within acceptable limits of known values, is a fundamental requirement. Calibration is the most common approach to achieving it.
Accuracy requires comparison to a validated reference material. The level of confidence depends on traceability: an unbroken chain connecting the samples of interest all the way back to recognized metrological standards. Establishing this chain can be a complex task, one that can raise doubts for an auditor. Different situations call for different types of standards, and knowing how to assess their validity is key.
Calibration relies fundamentally on linearity of response, which makes proof of linearity important. This involves more than obtaining a linear regression slope or correlation coefficient. Even once proven, calibrations can drift with various factors. Monitoring calibration behavior, and using that information to maintain or even improve a methodology, can be very useful.
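As a minimal sketch of why a correlation coefficient alone is not proof of linearity, the fit below uses hypothetical calibration data (the concentrations and responses are illustrative, not from the course) and inspects the residuals, where curvature shows up even when r-squared looks excellent:

```python
import numpy as np

# Hypothetical calibration data: standard concentrations vs. instrument response.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])
resp = np.array([0.02, 1.05, 2.08, 5.10, 10.3, 19.6])

# Ordinary least-squares fit: response = slope * conc + intercept.
slope, intercept = np.polyfit(conc, resp, 1)

# r^2 can look excellent even when the response curves at the high end,
# so also examine the residuals for systematic patterns.
pred = slope * conc + intercept
residuals = resp - pred
r = np.corrcoef(conc, resp)[0, 1]
print(f"slope={slope:.4f} intercept={intercept:.4f} r^2={r**2:.5f}")
print("residuals:", np.round(residuals, 3))
```

A residual plot that bows in one direction at the top of the range is a common sign that the working range should be narrowed or a nonlinear model considered, even when r-squared exceeds 0.999.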
- How accuracy is determined
- What are the various types of standards, and what are their strengths and weaknesses?
- The importance of traceability
- Calibration curves: what are the important criteria? Linearity versus nonlinearity, slope, and intercept, and what they mean
- Matrix effects and how to deal with them
- Using calibration data to monitor performance
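On the last topic above, one simple way to use calibration data to monitor performance is a control chart on the calibration slope. The sketch below assumes hypothetical slope values and classic Shewhart three-sigma limits; the course's own monitoring criteria may differ:

```python
import numpy as np

# Hypothetical record of calibration slopes from successive runs.
slopes = np.array([1.002, 0.998, 1.005, 0.997, 1.001, 1.003, 0.999, 1.062])

# Shewhart-style control limits computed from the historical runs
# (all but the latest), flagging any slope outside mean +/- 3 sigma.
history = slopes[:-1]
mean, sigma = history.mean(), history.std(ddof=1)
lower, upper = mean - 3 * sigma, mean + 3 * sigma
out_of_control = (slopes < lower) | (slopes > upper)
print(f"limits: [{lower:.4f}, {upper:.4f}]")
print("flagged runs:", np.where(out_of_control)[0])
```

Here the final run's slope falls outside the limits, which would prompt investigation (for example, degraded standards or instrument drift) before the calibration is used.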
- Lab Chemists
- Lab Managers
- Lab Technicians
- Lab Analysts
- Industries using compliance methodologies (Biotech, Pharma)
- Companies subject to environmental compliance or EPA regulations