Proteomics is the study of proteins on a large scale, encompassing the many interests scientists and physicians have in their expression and physical properties. Proteomics continues to be a rapidly expanding field, with a wealth of reports regularly appearing on both technology enhancements and scientific studies that use these new tools. This review focuses primarily on the quantitative aspect of protein expression and on the associated computational machinery for making large-scale identifications of proteins and their post-translational modifications. The primary emphasis is on the combination of liquid chromatography-mass spectrometry (LC-MS) methods and associated tandem mass spectrometry (LC-MS/MS). Tandem mass spectrometry, or MS/MS, involves a second analysis within the instrument after a molecular dissociative event in order to obtain structural information, including but not limited to sequence information. This review further focuses primarily on the study of in vitro digested proteins, an approach known as bottom-up or shotgun proteomics. A brief discussion of recent instrumental improvements precedes a discussion of affinity enrichment and depletion of proteins, followed by a review of the major approaches (label-free and isotope-labeling) to making protein expression measurements quantitative, especially in the context of profiling large numbers of proteins. A discussion follows of the various computational techniques used to identify peptides and proteins from LC-MS/MS data. The review then includes a short discussion of LC-MS approaches to three-dimensional structure determination and concludes with a section on statistics and data mining for proteomics, including comments on properly powering clinical studies and avoiding over-fitting with large data sets.
Becker, C.; Bern, M. W. Recent developments in quantitative proteomics. Mutation Research - Genetic Toxicology and Environmental Mutagenesis 2011, 722 (2), 171-182.