Data collected on the physical, biological, or man-made world are often highly correlated with one another, posing the question of whether fewer variables would contain almost as much information. A crude solution is simply to inspect the Pearson correlation matrix and omit one of each pair of highly correlated variables. In contrast, we develop a systematic method of conditioning on one or more variables and examining the resulting partial covariance matrix. If the variables have little variance after the conditioning, then the conditioning variables contain most of the information of all the original variables. Paralleling the usual tests applied in judging how many principal components are sufficient to represent all the data, we use the amount of variance explained by the conditioning variable(s) as a measure of information content. The paper explains the computation and includes examples using published data sets. The approach is found to be highly competitive with using principal components, and has the obvious advantage over principal components of simply omitting some of the original variables from further consideration. The method has been coded in Visual Basic add-ins to an Excel spreadsheet.
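As an illustration of the conditioning step the abstract describes, the following is a minimal sketch (not the paper's Visual Basic implementation) of computing the partial covariance of the remaining variables given a chosen set of conditioning variables, via the Schur complement, together with a trace-based "variance explained" measure. The function name and the use of the trace ratio are assumptions for illustration.

```python
import numpy as np

def partial_covariance(X, cond_idx):
    """Partial covariance of the remaining columns of X, conditioned on
    the columns in cond_idx, via the Schur complement
    S11 - S12 @ inv(S22) @ S21.  Returns the partial covariance matrix
    and the fraction of total variance explained by the conditioning
    variables (a trace ratio, assumed here as one reasonable measure)."""
    p = X.shape[1]
    rest = [i for i in range(p) if i not in cond_idx]
    S = np.cov(X, rowvar=False)           # full sample covariance
    S11 = S[np.ix_(rest, rest)]           # remaining variables
    S12 = S[np.ix_(rest, cond_idx)]       # cross-covariance
    S22 = S[np.ix_(cond_idx, cond_idx)]   # conditioning variables
    P = S11 - S12 @ np.linalg.solve(S22, S12.T)
    explained = 1.0 - np.trace(P) / np.trace(S11)
    return P, explained
```

For instance, if every remaining variable is nearly a noisy copy of the conditioning variable, the residual variance in `P` is small and `explained` is close to 1, which is the situation in which the conditioning variables could stand in for the full set.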