In most of the selection work I have done, you end up choosing from a short list of around three or four solutions against some dozen criteria. PCA uses eigenvector and eigenvalue techniques on the covariance matrix of the scores, and with a short list of only three or four options the statistical significance of the solution is going to be extremely poor.

A case history weighted tree is shown below.

There are four strategic options being marked against eleven criteria. I normalised the scores to zero means and applied PCA; the option scores on the first principal component were:

| Option | PC1 score |
|--------|-----------|
| 1      | -1.9      |
| 2      | -0.6      |
| 3      | -0.1      |
| 4      | 1.1       |
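The calculation described above can be sketched in numpy. The 4 × 11 score matrix here is made-up illustrative data (the actual case-history scores are not reproduced in the post); the PCA itself is the standard eigendecomposition of the covariance matrix of the criteria:

```python
import numpy as np

# Hypothetical 4 x 11 score matrix: rows = strategic options,
# columns = criteria (illustrative numbers, not the client's data).
rng = np.random.default_rng(0)
scores = rng.integers(1, 10, size=(4, 11)).astype(float)

# Normalise each criterion (column) to zero mean, as in the text.
centred = scores - scores.mean(axis=0)

# PCA via eigendecomposition of the covariance matrix of the criteria.
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# First principal component = eigenvector with the largest eigenvalue;
# its entries play the role of the PCA criteria weights.
pc1 = eigvecs[:, np.argmax(eigvals)]

# Project each option onto PC1 to get one composite score per option,
# analogous to the four scores listed above.
option_scores = centred @ pc1
print(option_scores)
```

Because the columns are mean-centred, the four option scores always sum to zero, which is consistent with the mixed-sign values in the list above.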

Remarkably, this is the same rank order as the client's weighted scores!

The PCA criteria weights were:

| Criterion | Weight |
|-----------|--------|
| 1.1       | 0.03   |
| 1.2       | 0.30   |
| 1.3       | -0.31  |
| 2.1       | 0.35   |
| 2.2       | 0.35   |
| 2.3       | -0.39  |
| 3.1       | 0.32   |
| 3.2       | -0.01  |
| 3.3       | -0.39  |
| 3.4       | 0.02   |
| 3.5       | -0.31  |

Comparing these weights with the row scores in the table above gives:

- Criteria with a high spread of scores get a high PCA weight.

- Criteria with a low spread of scores get a low PCA weight.

All as you would expect from PCA.
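A quick toy check of this spread-versus-weight behaviour, using invented data in which one criterion varies far more across the options than the rest:

```python
import numpy as np

# Toy check: give one criterion a much larger spread across the
# options than the others and see where PC1 loads.
# (Illustrative data only, not the case-history scores.)
options = np.array([
    [1.0, 5.1, 5.0],   # criterion 0 varies strongly between options
    [9.0, 4.9, 5.1],
    [2.0, 5.0, 4.9],
    [8.0, 5.0, 5.0],
])
centred = options - options.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, np.argmax(eigvals)]

# The high-spread criterion dominates the first component's weights:
# the loading with the largest magnitude sits in position 0.
print(np.abs(pc1))
```

Since PCA on a covariance matrix chases variance, the near-constant criteria receive loadings close to zero, exactly as observed above.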

With, say, ten options to score, the central limit theorem comes to our rescue, and statistically significant results should be obtainable at that level.
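As a rough illustration of the sample-size point (a simulation sketch with an assumed underlying model, not a formal significance test): drawing repeated score matrices from the same model, the estimated first component swings far more with four options than with ten.

```python
import numpy as np

rng = np.random.default_rng(1)

def pc1_of(sample):
    # First principal component of a mean-centred sample.
    centred = sample - sample.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(centred, rowvar=False))
    v = vecs[:, np.argmax(vals)]
    # Fix the arbitrary sign so loadings are comparable across trials.
    return v * np.sign(v[np.argmax(np.abs(v))])

def loading_spread(n_options, n_trials=200):
    # Assumed model: criterion 0 has twice the standard deviation
    # of the other three criteria.
    sds = np.array([2.0, 1.0, 1.0, 1.0])
    trials = [pc1_of(rng.normal(0.0, sds, size=(n_options, 4)))
              for _ in range(n_trials)]
    # Worst-case standard deviation of the estimated loadings.
    return np.vstack(trials).std(axis=0).max()

# Loading estimates are much less stable with 4 options than with 10.
print(loading_spread(4), loading_spread(10))
```

The spread of the estimated weights shrinks as the number of options grows, which is the practical content of the significance worry.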

However, I feel that PCA cannot be used in a typical short-list situation because of this significance issue.

Ian Richmond

