Dimension Reduction
The high collinearity of the data in Experiment 1 suggests that the attributes may reflect a smaller number of latent factors. However, an exploratory factor analysis did not yield an interpretable factorization of the data. Here we present several analyses that group the attributes in various ways, and we speculate on what these groupings might mean.
Principal Components Analysis
We performed a principal components analysis on 18 attributes, including ratings of explanation quality. We excluded attributes that contained missing data (principle consensus, evidence relevance, and evidence credibility; see Table 3 of the article).
A scree plot (Figure S25) shows four components with eigenvalues greater than one. Together, these four components account for 58% of the variance in the data. For visualization, we have plotted the top two components (Figure S26).
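For concreteness, the eigenvalue computation could look like the following sketch in Python. This is not the authors' code: the DataFrame name `ratings`, the placeholder file name, and the column labels are assumptions standing in for the per-explanation attribute ratings.

```python
import numpy as np
import pandas as pd

# Hypothetical DataFrame of ratings, one row per explanation and one
# column per attribute; drop the three attributes with missing data.
ratings = pd.read_csv("ratings.csv")  # placeholder for the actual data
complete = ratings.drop(columns=["principle consensus",
                                 "evidence relevance",
                                 "evidence credibility"])

# Eigenvalues of the 18 x 18 attribute correlation matrix; these are
# what the scree plot (Figure S25) displays. eigvalsh returns them in
# ascending order, so reverse to sort from largest to smallest.
corr = complete.corr()
eigenvalues = np.linalg.eigvalsh(corr.values)[::-1]

n_retained = int((eigenvalues > 1).sum())                # 4 components
share_top4 = eigenvalues[:4].sum() / eigenvalues.sum()   # about 0.58
```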
Component 1 (x-axis), which accounts for almost a third of the variance, may reflect the completeness or fluidity of an explanation. Attributes with high loadings on this component include explanation quality, internal coherence, perceived truth, visualization, and articulation. In contrast, attributes with negative loadings on this component include incompleteness and alternatives, both of which indicate that an explanation is missing information.
Component 2 (y-axis) accounts for about 12% of the variance, and may reflect the “newness” of the information in the explanation. Attributes that load negatively onto this component include external coherence and prior knowledge, whereas attributes with positive loadings include novelty, complexity, and expert.
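A plot in the spirit of Figure S26 can be produced from each attribute's loadings on the first two components. Continuing the hypothetical sketch above, loadings are eigenvectors scaled by the square roots of their eigenvalues; note that eigenvector signs are arbitrary, so an axis may come out mirrored relative to the figure.

```python
import matplotlib.pyplot as plt

# Eigendecomposition of the correlation matrix from the sketch above;
# sort components from largest to smallest eigenvalue.
vals, vecs = np.linalg.eigh(corr.values)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Loadings: eigenvectors scaled by sqrt(eigenvalue), so each entry is
# the correlation between an attribute and a component.
loadings = vecs[:, :2] * np.sqrt(vals[:2])

fig, ax = plt.subplots()
ax.scatter(loadings[:, 0], loadings[:, 1])
for name, (x, y) in zip(corr.columns, loadings):
    ax.annotate(name, (x, y))
ax.set_xlabel(f"Component 1 ({vals[0] / vals.sum():.0%} of variance)")
ax.set_ylabel(f"Component 2 ({vals[1] / vals.sum():.0%} of variance)")
plt.show()
```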
Figure S25. This scree plot shows the eigenvalues of each component from the principal components analysis.
Figure S26. The top two components from the principal components analysis account for almost 43% of the variance. The attributes are color-coded based on the hierarchical clustering presented below, for comparison.
Hierarchical Clustering Analysis
We also performed a hierarchical clustering analysis on the attribute correlation matrix to find potential groupings. Because the attributes have arbitrary directionality, we first took the absolute value of the correlations so that anticorrelated attributes could be grouped together. For example, we asked whether an explanation was easy to visualize, but we could have asked whether it was hard to visualize, which would simply reverse the coding.
Hierarchical clustering can be performed using a number of linkage criteria, which define the similarity measure used to group attributes together. Here, we used the mean linkage criterion, which calculates the distance between any two groups of attributes as the average distance between all pairs of their members, where each attribute is represented by its column of the correlation matrix. Thus, two attributes are merged when they show similar correlations to all other attributes. The results are also shown in Figure 1 of the main article, with an arbitrary cut-off of eight clusters.
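In SciPy, the mean linkage criterion corresponds to method='average' (UPGMA). The sketch below continues from the PCA sketch above; representing each attribute by its profile of absolute correlations follows the description in the preceding paragraph, but the Euclidean base metric between profiles is our assumption, since the text does not name one.

```python
from scipy.cluster.hierarchy import dendrogram, fcluster, linkage
from scipy.spatial.distance import pdist

# Each attribute is represented by its column of absolute correlations,
# so anticorrelated attributes can still end up in the same cluster.
profiles = corr.abs().values

# Pairwise distances between correlation profiles (Euclidean by
# assumption), then mean/average linkage over those distances.
Z = linkage(pdist(profiles), method="average")

dendrogram(Z, labels=list(corr.columns))         # Figure S27
eight = fcluster(Z, t=8, criterion="maxclust")   # cut used in Figure 1
three = fcluster(Z, t=3, criterion="maxclust")   # coloring in Figure S26
```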
The results bear some similarity to the principal components analysis: a three-cluster solution is color-coded in Figure S26 for comparison.
Figure S27. Hierarchical clustering of the attributes using the mean linkage criterion.