
日月阴阳

1 answer

快乐每一天7895

Accepted answer
Your email isn't accepting the file; please provide a different address. Here is part of it for your reference:

Principal component analysis

Principal component analysis (PCA) is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables. This transformation is defined in such a way that the first principal component has as high a variance as possible (that is, accounts for as much of the variability in the data as possible), and each succeeding component in turn has the highest variance possible under the constraint that it be orthogonal to (uncorrelated with) the preceding components. Principal components are guaranteed to be independent only if the data set is jointly normally distributed. PCA is sensitive to the relative scaling of the original variables. Depending on the field of application, it is also named the discrete Karhunen–Loève transform (KLT), the Hotelling transform, or proper orthogonal decomposition (POD).

PCA was invented in 1901 by Karl Pearson.[1] Now it is mostly used as a tool in exploratory data analysis and for making predictive models. PCA can be done by eigenvalue decomposition of a data covariance matrix or singular value decomposition of a data matrix, usually after mean centering the data for each attribute. The results of a PCA are usually discussed in terms of component scores (the transformed variable values corresponding to a particular case in the data) and loadings (the weight by which each standardized original variable should be multiplied to get the component score) (Shaw, 2003).

PCA is the simplest of the true eigenvector-based multivariate analyses. Often, its operation can be thought of as revealing the internal structure of the data in a way which best explains the variance in the data. If a multivariate dataset is visualised as a set of coordinates in a high-dimensional data space (one axis per variable), PCA can supply the user with a lower-dimensional picture, a "shadow" of this object when viewed from its (in some sense) most informative viewpoint. This is done by using only the first few principal components so that the dimensionality of the transformed data is reduced.

PCA is closely related to factor analysis; indeed, some statistical packages (such as Stata) deliberately conflate the two techniques. True factor analysis makes different assumptions about the underlying structure and solves eigenvectors of a slightly different matrix.
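To make the eigendecomposition route described above concrete, here is a minimal sketch in Python/NumPy, assuming a data matrix X with one row per observation and one column per variable; the function name and the toy data are illustrative, not part of the original answer.

```python
# Minimal PCA sketch: mean-center the data, eigendecompose the covariance
# matrix, and project onto the leading components to get component scores.
import numpy as np

def pca(X, n_components=2):
    # Mean-center each variable (column), as described in the answer.
    X_centered = X - X.mean(axis=0)
    # Covariance matrix of the centered data (variables x variables).
    cov = np.cov(X_centered, rowvar=False)
    # eigh is used because the covariance matrix is symmetric.
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    # Order components by decreasing explained variance (eigenvalue).
    order = np.argsort(eigenvalues)[::-1]
    components = eigenvectors[:, order[:n_components]]
    # Component scores: projection of the centered data onto the components.
    scores = X_centered @ components
    return scores, components, eigenvalues[order[:n_components]]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))          # toy data: 100 observations, 5 variables
    scores, components, variances = pca(X, n_components=2)
    print(scores.shape)                     # (100, 2)
```

An equivalent route, also mentioned in the answer, is a singular value decomposition of the centered data matrix itself (e.g. np.linalg.svd), which avoids forming the covariance matrix explicitly.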

A paper on principal component analysis

