Getting scores from PCA
$begingroup$
I'm not very familiar with principal component analysis, but from what I understand, it finds directions in the data sorted by decreasing variance and uses them to transform the data so that the resulting variables are linearly uncorrelated.
I'm not trying to transform the data I have. For a list of 784-length feature vectors, I simply want to get the PCA "scores" for each feature. Mathematica returns them sorted; I need to correlate them with the actual features, so I need them in their original order. PrincipalComponents doesn't seem to be able to return anything in this format. Is there a way to do this?
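To make "a score per feature" concrete, here is a hypothetical sketch in NumPy rather than Mathematica (with 5 features standing in for the 784-length vectors): the eigenvector loadings of the covariance matrix are indexed in the original feature order, so loading `i` corresponds to feature `i`.

```python
# Hypothetical sketch: per-feature loadings on the top principal component,
# kept in the original feature order (5 features stand in for 784).
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(10, 5))            # 10 samples, 5 features

centered = data - data.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]       # loadings of the top component

# pc1[i] is the weight of feature i in the first principal component,
# indexed exactly like the original feature vector.
print(pc1.shape)
```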
matrix linear-algebra
$endgroup$
2
$begingroup$
does PrincipalComponents[data][[Ordering @ data]] or PrincipalComponents[data][[Ordering[Variance /@ data]]] give what you need?
$endgroup$
– kglr
10 hours ago
$begingroup$
@kglr Since I'm passing PrincipalComponents a 10x784 matrix, that seems to just reorder the 784-length sublists, rather than the values within those lists. I want to correlate the PCA scores with the feature (value in the 784-length list) they actually correspond to.
$endgroup$
– TreFox
10 hours ago
asked 11 hours ago, edited 10 hours ago
TreFox
1 Answer
$begingroup$
You're in luck, because I recently waded through this problem myself. If I understand your question correctly, you want to know the matrix that transforms the data into the output of PrincipalComponents. The answer: that matrix is just the matrix of eigenvectors of the covariance matrix.
Simple 2D example:
data = RandomVariate[BinormalDistribution[{-1, 2}, {1, 2}, 0.9], 100];
eig = Eigensystem[Covariance[data]];
ListPlot[
{
PrincipalComponents[data],
Standardize[data, Mean, 1 &].Transpose[eig[[2]]]
}
]
As you can see, the result is the same except that the two clouds are mirrored in the x-axis. This makes sense: principal component analysis transforms the data so that the covariance matrix becomes diagonal (with the diagonal entries decreasing towards the bottom right), and flipping the data along an axis leaves the covariance invariant.
As a bonus, the eigenvalues of the covariance matrix tell you how much variance each principal component accounts for, so you don't have to calculate that separately:
eig[[1]]
Variance[PrincipalComponents[data]]
Out[142]= {4.62687, 0.137012}
Out[143]= {4.62687, 0.137012}
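The same relation can be checked outside Mathematica. Below is a hypothetical NumPy analogue of the example above: centering the data and multiplying by the covariance eigenvectors reproduces the principal-component scores, and the per-component variances equal the covariance eigenvalues.

```python
# Hypothetical NumPy analogue of the Mathematica example: PCA scores are the
# centered data projected onto the eigenvectors of the covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
# 100 samples from a correlated 2-D Gaussian (stand-in for BinormalDistribution)
cov = np.array([[1.0, 1.8], [1.8, 4.0]])  # correlation 0.9, sigmas 1 and 2
data = rng.multivariate_normal([-1.0, 2.0], cov, size=100)

centered = data - data.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
# eigh returns ascending order; reverse so variance decreases left to right
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = centered @ eigvecs  # columns are the principal-component scores

# Per-component variances match the eigenvalues of the covariance matrix
print(np.allclose(scores.var(axis=0, ddof=1), eigvals))  # True
```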
$endgroup$
answered 10 hours ago, edited 9 hours ago
Sjoerd Smit