What is the reasoning behind standardization (dividing by standard deviation)?

Why does dividing a dataset by its standard deviation $\sigma$ make the sample variance equal to 1? Assume a zero mean for simplicity.

What's the intuition behind this?

Dividing by the range (max − min) makes intuitive sense, but dividing by the standard deviation does not.










standardization

asked Mar 18 at 12:09 by alwayscurious; edited Mar 18 at 16:26 by Karolis Koncevičius








  • The zero mean assumption isn't necessary. You can take this as three separate statements: dividing by the SD gives an SD of 1; the variance is the square of the SD; and the square of 1 is 1.
    – Nick Cox, Mar 18 at 16:44






  • When people say intuitive, I translate that as "familiar to me", and most of the time it fits. Reasons for not dividing by the range are practical rather than theoretical. The range can be highly labile. Also, often the range of all values is enormously larger than that of the bulk of the values, so the results wouldn't be very helpful. Income illustrates both points: the observed maximum may vary capriciously, and values divided by the range would often be concentrated near 0.
    – Nick Cox, Mar 18 at 16:47
2 Answers

This stems from a basic property of variance: for a random variable $X$ and a constant $a$, $\mathrm{var}(aX) = a^2\,\mathrm{var}(X)$. Therefore, if you divide the data by its standard deviation $\sigma$, $\mathrm{var}(X/\sigma) = \mathrm{var}(X)/\sigma^2 = \sigma^2/\sigma^2 = 1$.
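
This is easy to verify numerically. The following is an editorial sketch (not part of the original answer), using made-up normal data; the mean and scale are arbitrary:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=10.0, scale=3.0, size=100_000)  # arbitrary mean and spread

    sigma = x.std()      # sample standard deviation of the raw data
    z = x / sigma        # rescale every value by 1/sigma

    # var(aX) = a^2 var(X) with a = 1/sigma, so the rescaled variance is 1
    print(x.var())       # roughly 9 (= 3^2)
    print(z.var())       # 1.0, up to floating point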






answered Mar 18 at 12:31 by Chao Song









  • That helps, thanks. Do you have an intuitive approach?
    – alwayscurious, Mar 18 at 13:00



















Standardizing is just changing the units so that they are in "standard deviation" units. After standardization, a value of 1.5 means "1.5 standard deviations above 0". If the standard deviation were 8, this would be equivalent to saying "12 points above 0".



An example: when converting inches to feet (in America), you multiply your data in inches by a conversion factor, $\frac{1\ \text{foot}}{12\ \text{inches}}$, which comes from the fact that 1 foot equals 12 inches, so you're essentially just multiplying your data points by a fancy version of 1 (i.e., a fraction with equal numerator and denominator). For example, to go from 72 inches to feet, you do $72\ \text{inches} \times \frac{1\ \text{foot}}{12\ \text{inches}} = 6\ \text{feet}$.



When converting scores from raw units to standard deviation units, you multiply your data in raw units by the conversion factor $\frac{1\ \text{SD}}{\sigma\ \text{points}}$. So if you had a score of 100 and the standard deviation ($\sigma$) was 20, your standardized score would be $100\ \text{points} \times \frac{1\ \text{SD}}{20\ \text{points}} = 5\ \text{SD}$. Standardization is just changing the units.



Changing the units of a dataset doesn't affect how spread out it is; you just change the units of the measure of spread you're using so that they match. So if your original data had a standard deviation of 20 points, and you've changed units so that 20 original points equals 1 new standardized unit, then the new standard deviation is 1 unit (because 20 original units equals 1 new unit).
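
To make the change-of-units reading concrete, here is a small editorial sketch (not from the answer itself); only the 100-points-with-$\sigma = 20$ conversion comes from the example above, and the score array is invented for illustration:

    import numpy as np

    # The conversion described above: a raw score of 100 points when sigma = 20 points
    sigma = 20.0
    print(100.0 / sigma)              # 5.0 -> "100 points is 5 standard-deviation units"

    # Changing units does not change how spread out the data are, only the yardstick:
    scores = np.array([40.0, 80.0, 100.0, 120.0, 160.0])  # made-up scores, in points
    sd_points = scores.std()          # spread of the raw data, measured in points
    in_sd_units = scores / sd_points  # the same data, re-expressed in SD units
    print(sd_points)                  # 40.0 -> here 1 SD unit corresponds to 40 points
    print(in_sd_units.std())          # 1.0  -> the spread in the new units is 1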






answered Mar 18 at 18:10 by Noah (edited 2 days ago)









  • Some of your answer needs an extra assumption that you have subtracted the mean, but you don't mention that. The thread question is equivocal here too, as in statistics subtracting the mean is the default, but it asks only about dividing by the SD.
    – Nick Cox, Mar 18 at 18:16










  • I don't think my answer requires that assumption if we're defining standardization as just dividing by the SD (which the OP does). I'm just talking about a change of units, not with reference to the center of the data. E.g., for a scale with a mean of 50 and an SD of 10, I'm saying a score of 20 would have a standardized score of 2, not -3. Subtracting the mean (centering) is a separate issue.
    – Noah, 2 days ago










  • Fair point. I don't think defining standardization as merely dividing by the SD is at all standard, so to speak, but granting your definition that value / SD $=: z$, say, then all data points that are positive are above 0 on the standardized $z$ scale and only points that happen to be negative are below 0 on the $z$ scale. Whether that is as useful a standardization as (value $-$ mean) / SD is open to question.
    – Nick Cox, 2 days ago












