Best Answer

Both variance and standard deviation are measures of dispersion, or variability, in a set of data. Both measure how far the observations are scattered away from the mean (or average). To compute the variance, you take the deviation of each observation from the mean, square it, and average the squared deviations (dividing by n, or by n − 1 for a sample). This somewhat exaggerates the true picture, because the numbers become large when you square them, and it also changes the units to squared units. So we take the square root of the variance (to compensate for the excess), and this is known as the standard deviation. This is why the standard deviation is used more often than the variance, even though the standard deviation is just the square root of the variance.
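A minimal sketch of the computation described above in Python, with made-up numbers, using the population form that divides by n:

```python
import math

def variance(data):
    """Average of the squared deviations from the mean (population form)."""
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

def std_dev(data):
    """Square root of the variance, back in the data's own units."""
    return math.sqrt(variance(data))

scores = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up sample
print(variance(scores))  # 4.0 (in squared units)
print(std_dev(scores))   # 2.0 (in the original units)
```

For a sample rather than a full population, the divisor would be n − 1 instead of n.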


Wiki User

15y ago
More answers

Muhammad Afaq

3y ago

Under root: the standard deviation is the square root ("under root") of the variance.


Q: Why standard deviation is more often used than variance?
Continue Learning about Other Math

Why variance is bigger than standard deviation?

The variance is the standard deviation squared; in other words, the standard deviation is the square root of the variance. In many cases this means the variance is bigger than the standard deviation, but not always: the variance is larger whenever the standard deviation is greater than 1, smaller whenever the standard deviation is between 0 and 1, and the two are equal at 0 and 1.
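A quick check of this with made-up data, using Python's statistics module: the variance exceeds the standard deviation when the standard deviation is greater than 1, and is smaller when it lies between 0 and 1.

```python
import statistics

wide = [10, 20, 30, 40]        # spread-out data: sd > 1, so variance > sd
narrow = [1.0, 1.1, 1.2, 1.3]  # tightly clustered: sd < 1, so variance < sd

for data in (wide, narrow):
    sd = statistics.pstdev(data)    # population standard deviation
    var = statistics.pvariance(data)  # population variance
    print(f"sd={sd:.4f}  variance={var:.4f}  variance > sd: {var > sd}")
```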


The mean of A is 14 with a standard deviation of 4.2. The mean of B is 16 with a standard deviation of 4.4. Which is more dispersed?

B because the spread, in this case standard deviation, is larger.


What a large standard deviation means?

A large standard deviation means that the data are spread out. Whether a standard deviation counts as "large" is relative, but a larger standard deviation always means the data are more spread out than with a smaller one. For example, if the mean were 60 and the standard deviation 1, that would be a small standard deviation: the data are not spread out, and a score of 74 or 43 would be highly unlikely, almost impossible. However, if the mean were 60 and the standard deviation 20, that would be a large standard deviation: the data are spread out more, and a score of 74 or 43 wouldn't be odd or unusual at all.
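The example can be made concrete by measuring how far a score lies from the mean in units of the standard deviation (the figures 60, 74, 1, and 20 are taken from the example above):

```python
def distance_in_sds(score, mean, sd):
    """How many standard deviations a score lies from the mean."""
    return abs(score - mean) / sd

print(distance_in_sds(74, 60, 1))   # 14.0 sds away: effectively impossible
print(distance_in_sds(74, 60, 20))  # 0.7 sds away: entirely ordinary
```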


What is an acceptable standard deviation?

An acceptable standard deviation depends entirely on the study and on the person asking for it. Generally, the smaller the standard deviation, the more acceptable it is, because the measurements are more consistent and the results more precise.


How is standard deviation different from mean absolute deviation?

The standard deviation is the square root of the average of the squared deviations, while the mean absolute deviation is the average of the absolute deviations. One consequence of this difference is that a large deviation affects the standard deviation more than it affects the mean absolute deviation.
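A small made-up demonstration of that consequence: a single extreme outlier inflates the standard deviation more than the mean absolute deviation, because each deviation is squared before it is averaged.

```python
import math

def pop_sd(data):
    """Population standard deviation."""
    m = sum(data) / len(data)
    return math.sqrt(sum((x - m) ** 2 for x in data) / len(data))

def mean_abs_dev(data):
    """Mean absolute deviation from the mean."""
    m = sum(data) / len(data)
    return sum(abs(x - m) for x in data) / len(data)

with_outlier = [10] * 9 + [110]  # nine 10s plus one extreme value
print(pop_sd(with_outlier))        # 30.0
print(mean_abs_dev(with_outlier))  # 18.0
```

The squared 90-unit deviation dominates the sum of squares, so the standard deviation comes out well above the mean absolute deviation.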

Related questions


Does variance provide more information than standard deviation?

No. Because standard deviation is simply the square root of the variance, their information content is exactly the same.


Why is the standard deviation used more frequently than the variance?

The standard deviation has the same measurement units as the variable and is, therefore, more easily comprehended.


Which type of measure of dispersion is mostly used standard deviation or variance?

They are effectively the same but the standard deviation is more popular because the units of measurement are the same as those for the variable.


How do you calculate salary variance?

I believe you are interested in calculating the variance from a set of data related to salaries. The variance is the square of the standard deviation, where s = sqrt( sum of (xi − mean)² / (n − 1) ), the mean being the sum of all the data divided by the number in the sample and xi being a single data point (a single salary). If instead of a sample you have the entire population of size N, substitute N for n − 1 in the formula. You can find more on the interpretation of variance by searching Wikipedia under variance and standard deviation. Note that an advantage of using the standard deviation rather than the variance is that the standard deviation is in the same units as the mean.
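A minimal sketch of that calculation in Python, with made-up salary figures (the n − 1 divisor is the sample form described above; replace it with N for a full population):

```python
import math

salaries = [40000, 45000, 50000, 55000, 60000]  # hypothetical sample

n = len(salaries)
mean = sum(salaries) / n
sample_variance = sum((x - mean) ** 2 for x in salaries) / (n - 1)
sample_sd = math.sqrt(sample_variance)

print(sample_variance)  # 62500000.0, in squared salary units
print(sample_sd)        # about 7905.69, in the same units as the salaries
```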


Why is standard deviation a better measure of dispersion than variance?

Because it is in the same units as the original data. For example, if you have a sample of lengths, all in centimetres, the sample variance will be in units of square centimetres, which can be difficult to interpret, but the sample standard deviation will be in centimetres, which is relatively easy to interpret with reference to the data.


How standard deviation and Mean deviation differ from each other?

There are 1) the standard deviation, 2) the mean deviation, and 3) the mean absolute deviation. The standard deviation is calculated most of the time: if our objective is to estimate the variance of the overall population from a representative random sample, then it has been shown theoretically that the standard deviation is the best (most efficient) estimate.

The mean deviation is calculated by first finding the mean of the data and then the deviation (value − mean) for each value. If we sum these deviations, the mean deviation will always come out to zero, so this statistic has little value on its own, although the individual deviations may be of interest. See related link.

To obtain the mean absolute deviation (MAD), we average the absolute values of the individual deviations. This gives a value similar to the standard deviation: a measure of the dispersal of the data values. The MAD may be transformed to a standard deviation if the distribution is known. The MAD has been shown to be a less efficient estimator of the standard deviation, but a more robust one (not as influenced by erroneous data) than the standard deviation. See related link.

Most of the time we use the standard deviation to provide the best estimate of the variance of the population.
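The three quantities described above can be computed side by side on a small made-up sample; note that the signed mean deviation comes out to zero, as the answer says:

```python
import math

data = [2, 4, 6, 8]
mean = sum(data) / len(data)  # 5.0

mean_deviation = sum(x - mean for x in data) / len(data)        # always 0
mad = sum(abs(x - mean) for x in data) / len(data)              # 2.0
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))  # about 2.236

print(mean_deviation, mad, sd)
```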


What is the standard deviation of 9?

You need more than one number to calculate a standard deviation, so 9 does not have a standard deviation.


If the standard deviation is small the data is more dispersed?

No; if the standard deviation is small, the data are less dispersed, clustered more tightly around the mean.



Is a z test or t test used more often?

t test, because the z test requires knowing the population standard deviation and that's rare. The t test embodies an estimate of the standard deviation.

