Q: Is standard deviation considered a crude measure of variance?

Best Answer

No, not exactly.

The standard deviation of a sample, when squared (s²), is an unbiased estimate of the variance of the population.

I would not call it crude, just an estimate. An estimate is an approximate value of the population parameter you would like to know (the estimand), which in this case is the variance.
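If you want to see this numerically, here is a minimal Python sketch (the normal population with mean 50 and standard deviation 10, and the sample size of 20, are made-up assumptions for illustration, not part of the original answer): averaging the n-1 sample variances s² over many samples lands close to the true population variance.

```python
# Illustrative sketch only: population parameters and sample size are assumptions.
import random
import statistics

random.seed(0)
population = [random.gauss(50, 10) for _ in range(100_000)]
true_variance = statistics.pvariance(population)      # divisor n: population variance

sample_size, n_samples = 20, 5_000
estimates = []
for _ in range(n_samples):
    sample = random.sample(population, sample_size)
    estimates.append(statistics.variance(sample))      # divisor n-1: sample variance s^2

print(f"population variance:      {true_variance:.2f}")
print(f"average of s^2 estimates: {statistics.mean(estimates):.2f}")
# The two figures should be close, illustrating that s^2 is (approximately) unbiased.
```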

Wiki User · 14y ago
Continue Learning about Statistics

What does the standard deviation tell us?

The standard deviation is the square root of the variance, a measure of the spread or variability of data. It is given by (variance)^(1/2).
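A quick illustrative sketch in Python (the data values are made up):

```python
# Illustrative sketch: the data values below are an assumption.
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
variance = statistics.pvariance(data)    # mean of squared deviations from the mean
sd = statistics.pstdev(data)             # standard deviation
print(variance, sd, math.sqrt(variance)) # 4.0 2.0 2.0 -> sd == (variance)^(1/2)
```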


What is used as a measure of total risk?

The standard deviation or volatility (square root of the variance) of returns.


Which type of measure of dispersion is mostly used: standard deviation or variance?

They are effectively the same but the standard deviation is more popular because the units of measurement are the same as those for the variable.


What is a measure of spread about the mean?

Standard error, standard deviation, variance, range, inter-quartile range as well as measures based on other percentiles.


What is the relationship between the mean and standard deviation in statistics?

The standard deviation in statistics or probability is a measure of how spread out the numbers are. In mathematical terms, it is the square root of the mean of the squared deviations of all the numbers in the data set from the mean of that set. It is approximately equal to the average deviation from the mean. If a set of values has a low standard deviation, it means that, in general, most of the values are close to the mean. A high standard deviation means that the values generally differ a lot from the mean.

The variance is the standard deviation squared; that is, the standard deviation is the square root of the variance. To calculate the variance, take each number in the set and subtract the mean from it, then square that value, doing the same for each number in the set. Finally, take the mean of all the squares. The mean of the squared deviations from the mean is the variance, and the square root of the variance is the standard deviation.

Take the following data series as examples; the mean of each of them is 3.

3, 3, 3, 3, 3, 3 - all the values are 3, the same as the mean, so the standard deviation is zero. The difference from the mean is zero in each case, so after squaring and taking the mean, the variance is zero, and the square root of zero is zero. Note that since the deviations from the mean are squared, the variance, and hence the standard deviation, can never be negative.

1, 3, 3, 3, 3, 5 - most of the values are the same as the mean, so this series has a low standard deviation. The standard deviation is small because most of the differences from the mean are small.

1, 1, 1, 5, 5, 5 - every value is two higher or two lower than the mean, so this series has the highest standard deviation of the three.
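To check the worked example, here is a small Python sketch that recomputes the variance and standard deviation of the three series above, using the mean-of-squared-deviations formula described in the answer:

```python
# Sketch reproducing the three illustrative series from the answer (all have mean 3).
import statistics

for series in ([3, 3, 3, 3, 3, 3],
               [1, 3, 3, 3, 3, 5],
               [1, 1, 1, 5, 5, 5]):
    mean = statistics.mean(series)
    variance = sum((x - mean) ** 2 for x in series) / len(series)  # mean of squared deviations
    sd = variance ** 0.5                                           # square root of the variance
    print(series, f"mean={mean}", f"variance={variance:.2f}", f"sd={sd:.2f}")
# Output: variance 0.00, 1.33 and 4.00 respectively; the spread grows as values
# move away from the mean, and it can never be negative.
```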

Related questions

Why standard deviation is better measure of variance?

1. The standard deviation is not a measure of variance: it is the square root of the variance.
2. The answer depends on: better than what?


What do the variance and the standard deviation measure?

They are measures of the spread of distributions about their mean.


Which measure of variation is appropriate when using the mean?

The variance or standard deviation.


Why is standard deviation more often used than variance?

Both the variance and the standard deviation are measures of dispersion or variability in a set of data. They both measure how far the observations are scattered away from the mean (or average). When computing the variance, you take the deviation of each observation from the mean, square it, and sum all of the squared deviations. This somewhat exaggerates the true picture, because the numbers become large when squared. So we take the square root of the variance (to compensate for the excess), and this is known as the standard deviation. That is why the standard deviation is used more often than the variance, even though it is just the square root of the variance.
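A small illustrative sketch in Python (the height figures are made-up assumptions) showing how the variance sits in squared units while its square root, the standard deviation, is back on the scale of the data:

```python
# Illustrative sketch: the height data are an assumption.
import statistics

heights_cm = [160, 165, 170, 175, 180]
variance = statistics.pvariance(heights_cm)   # 50, in cm^2
sd = statistics.pstdev(heights_cm)            # about 7.07, in cm (square root of the variance)
print(f"variance = {variance} cm^2, standard deviation = {sd:.2f} cm")
```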


What is a measure of the spread of a set of data?

The standard deviation is the value most used. Others are variance, interquartile range, or range.


Do units of measure follow the standard deviation?

Yes. The standard deviation is expressed in the same units of measure as the data: if the data are in metres, the standard deviation is also in metres, whereas the variance is in square metres.


What measures are used to describe variability?

Generally, the standard deviation (represented by the Greek letter sigma, σ) would be used to measure variability. The standard deviation represents the typical distance of the data from the mean. Another measure is the variance, which is the standard deviation squared. Lastly, you might use the interquartile range, which is the range of the middle 50% of the data.
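A short Python sketch (the data set is a made-up assumption, and statistics.quantiles is just one of several quartile conventions) computing the three measures of variability mentioned above:

```python
# Illustrative sketch: the data values are an assumption.
import statistics

data = [4, 7, 8, 10, 12, 13, 15, 18, 21, 25]
sd = statistics.pstdev(data)                   # standard deviation
variance = statistics.pvariance(data)          # standard deviation squared
q1, q2, q3 = statistics.quantiles(data, n=4)   # quartiles (Python 3.8+)
iqr = q3 - q1                                  # range of the middle 50% of the data
print(f"standard deviation  = {sd:.2f}")
print(f"variance            = {variance:.2f}")
print(f"interquartile range = {iqr:.2f}")
```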


How is variance used to measure risk?

In finance, the risk of an investment may be measured by calculating the variance and standard deviation of the distribution of returns on that investment. The variance measures how far, in either direction, the returns may deviate from their mean.
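A minimal illustrative sketch in Python (the monthly return figures are made up) of measuring risk as the variance and standard deviation, or volatility, of a series of returns:

```python
# Illustrative sketch: the return figures are an assumption.
import statistics

monthly_returns = [0.02, -0.01, 0.03, 0.015, -0.02, 0.01]
mean_return = statistics.mean(monthly_returns)
variance = statistics.pvariance(monthly_returns)   # dispersion of returns about their mean
volatility = statistics.pstdev(monthly_returns)    # square root of the variance
print(f"mean return = {mean_return:.4f}")
print(f"variance    = {variance:.6f}")
print(f"volatility  = {volatility:.4f}")
# The larger the variance/volatility, the further returns tend to deviate
# from their mean, i.e. the riskier the investment.
```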