
Top Answer
Wiki User · Answered 2010-09-18 01:19:14

The standard deviation and the arithmetic mean measure two different characteristics of a set of data. The standard deviation measures how spread out the data is, whereas the arithmetic mean measures where the data is centered. Because they measure different things, there is no particular relation that must hold between them; in particular, the standard deviation can be greater than the mean.

Actually, there IS a relationship between the mean and standard deviation. A high (large) standard deviation indicates a wide range of scores, that is, a great deal of variance. Generally speaking, the greater the range of scores, the less representative the mean becomes (if we are using "mean" to indicate "normal"). Consider the following example:

10 students are given a test that is worth 100 points. Only 1 student gets a 100, 2 students receive a zero, and the remaining 7 students get a score of 50.

Arithmetic mean = (1(100) + 2(0) + 7(50)) / 10 = (100 + 0 + 350) / 10 = 450 / 10 = 45

So the mean score is 45.

In statistics, the median is the value at the 50th percentile: half of the scores fall below the median and the other half are above it. Using the example above, the sorted scores are: 0, 0, 50, 50, (50, 50), 50, 50, 50, 100. The median is the score that has the same number of occurrences above it and below it. For an odd number of scores, there is exactly one in the middle, and that is the median. Here we have an even number of scores, so the "middle 2" scores (bracketed by parentheses in the list) are averaged; both are 50, so the median is 50.

The standard deviation of these scores is 26.9, which indicates a fairly wide spread of the numbers. For a "normal" distribution, most of the scores should cluster around the same value (in this case 50, which is also the "mode", the score that occurs most frequently), and as you move toward the extremes (very high or very low values), there should be fewer scores.
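If you want to check these figures, here is a minimal sketch in Python using the standard library's statistics module (the 26.9 quoted above is the population standard deviation, statistics.pstdev):

    import statistics

    # The ten test scores from the example above
    scores = [100, 0, 0, 50, 50, 50, 50, 50, 50, 50]

    print(statistics.mean(scores))    # 45    -> the arithmetic mean
    print(statistics.median(scores))  # 50.0  -> average of the two middle scores
    print(statistics.mode(scores))    # 50    -> most frequent score
    print(statistics.pstdev(scores))  # 26.92... -> population standard deviation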


Related Questions

Can standard deviation be greater than mean?

Standard deviation can be greater than the mean.


What does it indicate if the mean is greater than the standard deviation?

By itself, nothing in particular: the mean measures where the data is centered and the standard deviation measures how spread out it is, so the mean being greater than the standard deviation has no special significance.


Can the mean be less than the standard deviation?

In general, a mean can be greater or less than the standard deviation.


How do you calculate a mean and median smaller than the standard deviation?

In the same way that you calculate mean and median that are greater than the standard deviation!


Is the mean for a set of data always greater than the standard deviation?

No. The standard deviation is the square root of the variance, not of the mean, and it can be greater than, equal to, or less than the mean.


What are all the values a standard deviation can take?

The standard deviation must be greater than or equal to zero.


Why is the standard error a smaller numerical value compared to the standard deviation?

Let sigma = the standard deviation. The standard error (of the sample mean) is sigma / sqrt(n), where n is the sample size. Since, for n > 1, you are dividing the standard deviation by a number greater than 1, the standard error is always smaller than the standard deviation.
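As a minimal illustration of that formula (hypothetical sample data, using the sample standard deviation from statistics.stdev):

    import math
    import statistics

    data = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3]  # hypothetical observations

    sd = statistics.stdev(data)      # sample standard deviation
    se = sd / math.sqrt(len(data))   # standard error of the sample mean

    print(sd, se)  # se < sd whenever the sample has more than one observation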


Is the standard deviation best thought of as the distance from the mean?

No. The standard deviation is not the distance of any single point from the mean; different points lie at different distances from it. The standard deviation is best thought of as a measure of spread or dispersion, roughly the typical size of the deviations from the mean.


What is the difference between a general normal curve and a standard normal curve?

A standard normal distribution has a mean of zero and a standard deviation of 1. A normal distribution can have any real number as a mean and the standard deviation must be greater than zero.


What is the z score of 1.0?

It is the value that is one standard deviation greater than the mean of a Normal (Gaussian) distribution.


Can a standard deviation be greater than its mean?

Yes - although for data that cannot be negative, this usually suggests the distribution is not a normal distribution; it commonly happens with a distribution that has a very long tail.


Can the Variance ever be smaller than standard deviation?

Yes. If the variance is less than 1, the standard deviation will be greater than the variance. For example, if the variance is 0.5, the standard deviation is sqrt(0.5), or about 0.707.


Is standard deviation always smaller than mean?

No.


How does one interpret a standard deviation which is more than the mean?

Standard deviation is a measure of the dispersion of the data. When the standard deviation is greater than the mean, the coefficient of variation (standard deviation divided by mean) is greater than one. See: http://en.wikipedia.org/wiki/Coefficient_of_variation

If you assume the data is normally distributed, then the lower limit of the interval mean +/- one standard deviation (which covers about 68% of the distribution) will be a negative value. If it is not realistic to have negative values, then the assumption of a normal distribution may be in error and you should consider other distributions. Common distributions with no negative values are the gamma, lognormal and exponential.
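A small sketch of that coefficient-of-variation check, with made-up positive-valued data:

    import statistics

    data = [2.0, 5.0, 1.0, 9.0, 0.5]  # hypothetical non-negative data

    mean = statistics.mean(data)
    sd = statistics.stdev(data)

    print(sd / mean)   # coefficient of variation; > 1 here, so sd > mean
    print(mean - sd)   # lower end of mean +/- 1 sd is negative here, a hint
                       # that a normal model may not suit non-negative data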


How is standard deviation different from mean absolute deviation?

The standard deviation is the square root of the average of the squared deviations from the mean, while the mean absolute deviation is the average of the absolute deviations. One consequence of this difference is that a single large deviation affects the standard deviation more than it affects the mean absolute deviation.
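A brief sketch of that last point with made-up numbers: a single outlier inflates the standard deviation more than the mean absolute deviation.

    import statistics

    def mean_abs_dev(xs):
        # Average absolute deviation from the arithmetic mean
        m = statistics.mean(xs)
        return statistics.mean(abs(x - m) for x in xs)

    base = [10, 10, 10, 10, 10]
    with_outlier = [10, 10, 10, 10, 60]

    for xs in (base, with_outlier):
        print(statistics.pstdev(xs), mean_abs_dev(xs))
    # The outlier raises the standard deviation to 20 but the mean
    # absolute deviation only to 16, because squaring magnifies
    # large deviations.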


Sample standard deviation?

Standard deviation in statistics refers to how much the data deviate from the average, or mean, value. The sample standard deviation is this quantity computed from a sample drawn from a larger population rather than from the whole population; it is usually computed with n - 1 in the denominator (Bessel's correction) so that it better estimates the population value.


Can Standard Deviation be greater than 100?

Yes. It can have any non-negative value.


How do you know if a z score is positive or negative?

A z-score is negative when the observation lies below the mean and positive when it lies above the mean: z = (x - mean) / standard deviation, so the sign of z is the sign of (x - mean).
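A minimal sketch of that rule, with hypothetical numbers:

    def z_score(x, mean, sd):
        # Negative below the mean, positive above it
        return (x - mean) / sd

    print(z_score(45, 50, 10))  # -0.5: observation below the mean
    print(z_score(60, 50, 10))  #  1.0: observation above the mean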


What is mean deviation and why is quartile deviation better than mean deviation?

The mean deviation is the average of the absolute deviations of the data from the mean. The quartile deviation, half the interquartile range, is often preferred because it is based on the middle 50% of the data and so is much less affected by extreme values.


What does it mean if the standard deviation is less than the mean?

Since the standard deviation cannot be negative, a standard deviation less than the mean implies the mean is positive; if the mean comes out less than or equal to zero in this situation, there has been a serious calculation error. If the distribution is Gaussian (normal) and the mean exceeds the standard deviation, there is more than an 84.1% chance that the value of a randomly chosen observation will be positive.


What is the sample size for standard deviation?

There is no single required sample size. The sample standard deviation can be calculated for a sample of any size greater than 1.


Let X be a normal random variable with mean 10 and standard deviation of 5 What is the probability that X is greater than 12?

About 34.5 percent: z = (12 - 10) / 5 = 0.4, and P(Z > 0.4) ≈ 0.3446.


Assume that X has a normal distribution with mean equals 15.2 and standard deviation equals 0.9 What is the probability that X is greater than 16.1?

0.1587. Here z = (16.1 - 15.2) / 0.9 = 1.0, and P(Z > 1) = 1 - 0.8413 = 0.1587; the 0.8413 figure is P(X < 16.1), the probability of the complementary event.
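Both of these normal-probability answers can be checked with the standard library's statistics.NormalDist (Python 3.8+):

    from statistics import NormalDist

    # P(X > 12) for X ~ Normal(mean 10, sd 5): z = 0.4
    print(1 - NormalDist(mu=10, sigma=5).cdf(12))        # ~0.3446

    # P(X > 16.1) for X ~ Normal(mean 15.2, sd 0.9): z = 1.0
    print(1 - NormalDist(mu=15.2, sigma=0.9).cdf(16.1))  # ~0.1587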


What is the standard deviation of 9?

You need more than one number to calculate a standard deviation, so 9 does not have a standard deviation.


What is considered a high standard deviation?

There is no absolute threshold. A standard deviation can be arbitrarily close to zero, but it has no upper limit, so whether it is "high" is judged relative to the data. If the standard deviation is much smaller than the mean, the spread is low relative to the typical value; if it is many times the mean, the spread is high relative to it. But this is somewhat subjective and depends on the problem at hand.

