Q: Why is the sample standard deviation divided by n-1 in business statistics?

Best Answer

The purpose of computing the sample standard deviation is to estimate the amount of spread in the population from which the sample is drawn. Ideally, therefore, we would compute deviations from the mean of all the items in the population rather than from the sample mean. However, the population mean is generally unknown, so the sample mean is used in its place.

It is a mathematical fact that the deviations around the sample mean tend to be a bit smaller than the deviations around the population mean, because the sample mean is, by construction, the value that minimizes the sum of squared deviations in the sample. Dividing by n-1 rather than n provides exactly the right amount of correction: it makes the sample variance an unbiased estimate of the population variance.
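
A quick simulation makes this concrete. The sketch below (plain Python, standard library only; the population, sample size, and trial count are chosen purely for illustration) repeatedly draws small samples from a population with known variance and compares the average variance estimate obtained by dividing by n against dividing by n-1.

```python
import random

random.seed(42)

TRUE_VARIANCE = 1.0  # variance of a standard normal population
n = 5                # small sample size, where the bias is most visible
trials = 100_000

sum_div_n = 0.0
sum_div_n_minus_1 = 0.0

for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]
    mean = sum(sample) / n
    ss = sum((x - mean) ** 2 for x in sample)  # sum of squared deviations
    sum_div_n += ss / n                # biased: divides by n
    sum_div_n_minus_1 += ss / (n - 1)  # Bessel's correction: divides by n-1

print(f"true variance:       {TRUE_VARIANCE:.3f}")
print(f"average of ss/n:     {sum_div_n / trials:.3f}")          # about 0.8, too small
print(f"average of ss/(n-1): {sum_div_n_minus_1 / trials:.3f}")  # about 1.0
```

With n = 5, dividing by n underestimates the true variance by a factor of roughly (n-1)/n = 0.8, which is exactly the bias the n-1 divisor removes.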

Related questions

Sample standard deviation?

Standard deviation in statistics measures how much the values deviate from the average (mean) value. The sample standard deviation is that same measure computed from a sample, i.e. from data collected from a smaller pool than the whole population.
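
As a quick illustration of the distinction, Python's standard library provides both versions; the data values below are made up:

```python
import statistics

data = [4, 8, 6, 5, 3, 7]  # hypothetical observations

print(statistics.pstdev(data))  # population SD: divides by n   (~1.708)
print(statistics.stdev(data))   # sample SD: divides by n-1     (~1.871)
```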


Are the sample mean and the sample standard deviation simple statistics?

Yes.


In statistics, what is SE?

SE is the standard error: the standard deviation divided by the square root of the sample size. It measures how precisely a sample statistic (such as the sample mean) estimates the corresponding population parameter.
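
A minimal sketch in Python (standard library only; the sample data is hypothetical):

```python
import math
import statistics

sample = [12.1, 9.8, 11.4, 10.6, 12.9, 10.2, 11.7, 9.5]  # made-up data

s = statistics.stdev(sample)     # sample standard deviation (n-1 divisor)
se = s / math.sqrt(len(sample))  # standard error of the mean

print(f"sample SD:  {s:.3f}")
print(f"SE of mean: {se:.3f}")
```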


What does s-hat mean in statistics?

The hat denotes an estimate: s-hat is an estimate of the standard deviation, usually the sample standard deviation s used as an estimate of the population standard deviation.


What does the s stand for in statistics?

Usually s means standard deviation of a sample.


What is the value of the standard error of the sample mean?

The sample standard deviation (s) divided by the square root of the number of observations in the sample (n).


Why is standard deviation of a statistic called standard error?

The standard error of a statistic is its standard deviation over repeated samples; for the sample mean, it is the standard deviation divided by the square root of the sample size. It is called an "error" because it quantifies the typical error made when the sample statistic is used as an estimate of the population parameter.


Can a standard deviation of a sample be less than a standard deviation of a population?

Sure it can. But in the survey business, the trick is to select your sample carefully so that the two are close, i.e. so that the sample is accurately representative of the population.


What does S mean in statistics?

In statistics, s usually denotes the standard deviation of a sample. There is also a statistical programming language called S (with the commercial implementation S-Plus); see Modern Applied Statistics with S-Plus by Venables and Ripley.


What is the sample standard deviation of 27.5?

A single observation cannot have a sample standard deviation: with the n-1 divisor, the formula would require dividing by zero.


How does a sample size impact the standard deviation?

If I take 10 items (a small sample) from a population and calculate the standard deviation, then take 100 items (a larger sample) and calculate it again, how will the statistics change? The smaller sample could have a higher, lower, or roughly equal standard deviation compared with the larger sample. It is even possible for the smaller sample, by chance, to land closer to the population standard deviation.

However, a properly taken larger sample will in general give a more reliable estimate of the population standard deviation than a smaller one; there are mathematical results showing that, in the long run, larger samples provide better estimates (see the simulation sketch below). This is generally but not always true: if the population is changing while you are collecting data, a very large sample may not be representative, because it takes time to collect.
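
A minimal sketch of that long-run behavior in Python (standard library only; the population, sample sizes, and trial count are chosen just for illustration) draws many samples of size 10 and of size 100 from the same population and compares how much the sample standard deviations themselves scatter around the true value:

```python
import random
import statistics

random.seed(0)

TRUE_SD = 1.0    # standard deviation of a standard normal population
trials = 10_000

def sd_estimates(n):
    """Sample standard deviations from `trials` samples of size n."""
    return [statistics.stdev([random.gauss(0, 1) for _ in range(n)])
            for _ in range(trials)]

for n in (10, 100):
    ests = sd_estimates(n)
    print(f"n={n:3d}: mean estimate {statistics.mean(ests):.3f}, "
          f"spread of estimates {statistics.stdev(ests):.3f}")
# The n=100 estimates cluster far more tightly around the true SD of 1.0.
```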


What is the standard error if the population standard deviation is 100 and the sample size is 25?

The formula for the standard error of the mean (SEM) is the standard deviation divided by the square root of the sample size, or s/sqrt(n). Here, SEM = 100/sqrt(25) = 100/5 = 20.
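
The same arithmetic as a quick check in Python:

```python
import math

sem = 100 / math.sqrt(25)
print(sem)  # 20.0
```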