Q: When do you use the relative standard deviation instead of standard deviation?

Best Answer

Use %RSD (the relative standard deviation: the standard deviation expressed as a percentage of the mean) when comparing the spread of populations with different means, because it is unit-free and puts them on a common scale. Use the plain SD to compare data sets whose means are similar.
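
For example, here is a minimal sketch in Python (the sample values are invented for illustration): the two data sets have very different means, so their raw SDs are not directly comparable, but their %RSDs are.

```python
from statistics import mean, stdev

# Two illustrative samples with very different means.
weights_mg = [4.9, 5.1, 5.0, 4.8, 5.2]          # mean ~ 5 mg
weights_kg = [98.0, 102.0, 100.0, 96.0, 104.0]  # mean ~ 100 kg

for data in (weights_mg, weights_kg):
    m, s = mean(data), stdev(data)   # sample mean and sample SD
    rsd = 100 * s / m                # %RSD, also called the coefficient of variation
    print(f"mean={m:.2f}  SD={s:.3f}  %RSD={rsd:.1f}%")
```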

Continue Learning about Statistics

Variance and standard deviation are one and the same thing?

No, but they are related. If a sample of size n is taken, a standard deviation can be calculated. It is usually denoted "s", though some textbooks use the symbol sigma. The standard deviation of a sample is usually used to estimate the standard deviation of the population; in that case we use n-1 in the denominator of the equation. The variance of the sample is the square of the sample's standard deviation; in many textbooks it is denoted s². For populations, the symbols σ and σ² should be used. One last note: we use the standard deviation when describing uncertainty because it is easier to interpret. If our measurements are in days, the standard deviation will also be in days, whereas the variance will be in days².
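
As a quick check, here is a short Python sketch (sample values invented) confirming that the sample variance is the square of the sample standard deviation, with both computed using the n-1 denominator:

```python
from statistics import stdev, variance

days = [3, 7, 7, 19]       # measurements in days

s = stdev(days)            # sample standard deviation, in days
var = variance(days)       # sample variance, in days squared

print(s, var, s ** 2)      # the variance equals the SD squared
assert abs(var - s ** 2) < 1e-9
```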


Are standard deviation and mean used for ratio data?

Yes.


What is the difference between a normal distribution and a standard normal distribution?

When you don't have information about the true mean of a population and use an estimate of the mean instead, you usually use the t distribution rather than the normal distribution. * * * * * Interesting, but nothing to do with the question! If a random variable X is distributed Normally with mean m and standard deviation s, then Z = (X - m)/s has a standard Normal distribution. Z has mean 0 and standard deviation 1 (and therefore variance 1).
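
A small Python sketch of that standardization (the mean and SD values here are arbitrary), using the statistics module's NormalDist to confirm that X and its Z-score give the same probability:

```python
from statistics import NormalDist

# X ~ Normal(mean=m, sd=s); Z = (X - m)/s is standard Normal.
m, s = 70.0, 8.0
x = 82.0

z = (x - m) / s                   # standardized score
print(z)                          # 1.5

# The probabilities agree: P(X <= x) == P(Z <= z).
print(NormalDist(m, s).cdf(x))
print(NormalDist(0, 1).cdf(z))    # same value as the line above
```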


When to use z or t-distribution?

If the sample size is large (>30) or the population standard deviation is known, we use the z-distribution. If the sample size is small and the population standard deviation is unknown, we use the t-distribution.
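
As an illustration, here is a sketch of that decision rule applied to a confidence interval for a mean. It assumes SciPy is available; the function name ci_mean and its inputs are hypothetical:

```python
from math import sqrt
from scipy.stats import norm, t

def ci_mean(xbar, s, n, confidence=0.95, sigma_known=False):
    """Confidence interval for a mean, picking z or t by the usual rule."""
    alpha = 1 - confidence
    if sigma_known or n > 30:
        crit = norm.ppf(1 - alpha / 2)         # z critical value
    else:
        crit = t.ppf(1 - alpha / 2, df=n - 1)  # t critical value, n-1 df
    half_width = crit * s / sqrt(n)
    return xbar - half_width, xbar + half_width

print(ci_mean(xbar=50.0, s=4.0, n=16))   # small n, sigma unknown -> t
print(ci_mean(xbar=50.0, s=4.0, n=100))  # large n -> z
```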


When we know the population mean but not the population standard deviation which statistic do we use to compare a sample to the population?

The t statistic, which uses the sample standard error s/√n (based on the sample standard deviation s) in place of the unknown population value: t = (x̄ − μ) / (s/√n).
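
A minimal sketch of that statistic in Python (the sample data and population mean are invented):

```python
from math import sqrt
from statistics import mean, stdev

mu = 100.0                               # known population mean
sample = [104, 98, 107, 101, 96, 103]

xbar = mean(sample)
se = stdev(sample) / sqrt(len(sample))   # sample standard error
t_stat = (xbar - mu) / se                # compare against t with n-1 df
print(t_stat)
```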

Related questions

Why use standard deviation and not average deviation?

Because the average of the deviations about the mean will always be zero, so it carries no information about spread.
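
A quick numeric demonstration in Python (data invented): the deviations about the mean cancel exactly, while their squares do not, which is why squared deviations feed the standard deviation instead.

```python
from statistics import mean

data = [2, 4, 4, 4, 5, 5, 7, 9]
m = mean(data)

deviations = [x - m for x in data]
print(sum(deviations))                  # 0: useless as a spread measure
print(sum(d ** 2 for d in deviations))  # 32: squared deviations don't cancel
```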


What is relative measure?

These measures are calculated for comparing the dispersion of two or more sets of observations. They are free of the units in which the original data are measured: if the original data are in dollars or kilometers, we do not attach those units to a relative measure of dispersion. These measures are a sort of ratio and are called coefficients. Each absolute measure of dispersion can be converted into its relative measure. The relative measures of dispersion are:
Coefficient of Range (Coefficient of Dispersion)
Coefficient of Quartile Deviation (Quartile Coefficient of Dispersion)
Coefficient of Mean Deviation (Mean Deviation Coefficient of Dispersion)
Coefficient of Standard Deviation (Standard Coefficient of Dispersion)
Coefficient of Variation (a special case of the Standard Coefficient of Dispersion)
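
For illustration, a Python sketch (data invented) computing three of these coefficients; note that each result is a pure ratio with no units:

```python
from statistics import mean, stdev, quantiles

data = [12, 15, 18, 20, 22, 25, 30, 34]

# Coefficient of range: (max - min) / (max + min)
coef_range = (max(data) - min(data)) / (max(data) + min(data))

# Coefficient of quartile deviation: (Q3 - Q1) / (Q3 + Q1)
q1, _, q3 = quantiles(data, n=4)
coef_quartile = (q3 - q1) / (q3 + q1)

# Coefficient of variation: SD / mean
cv = stdev(data) / mean(data)

print(coef_range, coef_quartile, cv)
```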


Why use the T score?

T-score is used when you don't have the population standard deviation and must use the sample standard deviation as a substitute.


What is the use of coefficient of deviation?

It is the relative measure of the mean deviation: the mean deviation divided by the average about which it is calculated, i.e. usually the arithmetic mean. Being a pure ratio, it allows comparison across data sets measured in different units.


How do you use standard deviation?

Standard deviation is a measure of how spread out a set of numbers is about its mean. It has a variety of uses in statistics.


Why does the effect-size calculation use standard deviation rather than standard error?

The goal is to disregard the influence of sample size. When calculating Cohen's d, we use the standard deviation in the denominator, not the standard error: the standard error shrinks as the sample grows, which would inflate the apparent effect size for large samples.
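
A sketch of the calculation in Python (the group values are invented), using the pooled standard deviation rather than a standard error in the denominator:

```python
from math import sqrt
from statistics import mean, stdev

group1 = [20, 22, 19, 24, 25, 21]
group2 = [28, 27, 30, 26, 29, 31]

n1, n2 = len(group1), len(group2)
s1, s2 = stdev(group1), stdev(group2)

# Pooled standard deviation -- NOT the standard error, so the
# effect size does not shrink as the sample size grows.
s_pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))

d = (mean(group2) - mean(group1)) / s_pooled
print(d)
```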


How do you calculate standard deviation without a normal distribution?

You calculate the standard deviation the same way as always: find the mean, sum the squares of the deviations of the observations from that mean, divide by N-1, and take the square root. This has nothing to do with whether you have a normal distribution or not. That recipe gives the sample standard deviation, where the mean is estimated along with the standard deviation; the N-1 factor reflects the degree of freedom lost in doing so. If you knew the mean a priori, you could calculate the standard deviation of the sample using N instead of N-1.
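
That recipe translates directly into code; here is a minimal Python version (the helper names are mine):

```python
from math import sqrt

def sample_sd(xs):
    """Sample SD: mean, squared deviations, divide by N-1, square root."""
    n = len(xs)
    m = sum(xs) / n
    return sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))

def sd_known_mean(xs, mu):
    """With the mean known a priori, no degree of freedom is lost: divide by N."""
    return sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))

print(sample_sd([2, 4, 4, 4, 5, 5, 7, 9]))
```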


How do you calculate sample standard deviation?

Here's how you do it in Excel: use the function =STDEV(<range with data>). That function calculates the standard deviation for a sample. (In newer versions of Excel it is called STDEV.S; STDEV.P is the population version.)


How do you do standard deviation on Microsoft Excel?

Use the STDEV() function.

