The reason the standard deviation of a distribution of means is smaller than the standard deviation of the population from which it was derived is actually quite logical. Keep in mind that standard deviation is the square root of variance, and variance is simply an expression of the variation among values in the population. Each mean within the distribution of means is computed from a sample of values taken randomly from the population. While it is possible for a random sample of multiple values to come entirely from one extreme of the population distribution, it is unlikely. Generally, each sample will contain some values from the lower end of the distribution, some from the higher end, and most from near the middle. In most cases, the extreme and middle values within each sample will balance out, so the sample's average falls somewhere toward the middle of the population distribution. The mean of each sample is therefore likely to be close to the mean of the population and unlikely to be extreme in either direction.

Because the majority of the means in a distribution of means fall closer to the population mean than many of the individual values in the population do, there is less variation among the means than among the individual values from which they were derived. Less variation means a lower variance, and thus the square root of that variance (the standard deviation of the distribution of means) is smaller than the standard deviation of the population. In fact, for random samples of size n, the standard deviation of the distribution of means equals the population standard deviation divided by the square root of n, so larger samples produce an even tighter distribution of means.
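This balancing-out effect is easy to see in a simulation. The short sketch below (using only Python's standard library; the population parameters and sample size are arbitrary choices for illustration) builds a population, draws many random samples, averages each one, and compares the spread of those sample means against the spread of the population itself:

```python
import random
import statistics

random.seed(0)

# A population of 100,000 values (normal, mean 50, SD 10 -- arbitrary choices).
population = [random.gauss(50, 10) for _ in range(100_000)]
pop_sd = statistics.pstdev(population)

# Build a distribution of means: many random samples of size n, one mean each.
n = 25
means = [statistics.mean(random.sample(population, n)) for _ in range(2_000)]
means_sd = statistics.pstdev(means)

# The spread of the means is much smaller than the spread of the population,
# and it closely tracks pop_sd / sqrt(n).
print(f"population SD:               {pop_sd:.2f}")
print(f"SD of distribution of means: {means_sd:.2f}")
print(f"population SD / sqrt(n):     {pop_sd / n ** 0.5:.2f}")
```

With a population standard deviation near 10 and samples of size 25, the standard deviation of the distribution of means comes out close to 10 / 5 = 2, just as the divide-by-square-root-of-n relationship predicts.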