Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online.

In probability theory and statistics, the variance of a random variable or distribution is the expected, or mean, value of the squared deviation of that variable from its expected value or mean. The variance is thus a measure of the amount of variation among the values of that variable, taking account of all possible values and their probabilities or weightings (not just the extremes, which give the range). For example, a fair six-sided die, when thrown, has expected value (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5. Its expected absolute deviation is 1.5: the equally likely absolute deviations 3.5−1, 3.5−2, 3.5−3, 4−3.5, 5−3.5, and 6−3.5 equal 2.5, 1.5, 0.5, 0.5, 1.5, and 2.5, whose mean is 1.5. Its expected squared deviation, or variance, is 35/12 ≈ 2.92: the equally likely squared deviations 25/4, 9/4, 1/4, 1/4, 9/4, and 25/4 have mean 35/12.
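The die calculation above can be sketched in a few lines of Python; using exact fractions avoids floating-point rounding, so the results match the stated values 7/2, 3/2, and 35/12 exactly.

```python
from fractions import Fraction

# A fair six-sided die: faces 1..6, each with probability 1/6.
faces = range(1, 7)
p = Fraction(1, 6)

# Expected value (mean): sum of value times probability.
mean = sum(p * x for x in faces)                      # 7/2 = 3.5

# Expected absolute deviation: E[|X - mean|].
abs_dev = sum(p * abs(x - mean) for x in faces)       # 3/2 = 1.5

# Variance, the expected squared deviation: E[(X - mean)^2].
variance = sum(p * (x - mean) ** 2 for x in faces)    # 35/12, about 2.92

print(mean, abs_dev, variance)
```

Note that the variance (35/12 ≈ 2.92) exceeds the expected absolute deviation (1.5) because squaring weights the larger deviations more heavily.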