Why is the average of $n$ numbers given by $(a+b+c+\cdots)/n$? I deduced the formula for the average of 2 numbers, which was easy because it's also the midpoint, but I couldn't do it for more than 2 numbers.
-
See here: https://en.m.wikipedia.org/wiki/Average – Dec 22 '16 at 04:05
-
Place all of your numbers as points on the number line. The average is the point where you would place a fulcrum to balance the points. – John Douma Dec 22 '16 at 04:06
-
Another thought (apologies if this restates ideas in some of the comments/answers): Over $n$ days, if we get paid each day according to varying amounts $a_1, a_2, \ldots, a_n$, then at the end we have the same amount of money as we would if we had been paid the same amount $(a_1+\cdots+a_n)/n$ on each day. So it makes sense to say that $(a_1+\cdots+a_n)/n$ represents the average pay per day. – Michael Dec 22 '16 at 07:34
-
Maybe it's just me, but isn't the average (or better, the arithmetic mean) DEFINED as $(a+b+c+\cdots)/n$? It's not a discovery, nor a theorem to be demonstrated. It's a definition. It's almost the same as asking "why does light in a vacuum travel at c (which is the speed of light in a vacuum)?". Well, because c IS the speed of light in a vacuum... – frarugi87 Dec 22 '16 at 10:48
-
@frarugi87: Not really; or at least, I'd say that's the average, but not the arithmetic mean. They might be synonyms in your middle-school textbook, but their equality doesn't make sense if you think about what they should intuitively mean (no pun intended). The real definition of the arithmetic mean is that it is the value with the minimum total squared distance to all the elements of the given set. I actually find it shocking that this value (whose primary operation is subtraction, i.e. distance) happens to be given by such a simple formula (whose primary operation is addition). – user541686 Dec 22 '16 at 11:54
-
@Mehrdad thank you for your reply, and thanks to you I spent a couple of minutes doing some calculations to demonstrate that the addition formula indeed IS the one that minimizes the total squared distance ;-) I haven't done this for years, so thank you ;-) anyway I searched for the definition of arithmetic mean, but I could only find sites that stated that it was defined as sum of all the items divided by the number of them. Can you point me towards the definition you wrote? – frarugi87 Dec 22 '16 at 15:30
-
@frarugi87 It's just you. "Why is X defined this way?" should not be answered with "Because it is. Next question." Even worse, "average" pretty clearly has an a priori meaning that one would hope has something to do with the "definition," which is not really a definition since "average" already means something, so much as it is a formalization, or a definition with respect to the language of set theory, or something. – djechlin Dec 22 '16 at 15:49
-
@djechlin I don't agree very much with you. If I tell you "let the derivative of a function be the limit with h tending ...." you can't ask me "why is the derivative the limit...". You can ask me "why is the derivative of x equal to 1," but you can't ask why it is expressed by that expression because, well, it's a definition. I read that the definition of the arithmetic mean (usually called the average) is sum/n. Now, Mehrdad said that the definition of the average is not that one, ok, but again it's a definition. Saying that the word "average" already means something outside the mathematical world is useless. – frarugi87 Dec 22 '16 at 16:39
-
@frarugi87 Yes, you can. People do it all the time. Calc students ask why the derivative is defined that way and the teacher answers. The student then learns something. Are you trying to claim this never happens, or that these questions should not be asked? Which isn't to get into your very wrong philosophical and ahistorical views of the role of definitions in mathematics. – djechlin Dec 22 '16 at 16:43
-
@djechlin Well, that's not what they taught me. A definition is just a "label" you attach on something. Like saying "why is a cat called 'cat'". It's, in my opinion, meaningless. It's its name. The valid questions are "why does that formula have this or that property". That is totally legit. You can also ask "why is that formula useful", or from an historical perspective "how did they get there", but asking "why did they call it that way" is meaningless, in my opinion. What is a feasible answer, in your opinion, to "why is the derivative defined that way?"? – frarugi87 Dec 22 '16 at 16:58
-
@frarugi87 because you want some way to talk about tangents in a geometric sense, or instantaneous change in a physical one, or what happens at the margins of a system (why didn't Apple charge $1 for the iPhone, or produce 1 more?) in a physical or economic sense. Like if I give you a graph of how far your car has gone at time t, you ought to be able to tell me the speed you were going at time t. IMO that is a great answer to the question. Technically it is a historical answer and involves the human act of coming up with the right label for the right concept . . . – djechlin Dec 22 '16 at 17:37
-
. . . of course the label "derivative" relates to the word "change" for whoever came up with this. But I find this type of historical perspective so interconnected with the mathematics itself, and so vital for building intuition and seeing mathematics as anything but definition-and-axiom pushing (of course in computability theory you view it exactly this way...) that I would just call it "mathematics" instead of "history of mathematics." Although you may choose to draw the line differently, I think this is pure philosophy and does not describe how people learn, intuit, etc. – djechlin Dec 22 '16 at 17:39
-
@frarugi87: I wouldn't say it's the more common view -- what you take as the "definition" is subjective, and different authors take different views -- but here is one example of someone who defines it the way I mentioned. I personally find this far more enlightening than the usual definition (because it also explains why the arithmetic mean is sensitive to outliers), so I consider this to be the real definition. Note that the point really isn't the terminology but the idea. If you want them to be synonyms, then OK, but then you miss this. – user541686 Dec 22 '16 at 18:51
-
See my answer at http://math.stackexchange.com/a/922044/124095. – mweiss Dec 23 '16 at 03:52
4 Answers
Suppose all of us gathered here in this room take all the money out of our pockets and put it on the table, and then we divide it among us in such a way that we all have the same amount. The total amount is still the same. Then the amount we each have is the average. That's what averages are.
The total amount is $a+b+c+\cdots$. The number of us gathered here is $n$. So the amount that each of us gets is $\text{total}/n = (a+b+c+\cdots)/n$.
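As a quick sanity check of this pooling picture, here is a minimal sketch in Python (the dollar amounts are made up):

```python
pockets = [20.0, 35.0, 50.0, 5.0]   # what each of us puts on the table (made up)
total = sum(pockets)                 # pool everything: a + b + c + ...
share = total / len(pockets)         # split the pool evenly among the n of us
print(share)                         # 27.5 -- each person's equal share, the average
```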
-
I like how this example can illustrate skewing. If in this scheme one of us in the room began with a million dollars, and the rest of us between 20 and 50 dollars, then all but one of us is going to be very happy afterwards. – cobaltduck Dec 22 '16 at 19:31
In most contexts, what passes for an 'average' can be thought of this way: if you replaced a collection of separate instances with their 'average', you would get the same result.
The usual mean comes from thinking this way for addition: if you have numbers $a_1,\ldots,a_n$, their sum is $a_1+\cdots+a_n$. If you replaced all of them with their mean $\mu$, you should also get $a_1+\cdots+a_n$.
Therefore $\mu$ must satisfy $$ n\mu=a_1+\cdots+a_n, $$ leading to the formula you've seen.
As another example: doing the same thing but for multiplication leads to the geometric mean $\sqrt[n]{a_1a_2\cdots a_n}$.
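Here is that replacement idea as a small Python check (the example values are invented): substituting the arithmetic mean for every element preserves the sum, and substituting the geometric mean preserves the product.

```python
import math

a = [2.0, 8.0, 9.0]                    # made-up values

arith = sum(a) / len(a)                # arithmetic mean
geom = math.prod(a) ** (1 / len(a))    # geometric mean

# Replacing every element by the mean leaves the aggregate unchanged:
assert math.isclose(arith * len(a), sum(a))        # n * mu == a1 + ... + an
assert math.isclose(geom ** len(a), math.prod(a))  # g^n == a1 * ... * an
```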

-
By "same result", if you mean "same sum", then this is right. "Sum" is certainly a more precise term in this case. – Michael Hardy Dec 22 '16 at 06:13
-
I was making a point that averages can be defined in lots of different contexts, and that the specific definition depends on the context. In the context of addition, the 'average' is the arithmetic mean. In the context of multiplication, it is the geometric mean. Etc. – Nick Peterson Dec 22 '16 at 06:25
-
$\ldots$, and in some contexts, "average" means the median, or some other trimmed mean. – Michael Hardy Dec 22 '16 at 18:23
Here is a slightly different perspective on what Nick and Michael have already said: the average of $n$ numbers $x_i$ is the unique number $\mu$ such that the sum of the deviations $x_i-\mu$ is zero.
Starting from this characteristic property it is easy to derive the formula
$$\mu=\frac{1}{n}\sum x_i$$
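Spelled out, the derivation from the zero-deviation property takes one line:
$$\sum_{i=1}^n (x_i-\mu)=0 \quad\Longleftrightarrow\quad \sum_{i=1}^n x_i = n\mu \quad\Longleftrightarrow\quad \mu=\frac{1}{n}\sum_{i=1}^n x_i$$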
A closely related characterization comes from statistics. Suppose we want to find the "number of best fit" for our data points $x_i$. To find this number, we first need to say what counts as "best."
One popular choice is to measure the "error" of our best-fit "approximation" using a quadratic "cost function." More formally, finding "the number of best fit" amounts to finding the number $m$ that minimizes the sum of the squared errors
$$SSE=\sum (x_i-m)^2$$
If you know any calculus (or simple multivariable geometry) you can easily prove that this function is minimized precisely when $m$ is the average of the $x_i$. In this sense, the average is the minimizer of squared errors.
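For instance, differentiating the sum with respect to $m$ and setting the result to zero gives
$$\frac{d}{dm}\sum_i (x_i-m)^2 = -2\sum_i (x_i-m) = 0 \quad\Longrightarrow\quad m=\frac{1}{n}\sum_i x_i,$$
and since the second derivative is $2n>0$, this critical point is indeed the minimum.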
If instead of measuring error by the sum of the quadratic deviations $(x_i-m)^2$ we use the sum of absolute deviations $|x_i-m|$, the minimizer is the median rather than the average.
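If you'd rather check this numerically than prove it, here is a minimal Python sketch (the data values are invented) that brute-forces both objectives over a fine grid:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # made-up sample data
m = np.linspace(x.min(), x.max(), 100001)  # candidate values of m

sse = ((x[:, None] - m) ** 2).sum(axis=0)  # sum of squared deviations per candidate
sad = np.abs(x[:, None] - m).sum(axis=0)   # sum of absolute deviations per candidate

print(m[sse.argmin()], x.mean())       # both 5.0: the mean minimizes squared error
print(m[sad.argmin()], np.median(x))   # both 4.0: the median minimizes absolute error
```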
In fact, other types of means (the geometric mean, the harmonic mean, etc.) can be understood using this same framework. See the Wikipedia page on Fréchet means.

-
To me, anything that requires math above Algebra II isn't "intuitive"... – Feathercrown Dec 22 '16 at 17:37
-
@Feathercrown: the word you're looking for is elementary, not intuitive. – symplectomorphic Dec 22 '16 at 19:40
-
If Bill Gates walked into a crowded bar, then, on average, everyone in it would be a millionaire.
Loosely, an average is supposed to be a representative value for a sample. Sort of. But as you can see, it needn't always be the case.
But the average is always this: if what we collectively have is distributed equally among all of us, the average is what each of us would get.
What we collectively have: $a_1+a_2+\cdots+a_n$
How many of us are there: $n$
What each one would get: $\frac{\text{total}}{\text{number of people}} = \frac{a_1+a_2+\cdots+a_n}{n} = \textbf{average}$
And that's why everyone becomes a millionaire. On average. Bill Gates simply has that much. The moral: outliers can sometimes skew the average and make it unreliable. Other times, everyone really does have about that much.
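To put rough numbers on the bar story, here is a minimal Python sketch (the figures are invented for illustration; the 100-billion-dollar figure is not an actual net worth):

```python
# A hypothetical bar: 50 patrons with modest pocket money, plus one huge outlier.
pockets = [30.0] * 50 + [100e9]  # the 100-billion figure is illustrative only

average = sum(pockets) / len(pockets)
median = sorted(pockets)[len(pockets) // 2]  # 51 values, so the middle one

print(f"average: ${average:,.0f}")  # about $1,960,784,343 -- 'everyone is a millionaire'
print(f"median:  ${median:,.0f}")   # $30 -- far more representative here
```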
PS: Call it the arithmetic mean instead of the average. Also, read the answer by @symplectomorphic; it has an interesting (and often very useful) take on how to think of the arithmetic mean.
