We all know what the average of a bunch of numbers is: you add up the numbers, divide the total by how many numbers there are, and get something in the middle. This sort of average assumes that all the numbers are equally important. In many situations they are not, and this is why we have weighted averages — a weight is an extra number that says how important each value is within the group you are averaging.
The simplest example is when the weights are positive and add to one. This is something that often happens with grades. The teacher might say that “The final grade is 50% of your homework grade plus 30% of your quiz grade plus 20% of your final exam grade”. In this case the weights are 0.5, 0.3, and 0.2. If a student had 86% on homework, 72% on quizzes, and 91% on the final exam, then their final grade would be 0.5×86% + 0.3×72% + 0.2×91% = 82.8%.
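The grade computation above can be written as a couple of lines of code:

```python
# Weighted average with weights that sum to one, using the grade
# example from the text: 50% homework, 30% quizzes, 20% final exam.
grades = {"homework": 86.0, "quizzes": 72.0, "final exam": 91.0}
weights = {"homework": 0.5, "quizzes": 0.3, "final exam": 0.2}

# Multiply each grade by its weight and add the results.
final_grade = sum(weights[part] * grades[part] for part in grades)
print(round(final_grade, 1))  # 82.8
```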
A very common application of weighted averages is making decisions in business. Suppose that you are in charge of allocating your firm’s budget. You also have a model that tells you how a given set of budget numbers affects sales on three different product lines, expansion plans, employee retention, advertising, and customer retention. A weighted average of the impact on each of these outcomes gives you a single number that tells you how good a budget is. This single-number summary can be used to compare and adjust budgets. You could even feed the weights into optimization code to get the best budget, or a set of possible trade-offs based on different sets of weights.
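Here is a minimal sketch of scoring budgets this way. The outcome names, scores, and weights are all made up for illustration — in practice they would come from your firm’s model and priorities:

```python
# Hedged sketch: compare candidate budgets with a weighted average of
# modeled outcome scores. All names and numbers here are hypothetical.
def budget_score(outcomes, weights):
    """Weighted average of outcome scores; weights need not sum to one."""
    total_weight = sum(weights.values())
    return sum(weights[k] * outcomes[k] for k in weights) / total_weight

# Relative importance of each outcome (hypothetical).
weights = {"sales": 5, "expansion": 2, "retention": 3}

# Modeled impact of two candidate budgets on each outcome (hypothetical).
budget_a = {"sales": 70, "expansion": 40, "retention": 90}
budget_b = {"sales": 85, "expansion": 60, "retention": 30}

# A higher score means a better budget under these weights.
print(budget_score(budget_a, weights))  # 70.0
print(budget_score(budget_b, weights))  # 63.5
```

Changing the weights changes which budget wins, which is exactly the point: the weights encode the firm’s priorities.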
The weights represent your (or your firm’s) estimates of the relative importance of all the different outcomes. The weights are also a great place to get your supervisor’s input, granting them ownership of the results. This lets your supervisor participate in the process and, if the individual is not all that quantitative, gives you a simple venue to engage their participation. Occupy Math apologizes for the implicit cynicism.
What if the weights don’t add to one?
Suppose that we want to compute the weighted average of 1, 5, and 3, where 5 is twice as important as the other two numbers. Using weights 1, 2, and 1, the computation looks like this:

(1×1 + 2×5 + 1×3) / (1 + 2 + 1) = 14/4 = 3.5
You multiply each number by its weight (importance), add up the results, and then divide by the sum of the weights. The weights all have to be positive, but the trick of dividing by the sum of the weights means you don’t need to make sure the weights add to one. Of course, you can divide each weight, individually, by the sum of the weights to get a set of weights that do add to one. That would make the weights of 1 and 3 equal to 0.25 and the weight for 5 equal to 0.5.
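This recipe — multiply, add, divide by the total weight — is a one-liner in code:

```python
# Weighted average where the weights need not sum to one: divide by
# the sum of the weights instead.
def weighted_average(values, weights):
    total = sum(w * v for w, v in zip(weights, values))
    return total / sum(weights)

# The example from the text: 5 is twice as important as 1 and 3.
print(weighted_average([1, 5, 3], [1, 2, 1]))  # 3.5

# Normalizing the weights to sum to one gives the same answer.
print(weighted_average([1, 5, 3], [0.25, 0.5, 0.25]))  # 3.5
```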
What else can weighted averages do?
One answer is image filtering. The pixels that make up an image are a grid of color specifications. Each pixel has an intensity value for red, green, and blue; in the picture below these go from 0 to 255. If we take a weighted average of the red, green, and blue color intensities at a pixel and the other eight pixels that form a 3×3 window, using these weights:
then that can smooth out the image. We multiply the color values at each pixel by the corresponding weights in the grid, divide by 20 (the total of the weights), and use the result as the new set of red, green, and blue color intensities for the pixel at the center of the grid in the filtered version of the image. You apply the weighted average to every pixel in the picture. Here is an example of applying this sort of weighted average filter:
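A sketch of the filter on a single grayscale channel looks like this. The exact weight grid from the post is not reproduced here; the kernel below is an assumption chosen so the weights total 20, as stated in the text:

```python
# Hedged sketch of a 3x3 weighted-average smoothing filter on one
# grayscale channel. This particular kernel is an assumption -- its
# weights sum to 20, matching the divisor described in the text.
KERNEL = [[1, 2, 1],
          [2, 8, 2],
          [1, 2, 1]]

def smooth(image):
    """Return a new image in which each interior pixel is the weighted
    average of its 3x3 neighborhood; border pixels are left unchanged."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            total = sum(KERNEL[i][j] * image[r - 1 + i][c - 1 + j]
                        for i in range(3) for j in range(3))
            out[r][c] = total // 20  # divide by the sum of the weights
    return out

# A single bright pixel gets spread out over its neighbors.
spike = [[0] * 5 for _ in range(5)]
spike[2][2] = 255
print(smooth(spike))
```

A full-color filter would apply the same averaging to each of the red, green, and blue channels separately.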
The left snowflake is crisper, the right one is fluffier — the weighted average fluffed it up. The image is a Sierpinski fractal and the filter smoothed it out quite a bit. By changing the weights of the average you can control the degree of smoothing.
You can apply weighted averages to filter sound. The picture below uses a filter — on groups of five intensity values in a row — with weights 1, 3, 8, 3, 1, in the same sort of way that the image filter worked. Look what happens when we apply this weighted average filter to a very noisy sound track five times.
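The five-tap filter can be sketched like this. How the ends of the track are handled is an assumption — here, samples near the ends keep only the weights that fit, and the divisor shrinks to match:

```python
# Sketch of the five-tap smoothing filter from the text, with weights
# 1, 3, 8, 3, 1 centered on each sample. Edge handling is an
# assumption: partial windows divide by the weights actually used.
WEIGHTS = [1, 3, 8, 3, 1]

def smooth_once(signal):
    out = []
    n = len(signal)
    for i in range(n):
        total, weight_sum = 0, 0
        for k, w in enumerate(WEIGHTS):
            j = i + k - 2  # center the window on sample i
            if 0 <= j < n:
                total += w * signal[j]
                weight_sum += w
        out.append(total / weight_sum)
    return out

def smooth(signal, passes=5):
    # Applying the filter repeatedly smooths the signal further.
    for _ in range(passes):
        signal = smooth_once(signal)
    return signal

# A wildly alternating "noisy" signal flattens toward its average.
noisy = [10, 0, 10, 0, 10, 0, 10, 0, 10, 0]
print(smooth(noisy))
```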
The weighted average can bring a fairly structured signal out of the noise. The crisp detail in the pre-averaging snowflake corresponds to little picky details in the sound; many of these little picky details are things you might not want. Weighted averages are good at removing hiss and pop from old audio tracks — but they also remove some of the crispness from the music.
Morphing is a type of moving average
Fractals are defined by numerical parameters. The fractals below are defined by a list of six numbers. To make the animation, morphing is used. We start with the numbers that define one fractal and then take weighted averages of parameters for that fractal and another with the weights going 100:0 95:5 90:10 … 5:95 0:100. Since the fractals change continuously with their defining parameters, one fractal just changes into another. The frames of the animation below all use parameters that are weighted averages of the parameters that define the fractals in the first and last frame.
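The weight schedule 100:0, 95:5, …, 0:100 amounts to sliding a single weight from 0 to 1. The six parameter values below are made up for illustration — the real ones define the fractals in the animation:

```python
# Morphing as a weighted average: each frame's parameters are a
# weighted average of the start and end parameter lists, with weights
# stepping 100:0, 95:5, ..., 0:100. The parameter values here are
# hypothetical stand-ins for the six numbers that define a fractal.
start = [0.1, 0.8, 0.3, 0.5, 0.9, 0.2]
end = [0.7, 0.2, 0.6, 0.1, 0.4, 0.8]

def morph_frames(a, b, steps=21):
    frames = []
    for s in range(steps):
        t = s / (steps - 1)  # weight on the end parameters: 0, 0.05, ..., 1
        frames.append([(1 - t) * x + t * y for x, y in zip(a, b)])
    return frames

frames = morph_frames(start, end)
print(frames[0])   # identical to start
print(frames[-1])  # identical to end
```

Using more steps (smaller weight changes per frame) gives a smoother animation, at the cost of more frames to render and download.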
The animation is a little jerky; it could be made much smoother by changing the weights more slowly, but Occupy Math wanted to avoid a fractal that takes ten minutes to download.
Weighted averages are a whole family of averages.
You choose which member of the family of weighted averages you are computing when you choose the weights. The “normal” average is the one with all the weights equal. You can probably think of other examples of weighted averages. If you have ever participated in a quiz bowl, harder questions are worth more points. While it’s not explicit, this means that each team’s score is a weighted sum of the questions they got correct: harder problems get higher weights.
The business decision-making example at the beginning of the post is another way of using weighted averages that comes up pretty often. There, you might want to back off of the simplicity a little. Instead of choosing any weights at all, you might want to set a minimum value so that factors like employee retention don’t drop off the list. If you’re using an algorithm to optimize your budget and, as you feed it different sets of weights, it turns out that ignoring employee retention is the best option, then, first, your model probably doesn’t account for what will happen next year and, second, you might look dumb to someone above you at work.
Weighted averages are averages with more controls. Note also that all the averages in this post are arithmetic averages. It turns out that there are lots of different types of averages — geometric, harmonic, and others — and they too have weighted versions. Occupy Math will leave these for a future post. Do you have a mathy topic you would like explained? Please comment or tweet!
I hope to see you here again,
University of Guelph,
Department of Mathematics and Statistics