Applying a Growth Rate
Let me back up from my last post and explain this in a different way.
Here is an example of how money grows: Let's assume that you invest a dollar at 10% for three years. Determining what you have at the end of the three years is a matter of simple multiplication:
$1 × (1 + .10) × (1 + .10) × (1 + .10) ≈ $1.33
Why the (1 + .10) in the formula? Because each year that you earn 10%, you end the year with 110% of the money you had at the beginning of the year. At the beginning of the first year, you had a dollar, but at the end of the first year, you had $1.10. At the beginning of the second year you had $1.10, but at the end of the year you had $1.21. At the beginning of the third year you had $1.21, and it earned 10%, so you had $1.33 by the end of that third year.
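If you like seeing the arithmetic spelled out, here is a short Python sketch of the same year-by-year calculation. The numbers (a $1 starting balance, a 10% rate, three years) come straight from the example above; the variable names are just my own labels.

```python
# Compound growth, year by year: each year's ending balance is the
# starting balance multiplied by (1 + rate).
principal = 1.00   # starting dollar
rate = 0.10        # 10% annual growth
years = 3

balance = principal
for year in range(1, years + 1):
    balance *= 1 + rate
    print(f"End of year {year}: ${balance:.2f}")
# End of year 1: $1.10
# End of year 2: $1.21
# End of year 3: $1.33

# The same result in one step, using the shorthand (1 + rate) raised
# to the number of years:
print(f"Direct formula: ${principal * (1 + rate) ** years:.2f}")
```

The loop and the one-line formula give the same answer, which is why the multiplication in the formula above can be written as (1 + .10) three times in a row.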
Got it?