The amount of money $A$ in an account with an interest rate $r$ compounded annually is given by
$$A=P(1+r)^{t}$$
where $P$ is the initial principal and $t$ is the number of years the money is invested.
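Each part below asks for $r$ given $A$, $P$, and $t$. A short rearrangement of the formula, added here only as a sketch of the solution path rather than part of the original statement, isolates the rate:
$$r=\left(\frac{A}{P}\right)^{1/t}-1$$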
a. If a 10,000-dollar investment grows to 11,664 dollars after 2 yr, find the interest rate.
b. If a 6000-dollar investment grows to 7392.60 dollars after 2 yr, find the interest rate.
c. Jamal wants to invest 5000 dollars. He wants the money to grow to at least 6500 dollars in 2 yr to cover the cost of his son's first year at college. What interest rate does Jamal need for his investment to grow to 6500 dollars in 2 yr? Round to the nearest hundredth of a percent.
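The same rearrangement can be checked numerically. The sketch below assumes Python purely as an illustration; the helper name `annual_rate` is hypothetical and not part of the exercise.

```python
# Sketch: solve A = P*(1 + r)**t for r, i.e. r = (A/P)**(1/t) - 1.
def annual_rate(P, A, t):
    """Annual interest rate r (hypothetical helper) satisfying A = P*(1 + r)**t."""
    return (A / P) ** (1 / t) - 1

# Values taken from parts a, b, and c above, each over t = 2 yr.
for P, A in [(10_000, 11_664), (6_000, 7_392.60), (5_000, 6_500)]:
    print(f"P = {P}, A = {A}: r = {annual_rate(P, A, 2):.4%}")
```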