I am stuck on the following question.
"You have a data set consisting of the sales prices of houses in your neighborhood, with each sale time-stamped by the month and year in which the house sold. You want to predict the average value of houses in your neighborhood over time, so you fit a simple regression model with average house price as the output and the time index (in months) as the input. Based on $10$ months of data, the estimated intercept is $\$4569$ and the estimated slope is $143$ [Dollar/month]. If you extrapolate this trend forward in time, at which time index (in months) do you predict that your neighborhood's value will have doubled relative to the value at month $10$? (Round to the nearest month)."
What I tried: since the house price increases at the rate given by the slope, I reasoned that it should double after $4569/143 \approx 32$ months. But this does not seem to be the right answer.
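To make my attempt concrete (using only the numbers given in the problem), the fitted line is
$$\hat{y}(t) = 4569 + 143\,t \qquad \text{(in dollars, with } t \text{ the time index in months)},$$
so the predicted value at month $10$ is $\hat{y}(10) = 4569 + 143 \cdot 10 = 5999$. My calculation above effectively solves $143\,t = 4569$ for $t$.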
Any hint for solving this question would be greatly appreciated.
Thanks.