A man begins a car trip to visit his in-laws. The total distance is 60 miles, and he starts off at a speed of 60 miles per hour. After driving exactly 1 mile, he loses some of his enthusiasm for the journey, and (instantaneously) slows down to 59 miles per hour. After traveling another mile, he again slows to 58 miles per hour. This continues, progressively slowing by 1 mile per hour for each mile traveled until the trip is complete.

(a) How long does it take the man to reach his in-laws?

I'm not sure how to approach this problem; I just need a nudge in the right direction. Thanks.

Bart Michels
Jason

2 Answers


For each mile, find the speed at which he is travelling (say $x$ mph). At that speed, how much time does it take him to travel $1$ mile? Add up these times over all values of $x$ to get the answer.

I leave the math for you to work out. The answer should be between $4.5$ and $5$ hours.
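The hint above can be checked numerically; here is a short Python sketch (the variable names are mine):

```python
# Exact trip time: mile k (for k = 1..60) is driven at 61 - k mph,
# so that mile takes 1 / (61 - k) hours.
total_hours = sum(1 / (61 - k) for k in range(1, 61))

# Equivalently, this is the 60th harmonic number: 1/1 + 1/2 + ... + 1/60,
# which is roughly 4.68 hours.
print(f"{total_hours:.4f} hours")
```

Note the sum is the same whether taken from the fastest mile to the slowest or vice versa; reversing the order just gives $\sum_{x=1}^{60} \frac1x$ directly.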

2012ssohn

You can also try a continuous approximation via a differential equation, but be careful: the man slows by $1$ mph per mile travelled, not per hour, so velocity is a function of distance rather than of time: $$v=60-x$$ where $x$ is the distance covered in miles. $$\Rightarrow \frac{dx}{dt}=60-x$$ $$\Rightarrow \frac{dx}{60-x}=dt$$ Integrating with $x(0)=0$ we get $$t=\ln\frac{60}{60-x}$$ This is finite for every $x<60$ but diverges as $x\to 60$: the smooth model never quite finishes the last mile. The actual discrete trip does finish, since the final mile is driven at a constant $1$ mph and takes exactly $1$ hour, so the exact answer comes from the mile-by-mile sum rather than from setting $x=60$ miles here; the continuous model only gives a rough estimate.
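As a numeric comparison (a sketch in Python; the continuous profile $v(x) = 60 - x$ is one reading of "slows 1 mph per mile", and the variable names are mine), here is the smooth model set against the exact mile-by-mile sum:

```python
import math

# Exact discrete time: mile k (for k = 1..60) is driven at 61 - k mph.
discrete_total = sum(1 / (61 - k) for k in range(1, 61))  # about 4.68 hours

# Continuous model dx/dt = 60 - x gives t(x) = ln(60 / (60 - x)),
# which is finite at x = 59 miles but blows up as x -> 60.
continuous_to_59 = math.log(60 / (60 - 59))  # ln 60, about 4.09 hours

print(discrete_total, continuous_to_59)
```

The continuous time through mile 59 plus the final 1 mph mile comes to about $4.09 + 1 \approx 5.09$ hours, overshooting the exact sum of about $4.68$ hours: within each mile the smooth model's speed is below the discrete constant speed, so it is only a rough approximation here.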

GTX OC