
I am a solid technology troubleshooter and problem solver, but when it comes to numbers... well, I suck at it. :/

I am trying to calculate how long a program will theoretically take to finish copying the data from a malfunctioning hard drive so I can provide the customer with a reasonably accurate estimate. I have determined that it takes roughly 2 hours and 30 minutes for the program to copy what would normally take 1 hour and 20 minutes. I used math plus a little guesstimation to take certain things into account. What I have come up with is this: for every 1 hour and 30 minutes, it actually takes 2 hours and 30 minutes. That would be 2 hours for every 1 hour, so if the program would normally take 15 hours, it will actually take 30 hours. Then I subtracted 10 minutes for every hour, as the actual time is 1 hour and 20 minutes. Again, math is not my strong suit. 20 minutes × 30 = 10 hours, so my final estimate is that it will take somewhere around 20 hours.

Is anything above seriously flawed and/or what would have been the CORRECT way to calculate this?

1 Answer


Just work in minutes rather than mixing hours and minutes.

Normally it takes 80 minutes, but it is instead taking 150 minutes, so the copy is running $\frac{150}{80}=\frac{15}{8}$ times as long as normal. You want to multiply the normal full length of the job by this factor.

$\frac{15}{8}\times 15\text{ hours}=\frac{225}{8}\text{ hours}=28.125\text{ hours}$
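If you want to plug in your own numbers later, here is the same ratio calculation as a minimal Python sketch (the function `estimate_actual_hours` and its arguments are just an illustration, not anything from your copying program):

```python
def estimate_actual_hours(normal_minutes: float,
                          observed_minutes: float,
                          full_job_hours: float) -> float:
    """Scale the job's normal duration by the observed slowdown ratio."""
    slowdown = observed_minutes / normal_minutes  # 150 / 80 = 15/8 = 1.875
    return full_job_hours * slowdown


if __name__ == "__main__":
    # An 80-minute chunk took 150 minutes; the whole job normally takes 15 hours.
    print(estimate_actual_hours(80, 150, 15))  # 28.125
```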

As it's only a rough estimate, you could also just note that 2 hours 30 minutes is nearly double 1 hour 20 minutes and expect the answer to be about 30 hours. You are working with very rough calculations and the assumption that the copy will continue exactly as it has so far.

Ian Miller