I am a solid technology troubleshooter and problem solver, but when it comes to numbers... well, I suck at it. :/
I am trying to calculate how long a program will theoretically take to finish copying the data from a malfunctioning hard drive, so I can give the customer a reasonably accurate estimate. I have determined that it takes roughly 2 hours and 30 minutes for the program to copy what would normally take 1 hour and 20 minutes. I used math plus a little guesstimation to take certain things into account, and what I have come up with is this: for every 1 hour and 30 minutes of normal copy time, it actually takes 2 hours and 30 minutes. That would be 2 hours for every 1 hour, so if the program would normally take 15 hours, it will actually take 30 hours. Then I subtracted 10 minutes for every hour, since the actual baseline is 1 hour and 20 minutes rather than 1 hour and 30 minutes. Again, math is not my strong suit. 20 minutes x 30 = 10 hours, so my final estimate is that it will take somewhere around 20 hours.
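
To make my steps easier to follow, here is my back-of-the-envelope math written out as a quick Python script, next to a plain proportional scaling (actual time divided by normal time) that I suspect might be the proper way, though I am not sure. The 15-hour "normal" copy time is just the example job size from above.

```python
# Observed: the recovery program needs about 2 h 30 min (150 min) to copy
# what a healthy drive would copy in about 1 h 20 min (80 min).
NORMAL_MIN = 80
ACTUAL_MIN = 150

normal_hours = 15                      # healthy-drive copy time for this job

# --- My back-of-the-envelope version ---
# Step 1: treat it as roughly "2 real hours per 1 normal hour".
rough_actual = normal_hours * 2        # 30 hours

# Step 2: knock 20 minutes off each of those 30 hours as a correction.
correction = (20 / 60) * rough_actual  # 10 hours
my_estimate = rough_actual - correction  # 20 hours

# --- Plain proportional scaling, for comparison ---
# slowdown factor = actual time / normal time
slowdown = ACTUAL_MIN / NORMAL_MIN     # 150 / 80 = 1.875
ratio_estimate = normal_hours * slowdown  # 28.125 hours

print(f"my estimate:    {my_estimate:.1f} h")
print(f"ratio estimate: {ratio_estimate:.1f} h")
```

As you can see, the two approaches give noticeably different answers (about 20 hours versus about 28 hours), which is why I am asking.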
Is anything above seriously flawed and/or what would have been the CORRECT way to calculate this?