Why do I see this thread at your link rather than the article we are discussing?
I'm really uncomfortable with formulas that mix dimensions. It is interesting that if you subtract your pack weight in pounds from the maximum distance you can walk in a day (in miles), you get what appears to be a good estimate of how many miles you can walk with that pack. But it is hard for me to know how much to lean on that estimate without understanding a little more about *why* it might hold, beyond just seeing an observed empirical relationship.
The other thing is that it isn't clear to me how the fact that food is consumed over the course of the trip is integrated into his equations. Anyone who has taken a long multiday trip knows that the last few days of the trip are almost always the biggest mileage days -- because the pack is so much lighter than on the first days out. That makes a single average "miles-per-day" figure misleading as a basis for calculation.
I'll give an example here of what I think is wrong with the trip time and pack weight calculator.
Given a trip length of 120 miles, the ability to walk 44 miles/day with an empty pack, 12 pounds of base weight, and 2 pounds of food per day, the trip calculator tells me I need six days (20 miles per day) and 24 pounds of stuff in my pack.
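A minimal sketch of what I assume the calculator is doing: it treats pack weight as fixed for the whole trip, so the days and the pack weight have to be solved together (daily miles = empty-pack miles/day minus pack weight in pounds). The variable names and the fixed-point iteration are my own; the calculator may solve it differently.

```python
# Assumed calculator model: pack weight is constant for the whole trip.
trip_miles = 120.0
empty_mpd = 44.0          # miles/day with an empty pack
base_lb = 12.0
food_lb_per_day = 2.0

days = 1.0
for _ in range(100):      # fixed-point iteration until days/pack agree
    pack = base_lb + food_lb_per_day * days
    days = trip_miles / (empty_mpd - pack)

print(f"{days:.0f} days, {pack:.0f} lb pack")   # → 6 days, 24 lb pack
```

This reproduces the calculator's answer: 6 days at 20 miles/day with a 24-pound pack.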
Here's the rub.
The adventure racing rule says that I should be able to hike 20 miles per day with a 24-pound pack.
But on the second day my pack weighs 22 pounds (because I ate two pounds of food), so I should (in this example) be able to hike 22 miles.
On the third day my pack weighs 20 pounds, so I should be able to hike 24 miles.
On the fourth day my pack weighs 18 pounds, so I should be able to hike 26 miles.
On the fifth day my pack weighs 16 pounds, so I should be able to hike 28 miles.
Now I'm at my destination, 120 miles out, a full day early and with an extra day's worth of food.
I'm still fiddling with the numbers, but a more correct answer for that 120 mile trip is probably around 4.5 days with a bit over 21 pounds in my pack at the start.
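A rough sketch of the self-consistent version: carry food for exactly d days, let the pack shrink as you eat, and find the d at which the total distance covered reaches 120 miles. Fractional final days are treated linearly; the helper name and the bisection search are my own.

```python
# Distance covered in d days, eating 2 lb/day from a pack of 12 + 2d lb.
def miles_covered(d, empty=44.0, base=12.0, food=2.0):
    pack = base + food * d
    total, t = 0.0, 0.0
    while t < d:
        step = min(1.0, d - t)            # full day, or fractional last day
        total += step * (empty - pack)    # distance at today's pack weight
        pack -= step * food               # eat as you go
        t += step
    return total

# Bisect for the d where miles_covered(d) == 120 (it is increasing here).
lo, hi = 1.0, 6.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if miles_covered(mid) < 120.0 else (lo, mid)
print(f"{hi:.2f} days, {12 + 2*hi:.1f} lb starting pack")
# → 4.52 days, 21.0 lb starting pack
```

That lands at roughly 4.5 days with a starting pack of 21 pounds and change (12 base plus about 9 pounds of food), which is what the day-by-day reasoning above suggests.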