Let the walking speed be x miles per hour. Then, the jogging speed is 2x miles per hour.
Given that John jogged 5 miles and walked 2 miles, and that the total trip took 0.9 hours, we can use time = distance / speed to write the time spent on each leg:
\text{jogging time} = \frac{5}{2x}
\text{walking time} = \frac{2}{x}
The total time is the sum of these two times:
\frac{5}{2x} + \frac{2}{x} = 0.9
To solve for x, we find a common denominator and combine the fractions:
\frac{5}{2x} + \frac{2}{x} = \frac{5}{2x} + \frac{4}{2x} = \frac{9}{2x}
So,
\frac{9}{2x} = 0.9
Multiply both sides by 2x:
9 = 1.8x
Divide both sides by 1.8:
x = \frac{9}{1.8} = 5
Thus, the walking speed is 5 mph, so his jogging speed is:
2x = 2 \times 5 = 10 \text{ mph}
Therefore, his average jogging speed is 10 mph.
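As a quick sanity check, we can plug the solved speeds back into the time expressions and confirm the total comes out to 0.9 hours (the variable names below are just for illustration):

```python
# Verify the solution: 5 miles jogged at 2x mph, 2 miles walked at x mph.
walking_speed = 5              # x, the solved walking speed in mph
jogging_speed = 2 * walking_speed  # 2x = 10 mph

jog_time = 5 / jogging_speed   # hours spent jogging: 5 / 10 = 0.5
walk_time = 2 / walking_speed  # hours spent walking: 2 / 5 = 0.4

total = jog_time + walk_time
print(jogging_speed, total)    # 10 0.9
```

The total time matches the given 0.9 hours, confirming x = 5 and a jogging speed of 10 mph.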