If one object is thrown vertically and the other dropped, when will the two objects reach the same height?

Suppose object A is launched upward from height h_0 with instantaneous upward velocity v_0 at time t = 0, and object B is dropped from height h_1 with instantaneous velocity 0 at the same instant. Take height as increasing in the upward direction, and assume a uniform acceleration due to gravity of 9.8 m/s^2 in the downward direction. Recall that velocity is the first derivative of position with respect to time, and acceleration is the second. Integrating the acceleration twice, object A's height at time t is

h_A(t) = h_0 + v_0 t - 9.8t^2/2

and object B's height at time t is

h_B(t) = h_1 - 9.8t^2/2

We now wish to solve the equation h_A(t) = h_B(t) for t. Writing it out:

h_0 + v_0 t - 9.8t^2/2 = h_1 - 9.8t^2/2
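As a minimal sketch of the two height equations above (the function and variable names here are my own, not from the question), in Python:

```python
G = 9.8  # acceleration due to gravity, m/s^2, acting downward

def h_A(t, h0, v0):
    """Height of object A, thrown upward from h0 with speed v0 at t = 0."""
    return h0 + v0 * t - G * t**2 / 2

def h_B(t, h1):
    """Height of object B, dropped from rest at h1 at t = 0."""
    return h1 - G * t**2 / 2
```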

We note, as a matter of interest, that the acceleration terms cancel, which is to be expected since both objects are (assumed to be) in the same uniform gravitational field: in a freely falling frame of reference, gravity plays no part. So our equation reduces to:

h_0 + v_0 t = h_1

Solving for t:

t = (h_1 - h_0) / v_0

All of this is of course in SI units (metres, seconds, metres per second). Note that this gives a physically meaningful (non-negative) time only when h_1 > h_0 and v_0 > 0, and the answer only applies while both objects are still in flight.

In other words, the time taken is the initial vertical separation between the two objects divided by the speed at which object A is launched.
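A quick numerical check of the result (again with names of my own choosing): with v_0 = 9.8 m/s and a separation of 9.8 m, the closed form gives t = 1 s, and plugging that time back into both height equations yields the same height.

```python
G = 9.8  # acceleration due to gravity, m/s^2

def meeting_time(h0, h1, v0):
    """Time at which the thrown and dropped objects are at equal height.
    Requires v0 > 0; physically meaningful only for h1 > h0, and only
    while both objects are still in flight."""
    return (h1 - h0) / v0

h0, h1, v0 = 0.0, 9.8, 9.8          # metres, metres, metres per second
t = meeting_time(h0, h1, v0)        # 1.0 s
height_A = h0 + v0 * t - G * t**2 / 2
height_B = h1 - G * t**2 / 2
assert abs(height_A - height_B) < 1e-9  # both objects at the same height
```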