The Problem
For a small object of mass $m$ at a distance $R$ from the center of the Earth, how long would it take to fall to the surface of the Earth, assuming that the only force acting on it is the Earth's gravitational force?
Relevant Information
The following discussion seems to have solved exactly the same problem: http://www.physicsforums.com/showthread.php?t=555644
However, after working through the mathematics myself, I'm not sure how to evaluate the constant of integration.
A Partial Solution
Let $s$ denote the object's distance from the center of the Earth, $M$ the Earth's mass, and $m$ the mass of the falling object. Then
$$ F=\frac {-GMm}{s^2} $$
$$ a=\frac {-GM}{s^2} $$
$$ \frac {dv}{dt} = \frac {-GM}{s^2} $$
Multiplying both sides by $v$ and integrating with respect to $t$, we have
$$\frac {1}{2} v^2=\frac {GM}{s} +c_1$$
where $c_1$ is a constant of integration. Substituting the initial conditions $v=0$, $s=R$, we have
$$\frac {1}{2} v^2=GM(\frac {1}{s}-\frac{1}{R})$$
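As a quick numerical sanity check of this energy relation (separate from the derivation itself), it can be evaluated directly in a few lines of Python. The values of $GM$, the Earth's radius, and the starting distance below are assumed purely for illustration:

```python
import math

# Assumed illustrative values (not part of the question): the Earth's
# standard gravitational parameter GM, its mean radius, and a starting
# distance of ten Earth radii.
GM = 3.986e14        # m^3 / s^2
R_earth = 6.371e6    # m
R = 10 * R_earth     # m, initial distance from the Earth's center

# Speed given by (1/2) v^2 = GM (1/s - 1/R), evaluated at the surface s = R_earth.
s = R_earth
v = math.sqrt(2 * GM * (1 / s - 1 / R))
print(f"speed on reaching the surface: {v:.0f} m/s")  # roughly 10.6 km/s
```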
At this point, when I ask Wolfram Alpha to solve the resulting differential equation, I get
$$c_2+\sqrt{\frac{2}{R}}\,t=\frac{\sqrt{s}(s-R)+R\sqrt{R-s}\,\tan^{-1}\!\left(\sqrt{\frac{s}{R-s}}\right)}{\sqrt{GM(R-s)}}$$
where $c_2$ is a constant of integration. Substituting the initial conditions $s=R$, $t=0$, I find that the term $$\tan^{-1}\!\left(\sqrt{\frac{s}{R-s}}\right)$$ is undefined, since its argument blows up at $s=R$. At this point I'm stuck. Any ideas on where I've made a mistake?
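As a side check, and independently of the algebra above, a rough numerical estimate of the fall time can be obtained by integrating $dt = ds/v$ with $v$ taken from the energy relation; this at least gives a number to compare any closed-form expression against. The constants and the starting distance in this sketch are assumed for illustration only:

```python
import math

# Assumed illustrative values (not part of the question): the Earth's GM and
# radius, with the object starting from rest at ten Earth radii.
GM = 3.986e14        # m^3 / s^2
R_earth = 6.371e6    # m
R = 10 * R_earth     # m, initial distance from the Earth's center

# Accumulate dt = ds / v with v = sqrt(2 GM (1/s - 1/R)), stepping s from R
# down to the surface.  A midpoint rule avoids evaluating v at s = R, where
# it vanishes.
N = 1_000_000
t = 0.0
s_hi = R
for i in range(N):
    s_lo = R - (R - R_earth) * (i + 1) / N
    s_mid = 0.5 * (s_hi + s_lo)
    v = math.sqrt(2 * GM * (1 / s_mid - 1 / R))
    t += (s_hi - s_lo) / v
    s_hi = s_lo

print(f"estimated fall time: {t / 3600:.1f} hours")  # about 7.7 hours for these values
```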
(For those interested, this question was inspired by the Greek myth that a bronze hammer dropped from heaven would fall for nine days and reach the Earth on the tenth.)