There is a famous problem which asks:
Suppose an object is held at 1 au from the Sun and released from rest. How long does it take to fall into the Sun (neglecting the size of the Sun)?
The trick solution is to imagine the trip into the sun as the limit of an elliptical orbit whose semi-minor axis goes to zero. In that limit the sun sits at one focus of the ellipse, which has migrated to the extreme end of the degenerate ellipse, so the fall is half of an orbit with semi-major axis 1/2 au. By Kepler's third law that orbit has a period of $2^{-3/2}$ years, so the fall takes half of that: $2^{-5/2}$ years, about 65 days.
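Spelled out, using Kepler's third law $T^2 \propto a^3$ calibrated by $T = 1$ yr at $a = 1$ au:

$$
T_{\text{thin}} = \left(\frac{1/2\ \mathrm{au}}{1\ \mathrm{au}}\right)^{3/2}\,\mathrm{yr} = 2^{-3/2}\ \mathrm{yr},
\qquad
t_{\text{fall}} = \frac{T_{\text{thin}}}{2} = 2^{-5/2}\ \mathrm{yr} \approx 64.6\ \text{days}.
$$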
This is correct, but why?
If we drop the object straight at the sun (and imagine that it can pass through the sun), then it should pass straight through, continue out the other side, and eventually wind up 1 au from the sun on the opposite side.
In the thin elliptical orbit, the behavior is totally different: the object only just barely makes it past the sun before turning around and returning the way it came.
Additionally, the time to return to the starting point differs between the two scenarios. If we drop the object straight towards the sun, then reaching the sun takes a quarter of the full back-and-forth period, whereas in the thin elliptical orbit reaching the sun takes half of an orbital period.
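Written out, each scenario's reach-the-sun leg is a fixed fraction of that scenario's own period, so taking the trick's answer $t_{\text{fall}} = 2^{-5/2}$ yr at face value:

$$
T_{\text{straight drop}} = 4\,t_{\text{fall}} = 2^{-1/2}\ \mathrm{yr} \approx 0.71\ \mathrm{yr},
\qquad
T_{\text{thin ellipse}} = 2\,t_{\text{fall}} = 2^{-3/2}\ \mathrm{yr} \approx 0.35\ \mathrm{yr}.
$$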
So without doing a more direct calculation of the time for the object to fall into the sun, why should we believe that the "limit of a thin orbit" trick works?
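For what it's worth, a direct numerical integration of the radial fall does reproduce the trick's number, though of course that doesn't explain why the limit is legitimate. A minimal sketch, assuming standard values for $GM_\odot$ and the astronomical unit and a simple fixed-step leapfrog integrator:

```python
# Radial free fall from rest at 1 au toward a point-mass Sun.
# Assumed constants: standard heliocentric GM and the au, in SI units.
GM_SUN = 1.32712440018e20   # m^3 s^-2
AU = 1.495978707e11         # m
YEAR = 365.25 * 86400.0     # s (Julian year)

def fall_time(r0=AU, dt=10.0):
    """Time (in seconds) to fall from rest at r0 to r = 0, leapfrog with step dt."""
    r, v, t = r0, 0.0, 0.0
    while r > 0.0:
        v += 0.5 * (-GM_SUN / r**2) * dt   # half kick
        r += v * dt                         # drift
        t += dt
        if r <= 0.0:                        # reached (or overshot) the sun
            break
        v += 0.5 * (-GM_SUN / r**2) * dt   # second half kick
    return t

t = fall_time()
print(f"numerical fall time : {t / 86400:.1f} days")
print(f"2^(-5/2) years      : {2 ** -2.5 * YEAR / 86400:.1f} days")
```

The fixed time step is crude near the singularity at $r = 0$, but the object spends almost all of the fall at large $r$, so the error accumulated in the final plunge is a negligible fraction of the total time.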