Reference: Moore, Thomas A., A General Relativity Workbook, University Science Books (2013) – Chapter 18; Problem P18.1.
We’ve seen that in Newtonian physics, the tidal effect produces a relative acceleration between two objects in free fall above a sphere of mass $M$ given by

$$\frac{d^{2}n^{x}}{dt^{2}}=-\frac{GM}{r^{3}}\,n^{x}\qquad\frac{d^{2}n^{y}}{dt^{2}}=-\frac{GM}{r^{3}}\,n^{y}\qquad\frac{d^{2}n^{z}}{dt^{2}}=+\frac{2GM}{r^{3}}\,n^{z}$$

where $\mathbf{n}$ is the relative separation vector of the two objects, and the radial direction is taken to be along the $z$ axis.
To illustrate this, we can consider a situation where we have two masses separated initially by a vertical distance of 1 m and then released from rest near the Earth’s surface. How long will it take for the distance between these objects to increase by 1 nanometre due to the tidal effect?
We’ll start with the approximation that $r$ is a constant equal to the radius of the Earth, or $r=R_{E}=6.38\times10^{6}\text{ m}$. We can check the answer for consistency with this assumption after we’re done. We’ll also need the mass of the Earth, $M=5.97\times10^{24}\text{ kg}$, and of course $G=6.674\times10^{-11}\text{ m}^{3}\text{ kg}^{-1}\text{ s}^{-2}$ in MKS units. We’re thus faced with the differential equation:

$$\frac{d^{2}n^{z}}{dt^{2}}=\frac{2GM}{R_{E}^{3}}\,n^{z}$$
This has the general solution

$$n^{z}\left(t\right)=Ae^{kt}+Be^{-kt}$$

where the numerical factor in the exponent is

$$k=\sqrt{\frac{2GM}{R_{E}^{3}}}=1.75\times10^{-3}\text{ s}^{-1}$$

From the initial conditions $n^{z}\left(0\right)=1\text{ m}$ and $\dot{n}^{z}\left(0\right)=0$ we get $A=B=\frac{1}{2}\text{ m}$, so

$$n^{z}\left(t\right)=\left(1\text{ m}\right)\cosh\left(kt\right)$$
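As a quick numerical check of $k$, here is a minimal Python sketch; the variable names are just illustrative, and the values are the ones quoted above:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_E = 5.97e24    # mass of the Earth, kg
R_E = 6.38e6     # radius of the Earth, m

# tidal "frequency" appearing in the exponent of the general solution
k = math.sqrt(2 * G * M_E / R_E**3)
print(f"k = {k:.3e} s^-1")   # about 1.75e-3 s^-1
```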
We want the time $t$ such that $n^{z}\left(t\right)=\left(1+10^{-9}\right)\text{ m}$, or in other words:

$$\cosh\left(kt\right)=1+10^{-9}$$
Since the difference is so small, we can expand the cosh in a Taylor series about zero and keep only the leading terms:

$$1+\frac{\left(kt\right)^{2}}{2}=1+10^{-9}\quad\Rightarrow\quad t=\frac{\sqrt{2\times10^{-9}}}{k}=0.0255\text{ s}$$
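To confirm that the Taylor approximation isn’t throwing anything away, here is a sketch (Python again, with the same assumed constants as above) that solves $\cosh\left(kt\right)=1+10^{-9}$ both exactly and with the leading-order expansion:

```python
import math

G, M_E, R_E = 6.674e-11, 5.97e24, 6.38e6   # MKS values quoted above
k = math.sqrt(2 * G * M_E / R_E**3)        # ~1.75e-3 s^-1

delta = 1e-9                               # required increase in separation, in metres (initial separation is 1 m)

t_exact = math.acosh(1 + delta) / k        # exact inversion of cosh(k t) = 1 + delta
t_taylor = math.sqrt(2 * delta) / k        # cosh(x) ~ 1 + x^2/2  =>  t = sqrt(2 delta) / k

print(t_exact, t_taylor)                   # both ~0.0255 s
```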
In that time, the objects would fall a distance of

$$d=\frac{1}{2}gt^{2}=\frac{1}{2}\left(9.8\text{ m s}^{-2}\right)\left(0.0255\text{ s}\right)^{2}=3.2\times10^{-3}\text{ m}$$

or about 3 mm, so the assumption of $r$ being constant at the Earth’s radius seems reasonable.
For the separation to increase by 1 mm, the same calculation yields a time of around 25.5 s, during which the objects would fall (assuming the acceleration due to gravity is constant and ignoring air resistance) a distance of around 3.2 km, so here the assumption of a constant $r$ is a bit shakier, considering we’re dealing with such a small tidal effect.
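Both consistency checks can be run the same way; the following sketch (same assumed constants, with $g=9.8\text{ m s}^{-2}$) reproduces the times and fall distances quoted above:

```python
import math

G, M_E, R_E = 6.674e-11, 5.97e24, 6.38e6   # MKS values quoted above
g = 9.8                                    # surface gravity, m/s^2
k = math.sqrt(2 * G * M_E / R_E**3)

for delta in (1e-9, 1e-3):                 # increase in separation: 1 nm and 1 mm
    t = math.sqrt(2 * delta) / k           # time from the Taylor-expanded cosh
    fall = 0.5 * g * t**2                  # distance fallen at constant g, ignoring air resistance
    print(f"delta = {delta:g} m: t = {t:.3g} s, fall = {fall:.3g} m")
# prints t ~ 0.0255 s and 25.5 s, with fall distances of about 3 mm and 3.2 km
```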