There are many questions on this site relating to the spreading of Gaussian wave packets and the time dependence of the uncertainty in position ($\Delta x$). For example, see Is it possible for $\Delta x$ ($\sigma_x$) of any free particle wave packet to be decreasing at any time? or Does Heisenberg's uncertainty under time evolution always grow? (and the linked questions therein). However, I am struggling to follow the arguments presented in those threads.
Specifically, if I measure a free particle described by a Gaussian wave packet with uncertainty $\Delta x$ at some time, does the positional uncertainty subsequently increase, decrease, or neither? I am not sure I appreciate the physical significance of the time-reversed solutions $\Psi^*(x,-t)$: surely we either observe increasing positional uncertainty or we do not? If not, why do people even consider the dispersion of Gaussian wave packets?
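For concreteness, my understanding of the time-reversal point (just a sketch, please correct me if it is wrong) is that if $\Psi(x,t)$ solves the free Schrödinger equation, then so does $\Phi(x,t) \equiv \Psi^*(x,-t)$, and

$$i\hbar\,\frac{\partial \Phi}{\partial t} = -\frac{\hbar^2}{2m}\,\frac{\partial^2 \Phi}{\partial x^2}, \qquad |\Phi(x,t)|^2 = |\Psi(x,-t)|^2 \;\Rightarrow\; \Delta x_\Phi(t) = \Delta x_\Psi(-t),$$

so to every spreading packet there corresponds an equally legitimate free-particle solution whose width is shrinking.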
Edit 1: From the comments and discussion in the answer here and in Propagating a Gaussian wavepacket backwards in time, I can accept that wave packets do not always spread, and that states can be prepared which either narrow or spread with time. My remaining confusion is therefore: what implicit initial conditions do quantum textbooks/lectures assume when they describe Gaussian wave packets spreading in time?
For example, see the line from this lecture:
> According to Eq. (112), the width of our wave packet grows as time progresses. Indeed, it follows from Eqs. (79) and (105) that the characteristic time for a wave packet of original width $\Delta x$ to double in spatial extent is ...
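For reference, and if I follow the standard derivation correctly, the result being quoted starts from a minimum-uncertainty Gaussian at $t=0$, $\Psi(x,0)\propto e^{-x^2/(4\sigma_0^2)}$ with $\sigma_0 = \Delta x(0)$, for which

$$\Delta x(t) = \sigma_0\sqrt{1+\left(\frac{\hbar t}{2 m \sigma_0^2}\right)^2},$$

so the width grows monotonically for $t>0$, and the doubling time is of order $m\sigma_0^2/\hbar$ (up to numerical factors and conventions for how the width is defined).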
The questions linked above do not address this issue (or at least not explicitly enough for me to work it out).