I think the attitude of most working scientists would be that we should make the most conservative possible assumptions when extrapolating the laws of physics to new areas, and see how far we can get. If we run into a contradiction, that is evidence that we need to revise our underlying assumptions. If, however, a consistent picture emerges, that is evidence that the underlying assumptions work and can be used to build a coherent story.
In particular, the simplest assumption is that the speed of light is constant. This assumption (plus the framework of general relativity) allows us to build a picture of cosmology and astrophysics that is remarkably consistent. As an example of what I mean by "consistent": multiple independent probes of the history of the Universe, such as measurements of the abundance of primordial elements and measurements of the cosmic microwave background, can be used to estimate the number of neutrino species, and both give an answer consistent with three to within error bars, in agreement with the Standard Model of particle physics. Many such internal consistency checks have been done, and they point to the basic framework with conservative assumptions being correct (or at least approximately correct).
Now, there is a small discrepancy (the now-infamous "Hubble tension"), and as a result there is a lot of work on understanding which of our many assumptions might be leading to a contradiction. It could well be that some exotic new physics, like a changing speed of light, explains it (although as far as I know, no concrete theory with a changing speed of light has actually been proposed to solve this problem; I am just illustrating how scientists reason). Or it could be that some subtle technical issue with, say, how the distance ladder is calibrated leads to a biased estimate of the Hubble constant by one team or another.
From the other end, people have also imagined what it would look like if the speed of light were not constant. You can then look for (presumably small) experimental signatures consistent with a theory that has this property. As a technical point, since the speed of light is a dimensionful parameter, it is not actually meaningful to ask whether its value changes; we can always choose our units so that it has any value we want. However, there is a dimensionless quantity, the fine structure constant $\alpha$, in which the speed of light appears. So we can ask what happens if the fine structure constant varies across spacetime. This would have many effects, such as changing the sizes and energy levels of atoms. Such effects are well constrained by measurements of various spectral lines, so there is a tight constraint on how much $\alpha$ can vary. In the lab, the constraints are very tight; according to Wikipedia, $\frac{\dot \alpha}{\alpha}\lesssim 10^{-17} {\rm\ per\ year}$. Using astrophysical measurements to probe time variation of $\alpha$ in the past is more controversial, but different methods agree at least on a bound of roughly $\frac{\dot \alpha}{\alpha}\lesssim 10^{-5} {\rm\ per\ 10\ billion\ years}$.
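To make the dimensional-analysis point concrete, here is a sketch of the relevant formula (in SI conventions, and under the assumption, which is a choice rather than something forced on us, that one regards $e$, $\hbar$, and $\varepsilon_0$ as fixed):
$$\alpha = \frac{e^2}{4\pi \varepsilon_0 \hbar c} \approx \frac{1}{137.036},$$
so under that convention a fractional drift in the speed of light would show up directly as a fractional drift in $\alpha$, with $\frac{\delta \alpha}{\alpha} = -\frac{\delta c}{c}$. For a rough sense of scale: if the laboratory bound of $\sim 10^{-17}$ per year held steadily (a naive linear extrapolation on my part), the accumulated fractional change over 10 billion years would be $\lesssim 10^{-7}$, about two orders of magnitude below the astrophysical bound quoted above.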