One of the tests of Quantum Electrodynamics is the value of the "anomalous magnetic dipole moment" of the electron.
The theoretical value is:
$$a_e = 0.001\ 159\ 652\ 181\ldots$$
We say that QED "predicts" this value. But as far as I'm aware, the perturbation series from which it is computed actually diverges (or at least has not been proven to converge). Hence the full series does not predict this value; it predicts infinity.
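For concreteness, the series I have in mind is the expansion in powers of $\alpha/\pi$, which (if I have the coefficients right; the first term is Schwinger's exact result, and I quote the later coefficients only approximately) begins

$$a_e = \frac{1}{2}\,\frac{\alpha}{\pi} + C_2\left(\frac{\alpha}{\pi}\right)^2 + C_3\left(\frac{\alpha}{\pi}\right)^3 + \cdots, \qquad C_2 \approx -0.3285,\quad C_3 \approx 1.1812,$$

and it is this series, continued to all orders, that is believed to diverge.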
So in order for the theory of QED to be predictive, we must state two things:
(1) How many terms of the series we are to use.
(2) How close to the real value the truncated theoretical value is, once the rest of the series is discarded.
e.g. we might say: QED is accurate up to 10 terms in the series, and this gives a value which will be within 0.001% of the experimental value.
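In symbols, writing $S_N = \sum_{n=1}^{N} C_n \left(\frac{\alpha}{\pi}\right)^n$ for the truncated series, the statement I would want the theory to make is something like

$$\left| a_e^{\text{exp}} - S_N \right| < \epsilon \quad \text{for stated values of } N \text{ and } \epsilon,$$

where $N$ and $\epsilon$ are the two numbers from points (1) and (2) above.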
If we don't state these two things, and the series diverges, all we can say is that coincidentally the first few terms of the series match the experimental value, and that adding more terms MIGHT or MIGHT NOT give a closer value.
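To show what I mean, here is a toy example (not the QED series, just the textbook divergent asymptotic series $\int_0^\infty \frac{e^{-t}}{1+xt}\,dt \sim \sum_{n\ge 0} (-1)^n\, n!\, x^n$, with $x$ standing in for the coupling): the partial sums first close in on the exact value and then, past a certain order, run away from it.

```python
import numpy as np
from scipy.integrate import quad

x = 0.1  # plays the role of a small coupling constant

# "Exact" value: the Stieltjes integral E(x) = integral_0^inf e^(-t) / (1 + x t) dt,
# whose asymptotic expansion is the divergent series sum_n (-1)^n n! x^n.
exact, _ = quad(lambda t: np.exp(-t) / (1 + x * t), 0, np.inf)

partial = 0.0
term = 1.0  # n = 0 term: (-1)^0 * 0! * x^0
for n in range(31):
    partial += term
    print(f"N = {n:2d}   partial sum = {partial:+.10f}   |error| = {abs(partial - exact):.3e}")
    term *= -(n + 1) * x  # next term: (-1)^(n+1) * (n+1)! * x^(n+1)
```

For this toy series the error is smallest around $N \approx 1/x$ (here about 10 terms) and then grows without bound.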
So assuming the series in the coupling constant diverges, has anyone worked out how many terms we are supposed to take, how close to the experimental value we should expect to get, and at what point the series will begin to get less accurate?
Alternatively, we might say that we don't know when the series will begin to get less accurate, and that we can only add more terms and compare with experiment; in which case the theory is not very predictive.
A third possibility is that the number of terms we can use would be proportional to the cut-off we are using (as in renormalisation), but if this is the case we would need a formula relating the number of terms to use to the cut-off parameter.
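Schematically, what that third possibility would require is a cut-off-dependent pair

$$N_{\max} = N_{\max}(\Lambda), \qquad \left| a_e^{\text{exp}} - S_{N_{\max}(\Lambda)} \right| < \epsilon(\Lambda),$$

i.e. a rule for how many terms may be used at a given cut-off $\Lambda$ and how accurate the resulting truncation is.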