Partially motivated by this question, I get the impression that it is generally more difficult to make accurate statistical predictions in Physics about "the small" (microscopic phenomena) than "the big" (macroscopic phenomena).
Why? Do we know what explains this relationship between scale and uncertainty?
For example, not only do we seem to have different theories for the small vs. the big, these theories also differ in the type of predictions they make: classical mechanics is deterministic, while quantum mechanics is inherently probabilistic.
One could speculate that predicting physical observations always involves some uncertainty, but that because the underlying laws governing the small are intrinsically random, it was simply easier to make accurate (and thus effectively deterministic) predictions about "the big" than about "the small", so we naturally ended up developing more probabilistic theories for the small.
But is there a statistical, perhaps simpler mathematical explanation for this difference in "prediction hardness"? For example, an aggregation of microscopic random effects that partly cancel each other out, so that large-scale quantities become much easier to predict accurately (I sketch this intuition below). Or, even simpler, is this a false premise, and we can in fact make predictions with the same level of certainty about the small and the big?
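To make the cancellation intuition concrete, here is a minimal sketch of what I have in mind (purely illustrative; it assumes each microscopic degree of freedom contributes an independent random amount to a macroscopic total, which is my assumption, not an established model). It shows the relative fluctuation of the total shrinking roughly like 1/√N as the number of micro-contributions N grows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: each "microscopic" degree of freedom contributes an
# independent random amount (mean 1, standard deviation 1) to a macroscopic total.
for n in [10, 1_000, 100_000]:
    # 10,000 repeated "experiments", each summing n microscopic contributions
    totals = rng.normal(loc=1.0, scale=1.0, size=(10_000, n)).sum(axis=1)
    rel_fluct = totals.std() / totals.mean()
    # The printed values come out close to 1/sqrt(n): the larger the aggregate,
    # the more sharply peaked (effectively deterministic) its value is.
    print(f"N = {n:>7}: relative fluctuation of the total ~ {rel_fluct:.4f}")
```

Is something like this (essentially the law of large numbers / central limit theorem) the standard explanation for why macroscopic predictions look deterministic, or is there more to it?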