Almost phenomenologically, "the appearance of flecks of metallic silver on a photographic plate" is a thermodynamic transition that happens at different rates depending on the details of how the plate is prepared and how it is exposed. Such thermodynamic transitions are often correlated in nontrivial ways. All QM has to do to be Useful is to model or describe the statistics of those thermodynamic transitions. [Note that my introduction of the idea of a thermodynamic transition makes my claim here "theory-laden", not quite phenomenological, at least to that extent.]
Explanation is not necessary for Usefulness. One topic of research in Philosophy of Physics has been to try to determine what makes a model "explanatory", which IMO has been rather inconclusive. Models may be more or less Useful for many different reasons, including tractability and directness of reference between elements of the theory and elements of experimental signal data. Note that a class of models may seem explanatory for fifty years, as phlogiston theory did, if the model is superficially nice in whatever ways.
Which brings me to my Answer, which I'm pretty sure you won't find Useful: Decoherence doesn't explain particularly well, whatever that means, partly because it's not a very tractable approach. Decoherence seems to have fairly direct referents, which perhaps is what makes it appeal to some people quite strongly. The same is true of "wave function collapse": it's possible to structure experimental data taking wave function collapse as a fundamental modeling strategy, but so far no-one has produced a mathematization that is significantly more Useful than just dealing with the statistics of thermodynamic events. There are people who think collapse illuminates what we're doing with QM in ways that might lead to a better mathematical formulation of the whole theory, but, I think, there's nothing yet.
In a similar vein, you may notice that Particle Physics is more often called High Energy Physics than it used to be, which seems to me to reflect a realization, not uniformly acknowledged, that explaining tracks of obviously related thermodynamic events in detectors as "caused by particles" is weakened by the many low-energy experiments showing that the concept of a particle cannot be that simple. These days, quantum fields are as likely as particles to be the locus of descriptions of experiments.
I'm curious whether you can knock down this argument, such as it is. I think you're looking at this all wrong, but of course it may be me. That I've worked on this for a long time doesn't guarantee much.
EDIT (a long comment, in response to Marty's comment that first mentions "Quantum Siphoning"): I take the Wave Function and operators to be a good way to generate probability measures. The empirical success comes from those probabilities serving as good models for (or descriptions of) the statistics of raw experimental data. I take it that probabilities do not cause individual events; they describe sets of events (propensity interpretations of probability notwithstanding). [If we go the Wigner function route (which I don't, except as a mathematical equivalence, because I think it obscures the relationship to empirical data), the wave function is just a generalized probability function that sometimes takes negative values.]

If one wants to change probability distributions as a result of experience, instead of taking other approaches to statistics, then one should use something like Bayes' rule, which in general doesn't just change the probability from 0.615802 to 0 or to 1. "Collapse" of the wave function adds an extra level of structure to the concept of a probability distribution that I think just doesn't fit well, as Mathematics. If people want to use "collapse", I think it has to be done somehow differently. It's possible that a propensity interpretation could work for you, but I think we would then quite quickly get far enough apart that we can't talk to each other.
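To make the Bayes' rule point concrete, here's a minimal sketch in Python (the likelihoods 0.7 and 0.4 are made-up numbers, chosen only to illustrate): conditioning on an observed event moves a probability to some intermediate value, whereas "collapse" jumps it straight to 0 or 1.

```python
# A Bayes update on the probability assigned to a hypothesis H,
# given one observed event E. The likelihoods below are illustrative
# assumptions, not values taken from any experiment.

prior = 0.615802       # P(H), the number from the paragraph above
p_e_given_h = 0.7      # P(E | H), hypothetical
p_e_given_not_h = 0.4  # P(E | not H), hypothetical

# Bayes' rule: P(H | E) = P(E | H) P(H) / P(E)
evidence = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
posterior = p_e_given_h * prior / evidence

print(f"prior     = {prior:.6f}")      # 0.615802
print(f"posterior = {posterior:.6f}")  # about 0.737, neither 0 nor 1

# Only a perfectly discriminating likelihood (P(E|H) = 1 and
# P(E|not H) = 0, or vice versa) reproduces the collapse-style
# jump to exactly 1 or exactly 0.
```

The point of the sketch is only that the posterior in general lands somewhere in between; collapse, by contrast, is a rule that forces the degenerate 0-or-1 case on every update.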
I think I prefer my description of individual events (and we may just have to accept that this is a sticking point): that we should say that the individual events are "thermodynamic transitions", whatever that means, leaving a causal account of how that happens for the future. The concept of a thermodynamic transition is the historical concept from Physics that I think fits the case. A thermodynamic event implicitly invokes at least a large number, perhaps an infinite number, of degrees of freedom to explain what happens when there is an apparent discontinuity; that introduces a degree of complexity that is hard to manage mathematically, which definitely has its problems. Decoherence also introduces an infinite number of degrees of freedom, but by introducing the environment in the way it does, I think it doesn't adequately embrace the complexity of the photographic plate. I think your description of what happens in a photographic plate accepts that complexity, but then tries to make "collapse" of a quantum state, which has nowhere near as much structure as the photographic plate, serve as an explanation of what is happening. That complexity is important and shouldn't be swept under the rug, but we can measure where and when thermodynamic events happen without knowing how they happen.
I hope that's helpful. I expect no-one else is listening!