For years now I have had a thought experiment in my head about information exchange via quantum entanglement, and I am aware that something must be wrong with it, but I can't figure out what it is.
The setup is as follows:
Two double-slit experiments, Station A and Station B, are supplied with entangled photons by an emitter between them. When Station A measures its photons, the photons at Station B should also behave like particles and land inside rectangles behind the slits, i.e. the two clumps you would expect from particle-like behaviour. But if A doesn't measure, they can land anywhere behind the experiment, with a probability distribution that is a wavy interference pattern.
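To make the picture concrete, here is a rough Python sketch of the two screen distributions I'm imagining at Station B: hits clumped in two rectangles behind the slits when A measures, and a wavy interference pattern when A doesn't. All the numbers (slit positions, rectangle width, fringe spacing) are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

SLIT_CENTERS = (-1.0, 1.0)   # made-up slit positions on the screen axis
SLIT_WIDTH   = 0.2           # made-up half-width of each "particle-like" rectangle
FRINGE_K     = 8.0           # made-up fringe spacing of the interference pattern
ENVELOPE     = 2.0           # made-up width of the overall diffraction envelope

def sample_particle_like(n):
    """Hits clustered in two rectangles behind the slits (A measured)."""
    centers = rng.choice(SLIT_CENTERS, size=n)
    return centers + rng.uniform(-SLIT_WIDTH, SLIT_WIDTH, size=n)

def sample_wave_like(n):
    """Hits drawn from a cos^2 interference pattern (A did not measure)."""
    xs = []
    while len(xs) < n:
        x = rng.normal(0.0, ENVELOPE)                 # propose from the envelope
        if rng.random() < np.cos(FRINGE_K * x) ** 2:  # accept with fringe weight
            xs.append(x)
    return np.array(xs)
```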
We can't send complete information this way, however, because anything that lands inside the rectangles for 'particle-like interaction' could also have been part of the wave pattern, and there is no way for us to tell which it was until A and B have compared notes with each other.
What we do know, however, is that anything landing outside the rectangles for 'particle-like interaction' can only have come from a wave-like interaction and therefore was not measured by A.
But now imagine we send a constant stream of those entangled photons, say 50,000 per second, and Station A either measures its partners or doesn't. If we try to send one bit of information per second, so 50,000 measured or unmeasured photons per bit, we still couldn't be absolutely sure whether the right bit made it over, but the probability would be high, definitely way over 50 percent. That means with the redundancy and error correction we already employ for accidental bit flips in ordinary computer communications, we should be able to get some information across, shouldn't we?
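Here is a minimal, self-contained sketch of that decoding scheme, assuming (and this is exactly the assumption I can't justify) that every hit at B lands inside the rectangles when A measures, and only lands inside them with some probability when A doesn't. The 50,000 photons per bit is from above; the value of P_INSIDE_IF_WAVE is made up.

```python
import numpy as np

rng = np.random.default_rng(1)

PHOTONS_PER_BIT = 50_000   # photons per one-second bit window, as described above
P_INSIDE_IF_WAVE = 0.4     # made-up chance a wave-like hit still lands inside a rectangle

def received_bit(a_measures: bool) -> int:
    """Decode one bit at Station B under the thought experiment's assumption.

    Assumption (the questionable part): if A measures, every hit at B lands
    inside the rectangles; if A doesn't, each hit lands inside them only with
    probability P_INSIDE_IF_WAVE. B declares 1 ("A measured") only if no hit
    falls outside the rectangles during the bit window.
    """
    if a_measures:
        hits_inside = np.full(PHOTONS_PER_BIT, True)
    else:
        hits_inside = rng.random(PHOTONS_PER_BIT) < P_INSIDE_IF_WAVE
    return 1 if hits_inside.all() else 0

# Send a short test message and count decoding errors.
message = rng.integers(0, 2, size=100)
decoded = np.array([received_bit(bool(b)) for b in message])
print("bit error rate:", np.mean(decoded != message))
```

Under that assumption the bit error rate comes out essentially zero, which is why the scheme seems to work on paper; the flaw has to be in the assumption itself.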
We wouldn't be breaking causality or general relativity, because no single complete piece of information was ever sent, but the probability is so high that it wouldn't matter to us if all we wanted was to get across an email or some rudimentary Morse code.
And yes, it would be very slow, but the two stations could be arbitrarily far apart, so no matter how long we need, it would still be quick enough to "break the cosmic speed limit". The main point of all of this is: do we even have to send complete information? We are so good at estimating, predicting and processing incomplete information that a fraction of it should be enough to keep day-to-day communication going, even to very distant places.
So if you know what is wrong with this idea, I would be grateful for an answer so I can finally scrub it from my mind.