For all practical purposes today, the answers above are very informative.
However, as Marek has pointed out above, the fundamental theoretical model of the thermodynamics of computation on which you are basing the question is, surprisingly, wrong, as we first began to discover about 50 years ago (see the work of Landauer, Charles Bennett, Fredkin, and others). In principle, all computation is dissipation-free, except for the dissipation required to overwrite or forget previously stored bits.
The classic example is this. Suppose you want to compute the next-to-last binary digit of the zillionth prime, or some such. You do so slowly and reversibly, carefully not overwriting any of the intermediate bits you generate, which requires a lot of space. (Perhaps you even make use of quantum entanglement in the computer.) Then you record the answer by overwriting, and thus irreversibly forgetting, the single bit previously held in some (say external) register. Finally, you reverse the original computation, again without any dissipation at all. You are left having to dissipate only the entropy needed to overwrite the one recorded answer bit, because that is the only information you were forced to forget.
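The compute, copy out, uncompute pattern described above (Bennett's trick) can be illustrated with an entirely classical toy model. The sketch below is my own illustration, not taken from any reference: it builds a small reversible circuit out of self-inverse gates (Toffoli and CNOT), copies the single answer bit into an output register, then runs the gates in reverse so every scratch bit returns to zero and no intermediate information is forgotten.

```python
# Toy classical illustration of Bennett's compute/copy/uncompute trick.
# Gate and function names here are illustrative, not from any library.

def toffoli(state, a, b, t):
    """Reversible AND: flip bit t iff bits a and b are both 1 (self-inverse)."""
    if state[a] and state[b]:
        state[t] ^= 1

def cnot(state, c, t):
    """Flip bit t iff bit c is 1 (self-inverse)."""
    if state[c]:
        state[t] ^= 1

def compute(state):
    # Reversibly compute (x AND y) AND z into scratch bits 3 and 4.
    toffoli(state, 0, 1, 3)   # scratch1 = x AND y
    toffoli(state, 3, 2, 4)   # scratch2 = scratch1 AND z

def uncompute(state):
    # Same self-inverse gates in reverse order: clears the scratch bits.
    toffoli(state, 3, 2, 4)
    toffoli(state, 0, 1, 3)

# bit layout: [x, y, z, scratch1, scratch2, answer]
state = [1, 1, 1, 0, 0, 0]
compute(state)
cnot(state, 4, 5)   # copy the answer bit into the output register
uncompute(state)    # scratch bits return to 0: nothing was forgotten
print(state)        # -> [1, 1, 1, 0, 0, 1]
```

Only the final overwrite of the answer register is logically irreversible; everything else is undone, which is exactly why only that one bit carries a mandatory dissipation cost.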
Since the denominator (the dissipation per operation) can in principle approach zero, the theoretical answer to your question is infinity. The tradeoff is the space needed to hold all the intermediate results. This is a surprise, a shock really, but it shows the power of clear thought. It is intimately connected with quantum computing, but it also has entirely classical models.
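For the one bit you are forced to forget, the standard Landauer bound gives the minimum dissipation as $E = k_B T \ln 2$ per erased bit. A quick back-of-the-envelope calculation at room temperature:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # room temperature, K

# Landauer limit: minimum energy dissipated to erase one bit
E_bit = k_B * T * math.log(2)

print(f"{E_bit:.3e} J per erased bit")          # ~2.87e-21 J
print(f"{1 / E_bit:.3e} erasures per joule")    # ~3.5e20
```

So at 300 K every unavoidable bit erasure costs at least about 3 zeptojoules, roughly eight orders of magnitude below what present-day transistors dissipate per switching event.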
So the right theoretical way to pose your question would be more like: for a particular computation, to be completed in a time t, operating with a limited memory of x bits, what is the necessary dissipation? I'm no expert, but I will try to find more references. P.S. The resting brain uses roughly 20 watts.