This is a much-discussed question in thermodynamics.
The short answer is that the 2nd law of thermodynamics does not imply that balls cannot move between the containers when thermal equilibrium is established (at some non-zero temperature), nor does it rule out the possibility of fluctuations which move balls between the chambers temporarily. The question is, therefore, how to understand the concept of entropy when such fluctuations can take place.
Consider an isolated system, so that it has fixed energy, volume, and particle number $U,V,N$. Suppose this system also has some internal parameter $x$; in your example, $x$ is the number of balls in one of the two containers. Let $S(x;U,V,N)$ be the entropy which the system would have if $x$ were constrained to take some particular value.
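To make this concrete in the two-container example: if we take a minimal model in which the $N$ balls are distinguishable and every arrangement is equally likely (an illustrative assumption of mine, not part of the question), then the number of microstates with $x$ balls in the chosen container is $\binom{N}{x}$, so
$$
S(x;U,V,N) = k_{\rm B} \ln \binom{N}{x}.
$$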
Such a system, in thermal equilibrium (at whatever non-zero temperature corresponds to its energy), exhibits thermal fluctuations: the probability that the system is found, at any given time, in a state whose entropy would be $S(x;U,V,N)$ if $x$ were constrained is given by
$$
P \propto e^{S(x;U,V,N)/ k_{\rm B}}
$$
There is some particular value of $x$, call it $x_0$, at which this $S$ is maximised for the given $U,V,N$; this is therefore the most likely value of $x$. In your example it would be $N/2$.
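As a quick numerical check, here is a minimal sketch of this distribution for the two-box model above (the script and its use of scipy are my own illustration, not part of the argument); it tabulates $P(x)$ and shows the relative width of the fluctuations shrinking as $1/\sqrt{N}$:

```python
import numpy as np
from scipy.stats import binom

# Two-box toy model: N distinguishable balls, each equally likely to be
# in either container, so P(x) = C(N, x) / 2^N, i.e. exp(S(x)/kB) normalised.
for N in (10, 100, 10_000):
    x = np.arange(N + 1)
    P = binom.pmf(x, N, 0.5)               # P(x) proportional to exp(S(x)/kB)
    x0 = x[np.argmax(P)]                   # most likely value of x, namely N/2
    sigma = np.sqrt(P @ (x - N / 2) ** 2)  # standard deviation = sqrt(N)/2
    print(f"N = {N:6d}: x0 = {x0:5d}, sigma/N = {sigma / N:.4f}")
```

For $N$ of order $10^{23}$ the relative width $\sigma/N$ is of order $10^{-12}$, which is why such fluctuations are macroscopically invisible.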
Now, as the question rightly points out, during the fluctuations $S(x;U,V,N)$ reaches values less than $S(x_0;U,V,N)$, so one might ask: doesn't this break the 2nd law? The muddle arises from two different uses of the word "entropy". The entropy of the system under consideration---the system with no constraint on $x$---is
$$
S(U,V,N) \simeq S(x_0; U,V,N),
$$
where the approximation becomes very close in the limit of large systems. $S(x_0; U,V,N)$ is the entropy of the largest set of microstates among those which contribute to $S(U,V,N)$, and this largest set contributes almost all of the total.
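To see how good the approximation is, take the two-box counting above (again my illustrative model). The unconstrained system has $\sum_{x=0}^{N} \binom{N}{x} = 2^N$ microstates, while Stirling's approximation gives $\binom{N}{N/2} \approx 2^N \sqrt{2/\pi N}$, so
$$
S(U,V,N) = k_{\rm B} N \ln 2,
\qquad
S(x_0;U,V,N) \approx k_{\rm B}\left[ N \ln 2 - \tfrac{1}{2}\ln\frac{\pi N}{2} \right].
$$
The difference grows only as $\ln N$ while each entropy grows as $N$, so for macroscopic $N$ it is utterly negligible.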
This $S(U,V,N)$ is the quantity which the second law is concerned with.
The quantity
$$
S(x \ne x_0; U,V,N)
$$
is not the entropy of the system. It is what the entropy of the system would be if $x$ were also constrained and had some given value.
During thermal fluctuations the system explores a range of states with different values of the parameter $x$, and these states can be macroscopically different in the statistical mechanics sense of the word. This makes one think of the system as moving around the macroscopic state space, with entropy going up and down accordingly, but that would be something of a mistake. One has to get used to the fact that such fluctuations are an intrinsic aspect of the situation of thermal equilibrium: the very concept of "thermal equilibrium" has to be understood as embracing this range of states. On this understanding we should ascribe the term "equilibrium entropy" not to
$S(x \ne x_0; U,V,N)$ but to $S(x_0; U,V,N)$. It is this second entropy which the second law says cannot fall in an isolated system.
For a proof that the 2nd law does indeed hold in such cases, you can examine devices such as the Feynman-Smoluchowski ratchet, which try to exploit thermal fluctuations in order to extract work from heat at a single temperature. Such devices deliver work only when there is a temperature difference, and they then involve a loss of heat, exactly as the 2nd law requires.
[All the above assumes conditions approaching the thermodynamic limit, where the distinction between the average value and the most likely value can be neglected, and probability distribution functions can be taken to be normal (Gaussian). A reference for further information is my thermodynamics book, section 27.1.1. Pippard also has interesting remarks.]
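For reference, the Gaussian form mentioned in this note follows from expanding $S(x)$ about its maximum; in the two-box model (again my illustration) one finds
$$
P(x) \propto e^{S(x)/k_{\rm B}} \approx \exp\left[ -\frac{(x-x_0)^2}{2\sigma^2} \right],
\qquad
\sigma^2 = -\frac{k_{\rm B}}{\left.\partial^2 S/\partial x^2\right|_{x_0}} = \frac{N}{4},
$$
so the standard deviation of the fluctuations is $\sqrt{N}/2$, in agreement with the numerical sketch above.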