I am writing a simulation that includes chemical reactions, and I think I am getting some things wrong with entropy and Gibbs free energy.
I was calculating the overall entropy of my system at a particular point in time. There is a map of molecule abundances, and my naive approach was to treat the normalized abundances as probabilities. Say I have a 1D map of molecular abundances $M$ in mol. I would calculate entropy $S$ like: $$ p_i = M_i \Big({\sum_{j} M_j}\Big)^{-1} \quad , \quad S = -R \sum_{i} p_i \ln p_i $$ where $R$ is the gas constant (the minus sign makes $S \ge 0$). At first sight this number looks good. A Kronecker delta (like $M = [0.0, 0.0, 5.0, 0.0, 0.0]$) gives $S=0.0$, and when I allow molecules to diffuse (e.g. $M = [0.5, 1.0, 2.0, 1.0, 0.5]$) the number increases.
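For concreteness, here is a minimal sketch of that entropy calculation (the function name `mixing_entropy` and the use of NumPy are my choices, not something fixed by the simulation):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(M):
    """Frequency-based entropy of an abundance map M (in mol).

    S = -R * sum(p_i * ln p_i) with p_i = M_i / sum(M).
    Zero entries are skipped, since p*ln(p) -> 0 as p -> 0.
    """
    M = np.asarray(M, dtype=float)
    p = M / M.sum()
    p = p[p > 0]  # avoid log(0)
    return -R * np.sum(p * np.log(p))

# Delta-like map: all mass in one voxel -> S = 0
print(mixing_entropy([0.0, 0.0, 5.0, 0.0, 0.0]))  # 0 (delta peak)

# Spread-out map after diffusion -> S > 0 (approx. 12.2 J/(mol*K))
print(mixing_entropy([0.5, 1.0, 2.0, 1.0, 0.5]))
```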
Now, these molecules also have internal energies (representing energies of chemical bonds). If there is only one molecule species, and $E$ is the internal energy of one mol of this molecule, the internal energy of this system would be: $U = E \sum_{i} M_i$.
Here is what confuses me: $U$ grows with the number of molecules (seems intuitive), but $S$ does not, because it only depends on the normalized distribution $p$, not on the total amount. So when I calculate the Gibbs free energy of the whole system $G = U - T S$ ($T$ being absolute temperature), the relative weight of $S$ depends on the size of my system: the more molecules I add, the less important $S$ becomes overall.
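The asymmetry is easy to demonstrate numerically. This sketch (with an assumed, hypothetical per-mole energy `E`) scales the abundance map up and shows that $U$ is extensive while the frequency-based $S$ stays constant:

```python
import numpy as np

R = 8.314    # gas constant, J/(mol*K)
E = 50e3     # internal energy per mol, J/mol -- hypothetical value
T = 298.15   # absolute temperature, K

def entropy(M):
    """Frequency-based entropy as defined above: -R * sum(p ln p)."""
    p = M / M.sum()
    p = p[p > 0]
    return -R * np.sum(p * np.log(p))

M = np.array([0.5, 1.0, 2.0, 1.0, 0.5])

# Scaling M by a factor leaves p (and hence S) unchanged,
# while U = E * sum(M) grows linearly with the factor.
for scale in (1, 2, 10):
    Mi = scale * M
    U = E * Mi.sum()   # extensive
    S = entropy(Mi)    # intensive: identical for every scale
    G = U - T * S
    print(f"scale={scale:2d}  U={U:.3e}  S={S:.4f}  G={G:.3e}")
```

This is exactly the effect described above: as `scale` grows, the $TS$ term becomes negligible next to $U$.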
Is there some term that I should add to $S$ to reflect the actual number of molecules?
Edit for context: The simulation is a rather macroscopic view of the world. Each voxel has a number of molecules (for ease of use, in mol), and reaction velocities are based on Michaelis-Menten kinetics. Changes are calculated step by step, where each step is one time unit. The direction of any reaction is decided by its change in Gibbs free energy, $\Delta G = \Delta G_0 + RT \ln Q$, where $\Delta G_0$ is the standard free-energy change of the reaction and $Q$ is the reaction quotient.
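The direction rule above can be sketched like this (the reaction, its $\Delta G_0$ value, and the helper name `delta_G` are hypothetical illustrations, not part of the simulation):

```python
import numpy as np

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # absolute temperature, K

def delta_G(dG0, Q):
    """Free-energy change of a reaction at reaction quotient Q.

    dG = dG0 + R*T*ln(Q); dG < 0 means the forward direction
    is favorable at the current concentrations.
    """
    return dG0 + R * T * np.log(Q)

# Hypothetical reaction A -> B with dG0 = -10 kJ/mol, Q = [B]/[A]
dG0 = -10e3

print(delta_G(dG0, 0.1))    # far from equilibrium: strongly forward (dG << 0)
print(delta_G(dG0, 1.0))    # Q = 1: dG equals dG0
print(delta_G(dG0, 100.0))  # past equilibrium: dG > 0, reverse favored
```

At $Q = K_{eq} = e^{-\Delta G_0 / RT}$ the two terms cancel and $\Delta G = 0$, i.e. the reaction has no net direction.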