
Let me first say that I am not a physicist, but I am trying to make a simulation on my computer and I have the following question.


Let's consider that we have three free charges that somehow can change their charge in time. $$ Q_1(t) = \sin(\omega t) \\ Q_2(t) = \begin{cases}1 & \mod(t,2)=1 \\ -1 & \mod(t,2)=0\end{cases} \\ Q_3(t) = 2\cos(2\omega t) $$ where $t$ is the time measured in seconds. We say that all particles have the same mass $m$. We also consider gravitational potential energy to be negligible.
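A minimal sketch of these charge laws (the value of $\omega$ is an assumption on my part, and $Q_2$ is evaluated only at integer seconds, as written):

```python
import math

OMEGA = 1.0  # assumed angular frequency; the question leaves omega unspecified

def Q1(t):
    return math.sin(OMEGA * t)

def Q2(t):
    # As written, Q2 is only defined when mod(t, 2) is exactly 0 or 1,
    # i.e. at integer times; here it is evaluated at integer seconds only.
    return 1.0 if int(t) % 2 == 1 else -1.0

def Q3(t):
    return 2.0 * math.cos(2.0 * OMEGA * t)
```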

I am trying to find

  1. potential energy in time
  2. entropy variation in time

I understand that this might not be solvable analytically, but as I said, I am trying to simulate it on a computer. I have the forces over time, but I really don't want to integrate them spatially to find the potential energy, since my simulations can contain thousands of particles. Could you point me to an easier way, or a simplification that I could try?

About the entropy: do you think it makes no sense because the system is far from equilibrium, or for some other reason? If you think it is a valid thing to measure, could you give me any ideas on how to do it?

Yann
  • Charges 1 and 3 make no sense, considering that charge is a discrete quantity. – Kyle Kanos Aug 04 '14 at 15:17
  • @KyleKanos Continuum mechanics is "wrong", but that doesn't make it wrong. I have no clue as to what the charges here are supposed to represent, or have any guesses as to their units or magnitudes, but surely there are cases where varying a charge continuously is perfectly fine. As for the question, I don't see a way of avoiding integration. If you only have 1000 or so particles, you should be fine if you implement some fast methods (e.g. Ewalds). Potential is easy, but the entropy variation wrt time is tricky. Especially so if the variation frequency is comparable to equilibriation time. – alarge Aug 04 '14 at 15:40
  • @alarge: Continuum mechanics is an approximation, so it's not wrong in any sense of the word. Charge is quantized, so saying it can range between -1e and 1e is contrary to physics (it can only be -1e, 0e, or +1e with nothing between). – Kyle Kanos Aug 04 '14 at 15:43
  • @KyleKanos When 1 C = 6.241e18 e, surely the charge, too, can be considered a continuum quantity (approximately). Again, I am stressing that this all depends on units, and I have no idea what the orbs are supposed to represent in the original question. I should also note that even if they (the orbs) were atoms, these are in typical molecular dynamics simulations given partial charges that are not integral numbers of e, for example 0.301 e (because that's approximately how the electrons are shared). – alarge Aug 04 '14 at 15:48

3 Answers


Correct me if I am wrong, but a potential energy can only be defined for a conservative force field, which means the force can depend only on position. So, because the charges vary with time, you cannot determine a potential energy.

If the velocity is small compared to the oscillation, such that the displacement during the common period of the charges is small compared to the distances between them, you could use the time average of the forces to get some sort of effective potential. This effective potential between $Q_1$ and $Q_3$ will always be zero. With $Q_2$ it is a bit unclear, since its current definition is undefined for non-integer $t$. If you meant $Q_2$ to be a square wave, then there could be a potential depending on $\omega$.
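The claim that the $Q_1$–$Q_3$ term averages to zero can be checked numerically: $\langle \sin(\omega t)\cdot 2\cos(2\omega t)\rangle$ vanishes over a common period. A small sketch (the value of $\omega$ is an assumption; any nonzero value gives the same result):

```python
import math

OMEGA = 1.0  # assumed angular frequency

def time_average_Q1Q3(n=100000):
    """Numerically average Q1(t)*Q3(t) = sin(wt)*2cos(2wt) over one period."""
    T = 2.0 * math.pi / OMEGA
    dt = T / n
    total = sum(math.sin(OMEGA * k * dt) * 2.0 * math.cos(2.0 * OMEGA * k * dt)
                for k in range(n))
    return total * dt / T   # the exact average is 0
```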

I am not sure what you want with entropy, since from the equations you gave you can formulate a differential equation, which has a deterministic solution.

fibonatic

I'll make this an answer, even though it is more of a drawn out comment.

As I mentioned in a comment, computing the potential energy is trivial. If you want speed, you'll probably want to look at fast methods for long-range interactions. The link takes you to state-of-the-art libraries and methods, but any introductory book on computational statistical mechanics (or molecular dynamics) will explain Ewald sums. This is to say that I don't think you will be able to calculate what you want analytically, but these methods should cope well with thousands of particles (or maybe hundreds if you want real-time performance).
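As a concrete baseline before reaching for Ewald methods: for point charges the potential energy is just the closed-form pairwise sum $U = k\sum_{i<j} q_i q_j / r_{ij}$, so no spatial integration of forces is needed. A minimal $O(N^2)$ NumPy sketch (function name and array layout are my own choices):

```python
import numpy as np

K = 8.9875517923e9  # Coulomb constant in SI units

def potential_energy(positions, charges, k=K):
    """Total electrostatic potential energy of point charges.

    positions: (N, 3) array of coordinates; charges: (N,) array.
    Evaluates U = k * sum_{i<j} q_i q_j / r_ij directly over all pairs.
    """
    n = len(charges)
    i, j = np.triu_indices(n, k=1)                       # all pairs i < j
    r = np.linalg.norm(positions[i] - positions[j], axis=1)
    return k * np.sum(charges[i] * charges[j] / r)
```

For thousands of particles this direct sum is often fast enough when vectorized; Ewald-type methods matter once periodic boundaries or much larger systems come into play.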

The entropy is tricky for several reasons. First of all, it is difficult to compute from simulations in general. Also, I don't know whether you can compute absolute entropies at all in general, so what we are left with is the relative entropy between two states. What are the two states? Herein lies the next problem: entropy is an equilibrium quantity. So, say you want to compare the entropies between $t = 1$ and $t = 2$. This is to say that you want systems that are equilibrated with the charges as they would be at $t = 1$ and at $t = 2$. To this effect you can do, for example, thermodynamic integration, which is an expensive computation.
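To sketch that last step (the standard textbook form, assuming a canonical ensemble at fixed temperature $T$): let $H(\lambda)$ interpolate the Hamiltonian between the charges at $t = 1$ ($\lambda = 0$) and at $t = 2$ ($\lambda = 1$). Thermodynamic integration then gives the free-energy difference $$\Delta F = \int_0^1 \left\langle \frac{\partial H(\lambda)}{\partial \lambda} \right\rangle_\lambda \, d\lambda,$$ and the entropy difference follows from $\Delta S = (\Delta U - \Delta F)/T$, where $\Delta U$ is the difference in average energies between the two equilibrated states.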

alarge

The entropy can be written as (in discrete form) $$S = -\sum_i p_i \ln p_i.$$

So you must identify where the uncertainty in your problem comes from. You would think that you have (in principle) exact deterministic equations for the evolution of these particles, so there is no uncertainty in that respect.

If you have unknown initial conditions, then you certainly could compute entropies, because the exact trajectory relations would propagate the uncertain initial conditions. You could then, for example, infer what the charge distribution looks like at some moment in time; that distribution would have an entropy associated with it.
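For instance, if you sample some observable (say, one particle's $x$-coordinate) over many runs with randomized initial conditions, a histogram-based estimate of that distribution's entropy could look like this (the bin count and the sampled quantity are illustrative choices):

```python
import numpy as np

def shannon_entropy(samples, bins=50):
    """Estimate the discrete Shannon entropy S = -sum_i p_i ln p_i
    from a 1-D array of samples, using a histogram as the distribution."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins: 0 * log(0) -> 0
    return -np.sum(p * np.log(p))
```

Note that the estimate depends on the binning, so it is mainly useful for comparing entropies across times within the same setup, not as an absolute number.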

SalmonProtocol