I'm trying to understand Dirac's argument for magnetic monopoles in his 1931 paper$^1$, but it's still a tough read for me (I'm an undergrad physicist).
I can begin to summarize his argument as follows: take a very small closed curve. Since the wave function is continuous, the change in its phase around the curve must be small. Any two wave functions can differ in this phase change only by a multiple of $2\pi$ (because their ratio must have a single-valued phase), and for a small curve that multiple must be zero, so the phase change around the curve is the same for every wave function.
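If I'm reading §2 correctly, the relations are (this is my paraphrase in SI-style notation; Dirac works in Gaussian units and writes the charge factor differently): for any two wave functions around the same closed curve,

$$\Delta\gamma_1 - \Delta\gamma_2 = 2\pi n, \qquad n \in \mathbb{Z},$$

and continuity forces $n = 0$ once the curve is small enough. Dirac then identifies the common value with the electromagnetic flux through the curve,

$$\Delta\gamma = \frac{q}{\hbar}\oint \vec{A}\cdot d\vec{\ell} = \frac{q}{\hbar}\int \vec{B}\cdot d\vec{S}.$$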
The exception is where the wave function vanishes, since there its phase is undefined: the points at which $\psi = 0$ form a nodal line, and around a small closed curve encircling such a line the phase can change by a multiple of $2\pi$ on top of the small flux term.
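Past this point, my best guess at his next step (quite possibly garbled, and again in my own notation) is that a small curve near a nodal line picks up an extra winding,

$$\Delta\gamma = 2\pi n + \frac{q}{\hbar}\int \vec{B}\cdot d\vec{S},$$

and that summing this over small curves tiling a *closed* surface, where the genuine phase changes cancel pairwise along shared edges, gives

$$2\pi \sum_i n_i + \frac{q}{\hbar}\oint \vec{B}\cdot d\vec{S} = 0,$$

so a nodal line that *ends* inside the surface leaves an unbalanced $\sum_i n_i$ and forces the enclosed flux to be a nonzero integer multiple of $h/q$ (which, if I've converted the units correctly, is where his smallest pole strength $\hbar c/2e$ comes from)...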
...but I can't actually verify these steps or connect them into one coherent argument; this is where I get completely lost. To clarify, I'm trying to condense his "nodal line" reasoning into a short, clear argument that a group of fellow undergrads can follow.
1. P.A.M. Dirac, "Quantised singularities in the electromagnetic field", Proc. R. Soc. Lond. A 133, 60–72 (1931).