I understand how the field lines enter and leave the Gaussian surface. But my concern is that the field isn't constant everywhere on the Gaussian surface, i.e., there doesn't exactly exist a $-E\cdot da$ corresponding to every $E\cdot da$. I understand the idea of how the enlargement of the area compensates for the reduction in the field, but I want it mathematically. If you could prove it for a point charge enclosed by an irregular surface, the job would be done.
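To be concrete, here is the statement I want proven, where $r$ is the distance from the charge $q$ to the surface patch $da$ and $\hat{n}$ is the outward unit normal of the patch:

$$\oint_S \mathbf{E}\cdot d\mathbf{a} \;=\; \frac{q}{4\pi\varepsilon_0}\oint_S \frac{\hat{r}\cdot\hat{n}}{r^2}\,da \;=\; \frac{q}{\varepsilon_0},$$

for *any* closed surface $S$ enclosing $q$, however irregular.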
I have, however, made an attempt to prove it in the image below, where I basically showed that the flux doesn't depend on $r$ for an infinitesimally small surface $da$. But I am not satisfied with it. Please help. Ignore the proof if it isn't good enough or doesn't make sense; I am just a fresher, hence a noob. [Proof]
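In case the image doesn't load, the step I did manage is roughly this: for an infinitesimal patch $da$ whose outward normal makes an angle $\theta$ with $\hat{r}$,

$$d\Phi \;=\; \mathbf{E}\cdot d\mathbf{a} \;=\; \frac{q}{4\pi\varepsilon_0}\,\frac{\cos\theta\,da}{r^2} \;=\; \frac{q}{4\pi\varepsilon_0}\,d\Omega,$$

where $d\Omega = \cos\theta\,da/r^2$ is the solid angle the patch subtends at the charge, so $d\Phi$ is indeed independent of $r$. What I can't do rigorously is show that these $d\Omega$ add up to $4\pi$ over an arbitrary irregular closed surface enclosing the charge.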