"Normalized" can be an ambiguous word, but the meaning that captures most of what it is meant to in quantum mechanics is that Parseval's and Plancherel's theorems should hold: the inner product between quantum states in $L^2$ must equal the appropriate inner product between the corresponding superposition weight functions when those states are resolved in a new co-ordinate system with new basis states. That is, all our transformations between co-ordinate systems must be unitary, so that calculations of inner products, probabilities and the like carry over seamlessly.
When the new basis states form a continuum (a nondiscrete set), this definition reduces to the equation you cite. Applied to a discrete set of basis states, it reduces to the definition more familiar to you.
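The discrete version of this statement can be checked directly. Here is a minimal numerical sketch, assuming NumPy: with `norm="ortho"` the discrete Fourier transform is a unitary map, so the discrete Parseval theorem holds and inner products computed in either basis agree.

```python
import numpy as np

# Discrete Parseval's theorem: with norm="ortho", the FFT is a unitary
# map, so inner products are preserved when states are re-expressed in
# the transformed basis.
rng = np.random.default_rng(0)
N = 64
f = rng.standard_normal(N) + 1j * rng.standard_normal(N)
g = rng.standard_normal(N) + 1j * rng.standard_normal(N)

F = np.fft.fft(f, norm="ortho")
G = np.fft.fft(g, norm="ortho")

lhs = np.vdot(f, g)   # inner product in the original basis
rhs = np.vdot(F, G)   # inner product in the transformed basis
print(np.allclose(lhs, rhs))   # True: the transformation is unitary
```

Without `norm="ortho"` the transform is only unitary up to a factor of $\sqrt{N}$, which is precisely the kind of convention difference that the various meanings of "normalized" are meant to pin down.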
Also, different problems have different domains of definition and different state spaces. The generalized Legendre functions crop up both in 2D problems and as part of the 3D spherical harmonics, so the notion of normalization will vary according to the problem domain.
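As a numerical sketch of the "standard recipe" sense of normalization (assuming SciPy is available), the textbook normalization integral for the associated Legendre functions $P_\ell^m$, the one that feeds into the orthonormal spherical harmonics, can be verified directly:

```python
import numpy as np
from math import factorial
from scipy.integrate import quad
from scipy.special import lpmv

# Standard normalization integral for the associated Legendre functions:
#   int_{-1}^{1} [P_l^m(x)]^2 dx = 2/(2l+1) * (l+m)!/(l-m)!
# This is the weight used to build orthonormal spherical harmonics.
l, m = 3, 2
integral, _ = quad(lambda x: lpmv(m, l, x) ** 2, -1.0, 1.0)
expected = 2.0 / (2 * l + 1) * factorial(l + m) / factorial(l - m)
print(integral, expected)   # both = 240/7, approximately 34.2857
```

Note this "normalization" is a convention about scaling special functions over their natural domain, not the unitarity condition above; the two meanings coincide only once the full problem (domain, measure and basis) is fixed.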
Lastly, "normalized" can also mean "scaled according to a standard recipe". Hence the particular definition of the Legendre function normalization you cite.
There is no universal definition, so you must check each author carefully, although within quantum mechanics you will not find any usage of "normalized" other than the first one I gave.
Further Background
What's tricky about the position and momentum observables is that they have continuous spectra. Their eigenfunctions are also unnormalizable, since they are not square integrable. The two properties - continuous spectrum and unnormalizability - go hand in hand, as discussed in Qmechanic's summary of the reasons for "discreteness" in some parts of quantum mechanics.
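The unnormalizability is easy to see numerically. A minimal sketch, assuming NumPy: a momentum "eigenfunction" $\psi_k(x) = e^{ikx}$ has $|\psi_k(x)|^2 = 1$ everywhere, so its $L^2$ integral over $[-L, L]$ is $2L$ and diverges as $L \to \infty$.

```python
import numpy as np

# The plane wave e^{ikx} has |psi|^2 = 1 everywhere, so its L^2 integral
# over [-L, L] is 2L: it grows without bound, i.e. the eigenfunction
# cannot be normalized on the whole real line.
k = 1.7
norms = {}
for L in (10.0, 100.0, 1000.0):
    x = np.linspace(-L, L, 200001)
    psi = np.exp(1j * k * x)
    norms[L] = np.sum(np.abs(psi) ** 2) * (x[1] - x[0])
    print(L, norms[L])   # approximately 2L, diverging with L
```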
This means that their eigenfunctions do NOT belong to the usual separable Hilbert space discussed in quantum mechanics.
Let that sink in a bit: eigenfunctions of the position and momentum operators simply do not exist in our usual quantum state space. This fact is often not emphasized enough. Indeed, it is often glossed over: the mysterious Dirac delta is whipped out in the lecture in question, and students who question this weird beast are duly made to feel inadequate because they can't understand the magic instantly.
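The Dirac delta in question is not magic; it is a limit. A hedged numerical sketch (assuming NumPy; the test function $e^{-u^2}$ is my choice for illustration): truncating the overlap integral of two plane waves at momentum cutoff $K$ gives the kernel $\frac{1}{2\pi}\int_{-K}^{K} e^{iku}\,dk = \frac{\sin(Ku)}{\pi u}$, and integrating this against a smooth test function picks out its value at $u = 0$ ever more accurately as $K$ grows - which is exactly the defining behaviour of $\delta(u)$.

```python
import numpy as np

# The "delta normalization" <x|x'> = delta(x - x') as a limit: the
# truncated kernel sin(Ku)/(pi u), u = x - x', integrated against a
# smooth test function f, converges to f(0) as the cutoff K grows.
u = np.linspace(-10.0, 10.0, 400000)   # grid chosen so u = 0 is not sampled
du = u[1] - u[0]
f = np.exp(-u**2)                      # smooth test function, f(0) = 1

results = {}
for K in (2.0, 5.0, 10.0):
    kernel = np.sin(K * u) / (np.pi * u)
    results[K] = np.sum(kernel * f) * du
    print(K, results[K])   # approaches f(0) = 1 as K grows
```

(For this Gaussian test function the error is $1 - \operatorname{erf}(K/2)$, so the convergence is very fast.)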
Something highly nontrivial is going on here: we have to build a whole new framework - that of the rigged Hilbert space - even to talk about such eigenfunctions, and to gain the ability to build an arbitrary quantum state in $L^2$ out of the eigenvectors of a noncompact operator, which now do exist in the rigged Hilbert space. I give a detailed discussion of rigged Hilbert space here.
But when we do build the eigenfunction expansion in the rigged Hilbert space, the superposition cannot be discrete: we must now use an integral. The relationship between this integral and the resolution of a quantum state into a countably infinite set of eigenvectors is much like the relationship between the Fourier transform and Fourier series.
Since we are now working with a broadened definition of quantum state space, with integral instead of sum superpositions, it's not surprising that the notion of normalization must broaden too. The equation you cite is precisely the broadening that keeps the transformation into position / momentum co-ordinates unitary.
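To close the loop, here is the continuum analogue of the discrete Parseval check, as a numerical sketch assuming NumPy: Plancherel's theorem $\int |\psi(x)|^2\,dx = \int |\tilde\psi(k)|^2\,dk$ for the position $\to$ momentum transform, verified on a grid for a Gaussian wavepacket (the packet parameters are illustrative choices).

```python
import numpy as np

# Plancherel's theorem for the position -> momentum transform:
#   int |psi(x)|^2 dx = int |psi_tilde(k)|^2 dk,
# checked on a grid for a normalized Gaussian wavepacket. The FFT
# samples psi_tilde at k_n = 2*pi*n/L, so the momentum measure is
# dk = 2*pi/L.
N, L = 4096, 40.0
dx = L / N
x = (np.arange(N) - N // 2) * dx
psi = (1 / np.pi) ** 0.25 * np.exp(-x**2 / 2) * np.exp(2j * x)

psi_k = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
dk = 2 * np.pi / L

norm_x = np.sum(np.abs(psi) ** 2) * dx
norm_k = np.sum(np.abs(psi_k) ** 2) * dk
print(norm_x, norm_k)   # both approximately 1: the transform is unitary
```

The factor $dx/\sqrt{2\pi}$ is exactly the "broadened normalization" at work: it converts the raw DFT sum into the continuum convention $\tilde\psi(k) = \frac{1}{\sqrt{2\pi}}\int \psi(x)\,e^{-ikx}\,dx$, so that norms and inner products carry over seamlessly between the two representations.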