Although I understand the basics of single-slit diffraction, the diffraction grating confuses me.
A diffraction grating has many slits per unit length; for convenience, say there are 5 slits per centimeter. Now if the source is quite far from the slits and a screen is placed about 5 cm behind them, diffraction fringes form on the screen: a pattern of minima and maxima. If we consider only two slits, the presence of minima is easy to understand. But this time there are many slits, so shouldn't the minima be brighter compared to single/double-slit diffraction? The wavelets emerging from the surrounding slits might not have exactly the phase difference needed to cancel each other. I don't know if this is correct, but I think the brightness of the minima shouldn't be much less than that of the maxima. I know that in single-slit diffraction the minima do have a little brightness, which usually gets ignored or cannot be perceived.
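To make the comparison concrete, here is a small numerical sketch I tried (my own; it assumes the standard idealized N-slit interference formula with equal amplitudes from every slit, and it ignores the single-slit envelope). It compares how dark the minima are and how wide the bright fringes are for different numbers of slits:

```python
import numpy as np

def grating_intensity(phi, n):
    """Idealized N-slit interference intensity, normalized so the
    principal maxima equal 1.  phi is the phase lag between adjacent
    slits, phi = 2*pi*d*sin(theta)/lambda."""
    den = np.sin(phi / 2.0)
    num = np.sin(n * phi / 2.0)
    out = np.ones_like(phi)          # limit (num/den)^2 -> n^2 at phi = 2*pi*m
    mask = np.abs(den) > 1e-9
    out[mask] = (num[mask] / den[mask]) ** 2 / n**2
    return out

# Phases spanning one full period between principal maxima.
phi = np.linspace(1e-6, 2 * np.pi - 1e-6, 200001)
for n in (2, 5, 20):
    I = grating_intensity(phi, n)
    # Fraction of phases brighter than half the principal maximum:
    # a rough measure of how wide the bright fringes are.
    frac = np.mean(I > 0.5)
    print(f"N={n:2d}: deepest minimum = {I.min():.2e}, "
          f"fraction above half-max = {frac:.3f}")
```

If this formula is right, the minima stay essentially zero for any number of slits, and the bright fringes actually get narrower as N grows, which is the opposite of what my intuition above suggested.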
Another question: is there only one central bright maximum in a diffraction grating, or many, depending on the number of slits? By central maximum I mean the fringe with maximum brightness/intensity.