I was recently taught that the angle of diffraction of waves depends on the width of a gap relative to the wavelength. For example, if a plane wave with a shorter wavelength hits a gap in an obstacle, the angle of diffraction will be smaller, since the gap is larger compared to the wavelength, and vice versa.
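To make this concrete, I believe the relation behind this is the standard single-slit condition for the first diffraction minimum (I'm introducing $a$ here for the gap width):

$$\sin\theta_{\text{min}} = \frac{\lambda}{a}$$

so only the ratio $\lambda/a$ sets the angle: halving the wavelength while keeping the gap fixed halves $\sin\theta_{\text{min}}$.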
I got confused when considering a wave hitting the edge of an obstacle (of effectively infinite width). I was taught that in this case the angle of diffraction depends entirely on the wavelength:
[figure: diffraction around an edge for a shorter wavelength] and [figure: diffraction around an edge for a longer wavelength]
However, unlike the previous situation, where the gap provides a size reference, here there is nothing for the wavelength to be compared against! In other words, the two scenarios in the images are exactly the same if you shift your perspective a bit by "zooming in" on the latter one (assuming the width of the barrier doesn't matter). So why do their angles of diffraction still differ?
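To show what I mean by the scaling argument, here is a quick numerical sketch for the gap case, using the single-slit relation above (the function name `first_min_angle` is just mine for illustration):

```python
import numpy as np

def first_min_angle(wavelength, gap):
    """First-minimum diffraction angle (radians) for a single slit:
    sin(theta) = wavelength / gap (valid when wavelength <= gap)."""
    return np.arcsin(wavelength / gap)

# Scaling wavelength and gap by the same factor leaves the angle unchanged,
# which is the "zooming in" intuition: only the ratio wavelength/gap matters.
theta1 = first_min_angle(500e-9, 2e-6)          # 500 nm wave, 2 um gap
theta2 = first_min_angle(5 * 500e-9, 5 * 2e-6)  # everything scaled by 5
print(np.degrees(theta1), np.degrees(theta2))   # identical angles
```

But at an edge there is no $a$ to rescale, which is exactly why I can't see where a reference length for the wavelength comes from.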