I have two parts to this question.
First, I understand that the meter is defined as the distance light travels in 1/299,792,458 of a second. But how is this distance actually measured? The second obviously comes from an atomic clock, but Wikipedia makes it sound as though the distance is obtained by counting wavelengths. For a wavelength count to be useful, you must know the physical length of one wavelength, which depends on the speed of light, which is defined in meters; and to increase the accuracy of the measurement, NIST recommends using a specific laser wavelength (which is itself quoted in meters). Doesn't this make the actual measurement of the meter circular? Or does the fact that the speed of light is a defined constant in meters per second overcome the apparent circularity? The way I see it, the circularity would bite if the speed of light ever changed: the length of the meter would change along with it, making it impossible for us to know the speed had changed without referencing back to an older physical artifact.
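To make my mental model concrete, the chain as I read it from Wikipedia (so this may itself be wrong) is:

$$\lambda = \frac{c}{f}, \qquad c = 299{,}792{,}458~\mathrm{m/s}\ \text{(exact by definition)},$$
$$L = N\lambda = N\,\frac{c}{f},$$

where $f$ is the laser frequency measured against the caesium second and $N$ is the number of wavelengths counted. The meter only enters through the defined value of $c$, which is exactly where the circularity seems to live.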
Further, the measurement is done in a vacuum, but we cannot actually create a perfect vacuum, so I assume there is an allowed pressure range for the vacuum, and pressure measurements are themselves based on the definition of the meter. (I would think this adds further circularity to laboratory measurements performed in air and corrected for refraction.)
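Concretely, the in-air correction I have in mind is something like

$$\lambda_{\text{air}} = \frac{\lambda_{\text{vac}}}{n(T, p, \text{humidity})}, \qquad n - 1 \approx 3\times10^{-4}\ \text{for typical lab air},$$

so skipping the correction would shift a measured length by very roughly 300 parts per million, and determining $n$ requires a pressure measurement whose unit (the pascal, $\mathrm{N/m^2}$) already contains the meter.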
Second, how is this measurement of the meter actually used to calibrate physical objects? For example, if I buy a meter stick that has been calibrated against the national standard, how do they actually compare the physical length of the stick against the wavelength measurements?
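For scale (if my arithmetic is right), with the commonly used 633 nm helium-neon laser a single meter spans

$$N = \frac{1~\mathrm{m}}{633\times10^{-9}~\mathrm{m}} \approx 1.58\times10^{6}$$

wavelengths, so a direct comparison would seem to involve counting on the order of a million fringes interferometrically. Is that really what is done, or is there an intermediate transfer standard?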