If there were any significant random variations in photon velocity, then when a telescope observed a distant galaxy or other object moving with a high transverse speed, the object's image would be smeared: the positions of the early-arriving photons would not line up with the positions of the laggards. The absence of any such smearing should place a bound on how much photon velocity could vary on a macro scale.
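A quick back-of-envelope sketch of the arrival-time spread involved, using purely illustrative numbers (the distance and the fractional speed spread are assumptions, not measured values):

```python
# Back-of-envelope: spread in photon arrival times from a hypothetical
# fractional spread in photon speed. Numbers are illustrative assumptions.

def arrival_spread_years(distance_ly, dv_over_c):
    """Approximate time spread (in years) between the fastest and slowest
    photons after travelling distance_ly light-years, given a small
    fractional speed spread dv_over_c. To first order, the light-travel
    time is distance_ly years, and the spread is that times dv_over_c."""
    travel_time_years = distance_ly  # light covers 1 ly per year
    return travel_time_years * dv_over_c

# A source 1 billion light-years away with a (hypothetical) 1e-9 spread in c:
print(arrival_spread_years(1e9, 1e-9))  # -> 1.0 (year)
```

The point is that the enormous travel distance acts as a lever arm: even a tiny fractional variation in speed accumulates into a macroscopic arrival-time spread.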
Galaxies don't move fast enough for this to be an issue. Sure, they move at a few hundred km/s, which is a lot faster than anything we see on Earth, but at a few hundred km/s a galaxy covers only a tiny fraction of its own length (at least several tens of thousands of light-years) within a human lifetime. No way would such an effect ever be measurable for something like a galaxy.
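To sanity-check that claim with concrete (assumed) figures — 300 km/s transverse speed, a 100-year observing window, and a 50,000 light-year galaxy length:

```python
# How far does a galaxy moving at a few hundred km/s drift in a human
# lifetime, as a fraction of its own size? All inputs are assumptions.
KM_PER_LY = 9.461e12        # kilometres in one light-year
SECONDS_PER_YEAR = 3.156e7

v_kms = 300.0                          # assumed transverse speed, km/s
lifetime_s = 100 * SECONDS_PER_YEAR    # assumed 100-year window
galaxy_len_ly = 50_000.0               # "several tens of thousands of ly"

drift_ly = v_kms * lifetime_s / KM_PER_LY
fraction = drift_ly / galaxy_len_ly
print(f"drift = {drift_ly:.3f} ly, fraction of galaxy length = {fraction:.1e}")
```

With these numbers the drift comes out to roughly a tenth of a light-year, around a millionth of the galaxy's length, which is the arithmetic behind the "tiny fraction" claim.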