Interesting paper suggests that the speed of light may fluctuate in vacuum: http://www.popsci.com/science/article/2013-03/speed-light-vacuum-varies-slightly-study-finds?src=SOC&dom=fb

^ These ideas may be testable, according to the article. But until they actually are tested, it's bullshit.

Soooo. The speed of light may not be constant in a vacuum... which is not really a vacuum, because virtual particles interact with the photon and keep it from going in a straight line. Did I get that right? A true vacuum can't exist anyway: Nothing would have at least the single property of the absence of something, which is not nothing, and would immediately decay into virtual particles; cue Big Bang. So the point is moot. Also, the photon is not slowed down, it is just delayed or distracted, basically zigzagging around instead of going straight from A to B, but still doing so at a constant light speed. Or did those scientists say something different?

Is this actually a new finding? I can think of a few reasons why the actual speed of light is already known to fluctuate – vacuum is not absolute vacuum, and the uncertainty principle guarantees that the speed of light, or rather, individual photons, is non-constant. Also, it is already known that particles do tend to pop in and out of existence all the time. What in this study is inconsistent with what we know? Does the speed of light fluctuate more than predicted by other studies?

This effect must be incredibly small to have gone unnoticed until now. I honestly doubt it's real.

In an inertial reference frame the speed of light should be constant. Has anyone seen a research-grade paper that computes the backreaction of the EM vacuum on the geodesic of a photon?

If the velocity of a photon were constant, you could determine it by measuring its average speed. But if you knew the velocity of the photon at every moment (or any moment, for that matter), then by Heisenberg's uncertainty principle the photon would have to occupy the entire universe. Since the vast majority of photons are somewhere, their velocity vectors can't be constant. Of course, the change could be in both the direction and the magnitude of the velocity vector, but suffice it to say, the photon is not moving at a constant speed in a constant direction. What's more, individual photons can occasionally have a speed larger than the speed of light; in fact they can briefly move at any speed, thanks to the uncertainty principle. When you average over the photon's entire journey, however, the average speed is exactly the speed of light, once you subtract any interactions of the photon with other matter.
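That last averaging claim can be sketched with a quick toy simulation (purely illustrative; the ±1% symmetric noise model is an assumption of mine, not anything from the study):

```python
import random

C = 299_792_458.0  # speed of light in vacuum, m/s

random.seed(42)
n_steps = 100_000

# Toy model: at each "step" the photon's instantaneous speed is c plus
# symmetric random noise (assumed +/-1% of c, purely for illustration).
speeds = [C + random.uniform(-0.01 * C, 0.01 * C) for _ in range(n_steps)]

# Time-averaged speed over the whole journey: the fluctuations cancel
# and the mean comes out at c to within sampling error.
avg = sum(speeds) / n_steps
print(f"average speed / c = {avg / C:.6f}")  # very close to 1
```

Strictly this is an arithmetic mean over equal time steps, which matches the "add up all points in time" phrasing; averaging over equal distances would weight the speeds differently.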

If there were any significant random variations in photon velocity, then when a telescope looked at a distant galaxy or other object moving with a high translational speed, the object's image would be smeared. The position of early arriving photons would not match up with the position of the laggards. This effect should put a bound on how much the velocity could be varying on a macro scale.

Galaxies don't move fast enough for this to be an issue. I mean, sure, they move at a few hundred km/s, which is a lot faster than anything we see on Earth, but a few hundred km/s means that they move just a tiny fraction of the length of the galaxy (which is at least several tens of thousands of light years) within a human lifetime. No way would such an effect ever be measurable for something like a galaxy.
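A quick back-of-the-envelope in Python backs that up (all numbers assumed: 300 km/s, an 80-year lifetime, a 50,000-light-year galaxy):

```python
KM_PER_LIGHT_YEAR = 9.4607e12   # kilometres in one light year
SECONDS_PER_YEAR = 3.156e7      # seconds in one year

v_km_s = 300.0             # assumed galaxy peculiar velocity, km/s
lifetime_years = 80.0      # assumed human lifetime
galaxy_size_ly = 50_000.0  # assumed galaxy diameter, light years

# Distance drifted over a lifetime, converted to light years.
drift_ly = v_km_s * SECONDS_PER_YEAR * lifetime_years / KM_PER_LIGHT_YEAR
print(f"drift in a lifetime: {drift_ly:.3f} light years")

# As a fraction of the galaxy's own length: vanishingly small.
print(f"fraction of galaxy length: {drift_ly / galaxy_size_ly:.2e}")
```

That works out to roughly 0.08 light years of drift, about two millionths of the galaxy's length.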

^ Well, let me crunch some numbers. If there were a galaxy moving in translation at 300 km/sec (3.0e5 m/sec, or 0.1% of c), then the lead angle for the light rays should be asin(0.001), or 0.057296 degrees, which is 206.265 arc seconds. If half of the photons actually travelled at 299e6 m/sec, their lead angle would be 206.955 arc seconds, a 0.69 arc-second difference, which should be observable with current telescopes, a bit like gravitational lensing. I don't know if that's a valid way to look at it, but if it is, it would at least set a limit on how big the effect could be.
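Here's the same arithmetic in Python so anyone can check it (using my rounded c = 3.0e8 m/sec and the assumed 299e6 m/sec for the slow photons):

```python
import math

ARCSEC_PER_RADIAN = 180.0 / math.pi * 3600.0

v = 3.0e5       # galaxy translational speed, m/s
c = 3.0e8       # speed of light, rounded as in the post, m/s
c_slow = 299e6  # assumed speed of the "slow" half of the photons, m/s

def lead_angle_arcsec(photon_speed):
    """Lead angle asin(v / photon_speed), converted to arc seconds."""
    return math.asin(v / photon_speed) * ARCSEC_PER_RADIAN

normal = lead_angle_arcsec(c)
slow = lead_angle_arcsec(c_slow)

print(f"lead angle at c:          {normal:.3f} arcsec")
print(f"lead angle at 299e6 m/s:  {slow:.3f} arcsec")
print(f"difference:               {slow - normal:.2f} arcsec")  # ~0.69
```

The difference comes out at about 0.69 arc seconds, matching the figures above.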