In The Walking Dead universe, everyone turns once they die.
OK... now I have to ask (bear in mind, I've never seen the show): if zombism is the inevitable result of every death in TWD, then why does anybody on this show do anything at all? If they're all doomed to become zombies anyway, what's the point of anything?

Are they trying to find a cure or something? Because that would pretty much be the only thing worth fighting for in a world where everyone is destined to become a zombie.