I feel like a lot of zombie fiction where characters already know what zombies are and the danger of getting bitten ends up being semi-satirical comedy. Movies and shows where the idea of zombies didn't previously exist seem to be a bit more serious, from what I've seen. I don't know if it's the aura of suspense and mystery, or because it leads to more pandemonium.
…in the same way Christianity is based on the teachings of Jesus.
There’s not much in common.