I feel like a lot of zombie fiction where characters already know what zombies are and the dangers of getting bitten ends up being semi-satirical comedy. Movies and shows where the idea of zombies didn't previously exist seem to be a bit more serious, from what I've seen. I don't know if it's the aura of suspense and mystery, or because it leads to more pandemonium.
Zombies are just a tool. The best monster media is always about something else that's more important than the zombies: trust in strangers (Dawn of the Dead), the breakdown of society (28 Days Later), moral decisions (The Last of Us), or the distribution of rights and the power of corporations (Resident Evil). The rest is an excuse for action and gore.