I am not a huge fan of zombie-themed shows and movies. However, The Walking Dead is a great show. It is not just about zombies and death; it's also about the drama that goes on among the people who are still alive. What I love most about the show is that it doesn't focus on the apocalypse itself. It focuses on survival, and on trying to lead a normal life in spite of it.