"This isn't a democracy anymore!"
With those words from Rick (leader of a band of people who have come together to try to survive some sort of zombie apocalypse), season two of The Walking Dead came to an end, and my excitement about the possible political implications for the next season began!
Could some real, deep social commentary about America finally be at the forefront of this series? The show has had its ups and downs in terms of plot, but it has mostly been entertaining to watch. Still, entertaining is all it has been, so the possibility of a show that both entertains and makes some sort of commentary on current events or the state of society has me very excited. Of course, this is all in the writers' hands...
But think of all that could be done: Could the zombie apocalypse represent a failed capitalist system? Will the zombies come to represent people who vote? Will the band of lone survivors become some sort of metaphor for a future society, possibly even socialism? Or will it be another unfortunate, cynical story about so-called human nature?
What gives me hope is that a lot of good social commentary has already been made in zombie movies. George Romero's cult classics "Night of the Living Dead" and "Dawn of the Dead," which deal with issues of racism, inequality, and consumerism, come to mind. What do people think?