I've just finished the first half of Season Three of The Walking Dead.
I continue to enjoy this show, which has me hooked on the storyline episode to episode. I'm a big fan of Romero-esque post-apocalyptic horror/disaster stuff, but dang, I think it's time to make a zombie movie/series NOT in the USA. It's getting depressing watching the tattered remains of American humanity continue to kill each OTHER instead of the zombies!
Maybe we need a Scandinavian one, where survivors will cooperate with one another to rebuild society, and teach the zombies to be vegetarian or something...


