The Walking Dead is just the latest in a long line of TV shows rumored to have a movie version in the works. Of course, some eventually do make the transition -- Sex And The City, The X-Files -- while others seem destined to remain the stuff of fan-frustrating "what ifs" -- Lost, The Sopranos. Still, given the enormous popularity of AMC's gruesome series, there is always a possibility that we will eventually see a feature film, and Bloody Disgusting has heard that it might just happen. The site reports:
I have some friends working closely with “The Walking Dead” crew and AMC who confirm with me rumblings of a feature film. But before you get your panties in a bunch, we’re only in the third season of the popular zombie show – adapted from Robert Kirkman’s astounding comic book – and when I say rumblings, I strongly suggest the idea has been passed off in non-business conversations.
In other words, nothing official...at all. But given that the show was originally shopped as a movie before landing at a network, there is nothing to say these "rumblings" won't become a reality and we won't eventually get a film to finish up the series. Of course, as pointed out above, The Walking Dead is only in its third season, so any movie would most likely be quite a few years away one way or the other. Would you guys like to see the show end that way? Or would you rather we got one last season, complete with a nail-biting TV finale?