I went and saw "Battle Los Angeles" last night, where the world pretty much gets its ass kicked by a bunch of aliens. Most major cities are leveled, and thousands (millions?) of people are dead. One might consider it a mini-apocalypse, of sorts. Sure, the human race survives, but at what cost?
That said, it ends on a very positive note, and it got me thinking. Are there any positive, optimistic movies/books about rebuilding society after a huge apocalyptic event? Most of the time we see post-apocalyptic worlds where everyone is living in ruin, goofy monsters are running around, and everything about it makes the audience want to kill themselves. Is there enough drama to make an uplifting story about a post-apocalyptic world?