My girlfriend has been binge-watching Breaking Bad all week. We just finished it last night. I have very mixed feelings about that show.
Don't get me wrong, Breaking Bad is a work of art. An outstanding achievement. But at some point, mostly in season 5, the show got so cynical, dark, and cruel that it just kinda stopped being entertaining and started depressing me. I understand that the real world is often like that, but that doesn't make it...I dunno...enjoyable to experience. Very satisfying series finale, though, despite my iffy feelings about the last season in general.