I'm a big fan of The West Wing and have always contended that it was first and foremost (at least for the first 5 seasons) a workplace drama rather than a political show, although a bit of political ideology was always there. Key to what? The stories are still about people. Similar stories could be told on a ranch in the 1870s or a hospital in the 2010s.
It was primarily a TV show about a bunch of people who work together and the events that happen during the course of doing their jobs...with their workplace just happening to be the White House, and their boss just happening to be the President.
It wasn't until seasons 6 and 7 that the show started floundering in its own "politicalness" and went downhill. I wanted to see a show about the co-workers I had grown to like, not a show all about politics or social ideologies.