You don't get ratings worthy of broadcast TV by showing nipples and dropping f-bombs, dude.
No, but it adds a level of realism absent from other shows. For example, take the decent but sometimes anticlimactic NBC drama Revolution.
Would the story have worked better if they had been able to add more realism in terms of profanity, violence, and even nudity?
No. If you want realism, then turn off the TV and go outside. TV should offer a respite from "realism", but sadly it no longer does, which is why I never watch anything being made anymore.
It might be better to create a new topic for this, but the question of whether TV ought to be realistic is an interesting debate.
I like both realistic entertainment and fantasy entertainment, and both certainly have their place. Either way, a show needs to be realistic enough that the decisions the characters make are believable. If you don't believe the characters' decisions, it's hard to care about them.
And however realistic you try to make your show look, there's always an element of unreality, simply because that's what telling a dramatic narrative requires.
I've also noticed that when a show gets called "realistic", people usually just mean "depressing". And when people say they don't want to watch realistic shows, they often mean they want to watch uplifting shows where the heroes win. These two really shouldn't be conflated: good and bad things happen in reality all the time. The reason for the perceived shift toward realism in television is that shows like Full House are going away. That's not a shift toward realism, though; it's a shift away from triteness, away from characters and stories so idyllic they're completely unrelatable.