No. Let's see.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
As a Starfleet officer, he would be expected to order subordinates to their deaths if necessary. We saw this demonstrated when Troi was trying to become a commander. Since we've seen Data in command of the Enterprise on occasion, we can assume he has passed the command exams. I can't see how that would be possible if he were forced to obey this law.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
By Federation principles, forcing a sentient lifeform (which Data is considered to be under Federation law) to follow any order given to it by a human would be considered slavery. Obeying the orders of a senior officer is one thing, but any human?
Data's line in First Contact, "To hell with our orders," also lends weight to this argument.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
From what we've seen, Data does appear to follow this law: he has a sense of self-preservation. Yes, he has more than once put himself in danger, and even ended his own existence, but in every case it was to save human lives.
All things considered, I'd say the latter is a personal choice (and a logical one), but no, he doesn't follow the laws.