The thing that really sticks in my craw in the whole franchise is the notion of "sentient," "living," or whatever machines. I think it's just stupid, even if some stories built around this misconception are actually interesting (kinda like vampire stories: absurd, but entertaining). The whole point of making a machine is that it can do work that is too dangerous, too tedious, too demanding, or too expensive for a human being to do. If you build sophisticated tools after years and years of research and people tell you, "You can't use those tools because they are alive," whatever the hell that means, then what's the point?
Seriously? What's the point? There are lots of potential answers to this. One frequently bandied around these days is that super-intelligent AI is an eventual certainty, so rather than create it by accident (which could result in a self-aware entity that cares nothing about us), we should consciously attempt to create self-aware AI with ethics and morality that will protect us rather than destroy us.
There's also just the whole god complex thing, which has a longstanding basis in sci-fi. Humans like to create things, from great works of art to fine buildings to weapons of mass destruction. Creating another self-aware being (other than the normal way, with your gametes) would be the ultimate power trip.
Plus, intelligence and sentience (and/or sapience) are very different things. Intelligence can be simulated at the highest level by a program that is no more sentient than your toaster, as the toy sketch below illustrates.
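To make that concrete, here's a minimal ELIZA-style sketch (hypothetical and illustrative only; the rules and names are made up). It produces superficially "intelligent" conversation through nothing but pattern matching and string substitution, with no awareness anywhere in the loop:

```python
import re

# (pattern, response template) pairs; \1 echoes the user's own words back.
# These rules are invented for illustration -- a real chatbot just has more of them.
RULES = [
    (re.compile(r"i feel (.*)", re.I), r"Why do you feel \1?"),
    (re.compile(r"i think (.*)", re.I), r"What makes you think \1?"),
    (re.compile(r"\bmachines?\b", re.I), "Do machines worry you?"),
]
FALLBACK = "Tell me more."

def reply(user_input: str) -> str:
    """Return a canned response; there is no model of self or world in here."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # expand() substitutes any \1 backreference with the user's words.
            return match.expand(template)
    return FALLBACK

if __name__ == "__main__":
    print(reply("I feel machines will replace us"))  # Why do you feel machines will replace us?
    print(reply("Hello there"))                      # Tell me more.
```

Scale the rule table up far enough and the output starts to look thoughtful, but the mechanism never changes: lookup and substitution, no inner experience required.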
You have no way of knowing that.
Presuming that we live in a totally materialist world (no souls or other non-physical basis for consciousness), a self-aware machine, or at least self-awareness within a machine, is inevitable. At absolute worst, you'd just need to simulate a human brain down to the atomic level. Since the simulation would contain structure identical to the "real" thing, there's no reason to think it wouldn't be as self-aware as a "real" human.