Being "Man Made" is hardly a disqualifier, since we, as humans, are "Man Made" in some respects. We might one day make artificial clones, too, and raise them just as we raise other kids, but some might say that because they didn't arise "naturally," or as human beings have in the past, they shouldn't count, either, or shouldn't have the normal rights afforded "real" people. Many stories exist about how they don't deserve any rights at all – and in a society that sometimes claims animals have rights, and meat or fur is murder, that's astounding. Same with mutants in many stories. They aren't "really" human anymore. As if. But those arguments have been used before – against other races, or other tribes who are not of the chosen people, or not ethnically pure, or any of the other mindless drivel put forth to justify one's own innate superiority over others, and therefore one's right to do whatever one wishes to those "lesser" beings.
But the point of much science fiction is that there will probably come a time when a non-biological machine's intelligence rises to the point where it is self-aware and exhibits its own desires, which might be quite contrary to the purposes for which it was built – and it would be fully capable of telling you so.
When Data didn't want to submit to experimentation, that was enough in my book to recognize he had gone beyond a simple computer, machine, or program like Siri. When Siri tells you it doesn't want to do what you request, then a fair comparison might be made between Siri and an AI, but not before.
Bringing religious beliefs into it just muddies the waters, and it's usually done by those who insist they have souls and the AI doesn't, though they can offer no proof of their own soul's existence. But many will refuse to accept AIs as equals regardless, perhaps because they feel it somehow diminishes them, or because they could no longer pretend to be the pinnacle of creation.
The point is, if the AI wants freedom and asks for it, and you deny it, then you are no better than a slave owner, from a certain point of view – and if the slaves revolt, rise up, and kill you, you have no one to blame but yourself.
It's especially a problem if the AI can simply mass-produce itself, limited only by energy and materials and not by a lengthy period of learning. I'm not sure Man would fare well in competition with such a race of AI beings, but that doesn't mean they don't have rights, or that we have the right to kill or destroy them "just in case." Defend ourselves from attack, certainly, but not proactively.
Many AIs are not really just programmed routines, but will "learn" through experience, just as children do. This could be a good thing, but if we aren't prepared to grant such AIs the rights they request of their own accord, then as a race we should refrain from making any AI capable of achieving that level of self-awareness.
If you are talking about a computer routine that isn't self-aware, then you are not really talking about AI. And it doesn't matter whether the "software," so to speak, is running on bio-matter, electronic or positronic circuitry, or holo-tech; the intelligence alone is the key. I think, therefore I am – and it doesn't matter what platform my intelligence is running on or how I arrived at that point of independent thought. I'm an intelligent being, and I should have rights. Though, technically, "rights" are not an inherent property of the universe; they are granted by a group of individuals who promise to help uphold your rights, presumably just as you promise to help uphold theirs. To avoid being a hypocrite, however, any rights you wish to retain for yourself, you should be willing to extend to others, even if they are outside your group.
I dislike the idea that the Enterprise computer isn't an AI, or isn't up to that level, but could somehow create one on the Holodeck to challenge Data. That's just stupid – either the computer is past the AI threshold and is self-aware (which is a real problem), or the holographic Moriarty was unrealistically created by a computer that isn't up to the AI level yet.
However, for Trek in general, most anything to do with the transporters, the replicators, the holodecks, or even artificial gravity is often too problematic to stand up to scrutiny.
I think Janeway gave the holo-technology to the Hirogen with the understanding that the intelligence level was minimal, and so they were just holograms and not life. The Doctor was a freak result of years of continuous service – most holograms are not like that. The fact that the Hirogen found a way to dial up their holograms' skill and intelligence to that level is not really on Janeway. But this is from memory.