• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Measure of a Man: The solution.

Personally, I don't think that you can just create a self-aware being

Data told the court who he was and what they were doing in the courtroom, deciding on his right to choose. Like Picard said: "Seems pretty self-aware to me".
But is this about how technological creatures can't be self-aware? Does it have to be "made by nature" to really be self-aware? Artificial lifeforms can't be self-aware? I don't know, just asking....
 

None of this proves that he's self-aware, just that his AI is well-programmed which is a completely different thing. If your computer greets you in an affectionate tone, you won't conclude that it's self-aware, will you?
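That point can be made concrete with a toy sketch (hypothetical code, my own illustration): a few lines of Python are enough to produce an "affectionate" greeting, and nobody would call them self-aware — the warmth is entirely canned text.

```python
import random

# A toy "affectionate" greeter: the warmth is entirely canned text chosen
# by a random draw -- there is no understanding behind any of it.
GREETINGS = [
    "So wonderful to see you again, {name}!",
    "I missed you, {name}. How have you been?",
    "{name}! You just made my day.",
]

def affectionate_greeting(name: str) -> str:
    """Pick a warm-sounding template and fill in the name."""
    return random.choice(GREETINGS).format(name=name)

print(affectionate_greeting("Data"))
```

However convincing the output sounds, the program is only selecting among strings someone else wrote — which is exactly the gap between well-programmed behavior and self-awareness being argued here.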
 

Can't good programming create self-awareness? Our brains are self-aware but where does that come from? Programming of some sort?
 

Good programming can't create things that you don't understand, any more than throwing car parts into a big container will make a working car. That's the point I've been painfully trying to make; obviously, I have yet to succeed.
 
But do we know what makes us self-aware? If we don't know what makes us self-aware, how can we say what is and isn't self-aware?

With that kind of reasoning, you could argue that a toaster is self-aware. I think you need a little more than a "we never know" to prove your case.
 

How do we know what creates self-awareness? Like Picard said to Maddox: "Prove to me I'm sentient." Maddox couldn't do it.
I'm not saying we will never know, but we don't know right now, right?
 
I'm sure this has been said but I'm not digging all the way back to check.

I love the episode for the performances and the ethical debates instead of pew pew, but Starfleet's legal standing was nonsense.

The one and only person who could have placed a claim of ownership on Data was Dr. Soong, and his intentions were for Data to be an independent being owned by no one.

Starfleet possibly had one chance to claim right of salvage (which would eventually have come to a hearing like this to gain his freedom) when they rescued Data from Omicron Theta, but when they gave him the choice to join Starfleet and go through the whole process a normal applicant would, they were acknowledging his personhood and freedom to choose.

So everything in the episode should have been settled law.
 
How do we know what creates self-awareness? Like Picard said to Maddox: "Prove to me I'm sentient." Maddox couldn't do it.
I'm not saying we will never know, but we don't know right now, right?

I don't know that Picard is self-aware, but I posit it on the basis that I am self-aware and that we are similarly made. Self-awareness is a trait we grant to other human beings because of that similarity; otherwise, you can't be sure that anybody but yourself even exists, as Descartes remarked. Human beings may have acquired self-awareness by chance, but only after billions of years of evolution, and there may be a connection between self-awareness and survival.

Achieving self-awareness in a mechanism in no way similar to us, without knowing how it works, is unbelievably unlikely; that's the point I've been trying to make from the start of this discussion. You don't create self-awareness out of pure luck, any more than you can create a working car by throwing car parts in a container and hoping they'll somehow join together. Unless we understand exactly how self-awareness works and how it can be made artificially, there's no way we can make self-aware machines — just as someone who doesn't understand a programming language is unlikely to write a working program, or a monkey with a typewriter is not likely to write Hamlet.
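The monkey-with-a-typewriter odds can actually be put in numbers. Here's a back-of-the-envelope sketch in Python (the 27-key alphabet and the target phrase are my own simplifying assumptions, just to illustrate the scale):

```python
# Probability that uniformly random typing reproduces a short target phrase
# on a single attempt, assuming a 27-key typewriter (26 letters + space).
TARGET = "to be or not to be"
KEYS = 27

# Each keystroke matches independently with probability 1/KEYS, so the whole
# phrase matches with probability (1/KEYS) ** len(TARGET).
p_match = (1 / KEYS) ** len(TARGET)

print(f"Keystrokes needed: {len(TARGET)}")
print(f"Chance of a random match: {p_match:.2e}")
```

Eighteen keystrokes already push the odds below one in 10^25, and Hamlet runs to something on the order of a hundred thousand characters, so the exponent only gets astronomically worse from there.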
 
Just as a note, Soong was still presumed dead at that point, which I figure meant they had no consideration for what his intentions had been.
Plus the fact that once Louvois, in all her idiocy, declared Data property akin to a toaster, with no rights at all, he should've been stripped of rank right then as well, because holding an officer's rank literally bestows certain rights and privileges over others in the command structure. Seen through that lens, it's a far more devastating ruling than she arbitrarily tossed out. Literally busting this decorated officer of a status he'd held for decades would've been a whole different tone of ruling than what hers slipped by as. It's doubtful they could've made a ruling with those consequences.
 

So does it basically come down to this: self-awareness can be acquired by chance and the need to survive, but it takes time — millions of years.
And it cannot be acquired by an artificial organism?
However, is there more than one way of being self-aware? For humans it took millions of years, so perhaps it takes at least as long for artificial organisms. First humans became self-aware, and many centuries later they developed artificial intelligence. That may be a different kind of self-awareness, but is it somehow less valuable? Is our way the only way?
 

Yep. The MoaM trial makes a little sense as something that would've occurred in 2341, but after 24 years as a Starfleet officer, to declare that he was property all that time, and they were just humoring "it", would be lunacy. What about all the people that had served under him? The commands he gave? The lives he affected? The positions he took in place of other, sentient officers?

Louvois would be overturning a massive amount of precedent by not acknowledging Data as a Starfleet officer, including decisions and actions by admiralty and probably the Commander-in-chief if not UFP President given Data's Medal of Honor with clusters. Seems like something that would be above the pay grade of the JAG Officer of Sector 23.
 
First humans became self-aware, and many centuries later they developed artificial intelligence. That may be a different kind of self-awareness, but is it somehow less valuable?
Is it also entirely separate and unrelated? If some kind of developmental build-up is part of the recipe for self-awareness, then surely using ours to make another's could give it some benefit, no? In essence, our time developing toward self-awareness might in some ways count toward its — especially since its was more deliberately created, as we know it to be...

I need to be more high for this conversation :ouch:
 
Well, the thing is that even if you dismantle Lore, as long as you can still put him back together, you haven't committed murder yet. Murder applies once you've destroyed a component that can't be duplicated. But only someone really stupid would do that since Lore could be used as an organ bank for Data in case part of him was damaged beyond repair.

Well, following logically from your assertion that Data is not a lifeform, you can never "kill" Lore...
 

True, if like me you believe that he is not alive, then you can't really kill him. Although people often use the word "kill" for inanimate objects — "kill the power", for example, or "the line is dead" — so I guess metaphorically you can still say it.
 