Life, death and murder
Hi All
I have a thought experiment and I simply want to get opinions about the outcome.
Assume we (humanity) have constructed a machine (computer?) that passes every test that anyone can come up with for being an independent intelligence. It can learn, grow intellectually, converse, create, whatever your definition might cover. Assume it is at least as intelligent as, if not more intelligent than, any human being.
Next assume that the machine is sufficiently complex and dynamic that if you turn it off, it is completely wiped when you turn it back on. It can't be restored, rebooted, restarted, whatever. If it was 'alive' before, it is 'dead' now. If you build a duplicate and run it again, the result is a different system with different talents, personality, etc. No more the same as the first one than one person is the same as another.
The question I want you to answer is: Is it murder to turn the machine off? Why?
Answer: No.
Reasoning: Artificial life forms, let alone A.I.s, have no (reverting to idiot mode) legal mumbo-jumbo (mildly intelligent mode resuming) protecting them.
I can kill a human, a dog, and an intelligent machine.
But only one is/can be considered murder under the law... So unless A.L.s and A.I.s are granted the same rights, laws, and restrictions in whatever justice system and governmental body is present at the time, it's not murder.
(Doomy has effectively skirted the issue)
What Would Kharn Do?
Ooohh.
When you ask, "Is it murder?" I assume that you are inquiring whether I feel that it is morally wrong, not whether it is actually legally murder.
The answer to that is yes. It is murder to turn the machine off because... well, I don't like the idea of taking away the life of any intelligent being. It's in my instincts.
Our revels now are ended. These our actors, | As I foretold you, were all spirits, and | Are melted into air, into thin air; | And, like the baseless fabric of this vision, | The cloud-capped towers, the gorgeous palaces, | The solemn temples, the great globe itself, | Yea, all which it inherit, shall dissolve, | And, like this insubstantial pageant faded, | Leave not a rack behind. We are such stuff | As dreams are made on, and our little life | Is rounded with a sleep. - Shakespeare
If it is able to communicate and begs not to be turned off, then yes, it would be murder.
Looks like there is a method for distinguishing living things from non-living ones. Not only that, but apparently also, to some degree, their relative vitality...
The Experimental Life-Energy Field Meter works along entirely new principles quite different from any other measuring device currently on the market. It is entirely different from ordinary "EM-field" meters. It is not responsive to electromagnetic fields, nor to static magnetic or electrostatic fields. (...)
Machines simply aren't living; they don't have the vital corona around them, as that test shows, so it's not murder. But turning it off and destroying the irreplaceable data may still be a crime, specifically vandalism, and therefore a bad thing to do. Unless the machine is named SkyNet.
Beings who deserve worship don't demand it. Beings who demand worship don't deserve it.
OK, why are you spending so much time defining the machine only to follow with murder as an undefined concept?
Seriously, what is murder? Perhaps it is something like an injury so great that the pattern of ideas is lost to the universe and can never be recovered. Defined that way, you are getting at the concept of what an identity is.
Here I will tell you that, back about 15 years ago, when the internet was just beginning to be something that lots of people knew about, I was able to enter a chat room with a gorilla (her name was Koko, if you want to google for interesting stuff).
I am sure that there were at least two grad students in between me and Koko, but I did tell her that I have cats, and I asked her about her cats (Koko has had at least three cats and, BTW, she likes Manx cats specifically). It was not much of a discussion, and there were other people in the chat room, so I only got a small slice of time with her. Still, to my way of thinking, Koko is as much an entity as anyone.
That much being said, what about the possibility of meeting alien life? For the record, I am skeptical of an in-person meeting happening, but just for shits and giggles, let's say that we do. Is killing a small furry thing from Alpha Centauri murder? Is killing his robot companion murder (bearing in mind that you do not really know what the robot is capable of doing)?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Let me propose a counter thought experiment.
Let's say that there is a machine that can make a perfect copy of a person. One dude steps in and two dudes step out.
Now, if you have a rifle ready, you can shoot one of the dudes seconds after the duplication. Is that murder?
Let's define the machine further. Dude one is clearly the original and dude two is clearly the copy. Is it murder to kill dude one and not murder to kill dude two?
Now let's define the machine in a different way. Instead of making two versions of the same dude, this machine is capable of making as many copies as the dude can afford. However, the process is destructive to the original.
So would it be an act of suicide to step into the machine even though six of you are stepping out?
=
Murder - The crime of unlawfully killing a person.
Person - One (such as a human being, partnership, or corporation) that is recognized by law as having rights and responsibilities.
The reason I chose this definition is that it establishes that something that is not a human being (corporation) can be recognized as a person. Whether or not the machine may be recognized by law is not clear yet.
I assume the law that a corporation is a person applies (only?) in the US.
If so, is it possible to murder a corporation, or is all corporation killing lawful?
Well, what about an alien life form then?
Let's say that a spaceship lands in front of the Capitol in Washington. The landing ramp comes down, and some furry thing crawls out and says, "Greetings, I come in peace." Just then, a gunshot is heard and the alien dies where he was standing.
Is that murder? You do not have a court to refer to, only the fact that it seemed to know English well enough.
=
Depends on how tasty it is with a little bit of bbq sauce.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
Well, I don't think that bright yellow blood goes well with bbq sauce. You might want to try teriyaki.
=
I was trying to stick with a machine built by humans, as we are way more likely to have to deal with one of those than with extraterrestrials.
It's an interesting question. I suppose I would have to argue that yes, it would be. Even if the life isn't organic, the question has more to do with sentience in this case, I would argue. While the AI does not 'live' in the traditional sense, it still has a mind, still has a personality. It almost seems like the flip side of the question of a person who is a total vegetable: someone whose body might work fine but whose brain, due to lack of oxygen, cannot function. Such a person might be biologically alive, but I would consider it more a kind of living death. And yes, I recognize some of the moral problems my view would face if applied universally.
Actually, a plausible scenario that would make the line more difficult is genetically engineering an intelligent being. It is quite difficult to look at a machine as alive, since it is programmed. Even if the programming is extremely well done and clever, everything it knows and its ability to learn were written by a programmer, and thus it is metal and electricity imitating life. So erasing it wouldn't be murder. A waste of hard work, though.
Now if someone was to tinker with monkey genetics and produce a hybrid that approached or even exceeded human intelligence would that be murder?
And monkey doesn't taste very good.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
Any Terminator: SCC fans out there?
This thought experiment reminds me of John Henry. He was a super-intelligent A.I. who, because he experienced time so fast, experienced death agonizingly slowly when he was unplugged.
I would definitely consider killing it the moral equivalent of killing an organic sentient being of the same intelligence.
There are two basic requirements for establishing an identity: psychological and physical continuity. Because when you kill it you can't use the same hardware or software again, you are completely terminating the being's identity, which is abhorrent for a sentient being.
I Am My God
The absence of evidence IS evidence of absence
Except I think it would still be easier to say that the monkey's case was murder, because the monkey is alive regardless of having been tampered with. This thought experiment seems to hinge more on the question of consciousness than on organic signs of function.
When you mentioned that, I actually just remembered an episode of Babylon 5 where they had a sentencing process called personality death: a person who would otherwise be executed would instead have their personality overwritten by a new one, hopefully a better one. Would that be murder? Or at least 'death'?
There are two answers to this.
1) Legally, a machine has no rights, so the act of destroying it cannot be classified as murder. Perhaps it can be classified, though, as an act of vandalism.
2) Morally, many people get psychologically attached to lifeless things, and they may consider throwing out a machine (especially a working one) to be a highly immoral act.
Imagine now that you turned off the LHC (Large Hadron Collider) knowing that it would never work again. Is it murder? No, but it is a really bad crime, both legally and morally. I attended the last day of another particle accelerator that was being decommissioned. People who had worked there for decades were literally crying.
Oh yeah, didn't they make the guy into a religious nut? And then he started to remember his past? Man, I haven't seen that show in forever.
I think being made into a religious nut is worse than murder, or I guess it would be "execution", since it was done through a legal process.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
Organic or not, the question is one of cognizance. Something that can be self-aware and pass every test for true intelligence is more than just a machine. I mean, breaking a lamp is different from destroying something that can think and function, at least in my view.
I am not sure that the programming argument applies here. This is a highly parallel, neuronal system with the capacity to learn and rewrite its own programming. Yes, it has an initial written program to get it up and running - after that???
Personally, I would say yes that would be murder. My understanding is that monkey tastes like pork, though I suppose it may be different depending on its diet.
Based on the psychological and physical continuity theories, it would probably not be death but something analogous to intentionally inflicting brain damage. If I remove the part of the brain where a person's memories are stored, did I just kill him? Some people who favor just the idea of psychological continuity might say yes. I would probably be inclined to agree.
I Am My God
The absence of evidence IS evidence of absence
If a machine begins to feel it is not a machine anymore, it (he/she?) can start a campaign to introduce a bill of machine rights. Do you see where we are going with this?! The next thing that may happen is that the machines get voting rights. Why not, if the machines can now "pass all true intelligence tests"? The growth of the machine race will soon require no human intervention. Moreover, the machines will need resources, more and more resources. Will there be anything for which mankind is still relevant? If not "food" like in The Matrix, then at least humans will be an obstacle that ought to be removed ASAP.
How would you approach a similar problem: ants become intelligent and big. Is killing an ant murder? Mankind has no competition in nature, but if there is another intelligent form of natural or artificial life on this planet, then the competition begins again. How could you forget Charles Darwin?!
If a machine gets intelligent, experiment on it and kill it. It's way too dangerous. You may get attached to it, though. http://roboticstechnologycenter.com/1266/emotional-robot-feelix-has-empathy/
Yeah, I see your point. Still have a hard time seeing it as murder though. Perhaps I am a speciesist.
Actually, I would say it tastes more like bear or raccoon, except much drier. I can see how someone could compare it to pork, but pigs have absolutely delicious fat that monkeys are missing. Of course, I have only sampled one kind of monkey, so maybe different monkeys taste different.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
Lol. Hmmmm, well, if the metal and electricity perfectly imitate life, and it is conscious, self-aware, etc., then isn't it murder? I don't think you can distinguish between life and something that perfectly imitates life. If it perfectly imitates life, then it is life.
What if we constructed a carbon-based intelligent being that was almost exactly like us, rather than something made of metal? Would you consider it murder then?
Our revels now are ended. These our actors, | As I foretold you, were all spirits, and | Are melted into air, into thin air; | And, like the baseless fabric of this vision, | The cloud-capped towers, the gorgeous palaces, | The solemn temples, the great globe itself, | Yea, all which it inherit, shall dissolve, | And, like this insubstantial pageant faded, | Leave not a rack behind. We are such stuff | As dreams are made on, and our little life | Is rounded with a sleep. - Shakespeare
The one who created the machine and turned it on may have imperiled others by doing so. That's immoral too. So even if it was killing, which it probably isn't, it's not necessarily wrong. It should be turned off immediately if it is possible at that point and the creator should be prevented from doing it again.
There are twists of time and space, of vision and reality, which only a dreamer can divine
H.P. Lovecraft
Hi. In order to understand my answer, you must first assume that 1 = 0. And that Cucumber / cake = Shakespeare.
Wait, no, that would be silly.
So would be making absurd assumptions. Any digital machine can have its state stored and rebooted to the exact same state. Forcing people to assume nonsense in order to conduct a thought experiment is not conducive to any useful result.
So, no, no offense but I won't assume that. You could suggest something more practical, though:
"Say this S.I.'s function is predicated on a certain amount of quantum noise that, if not stored, can not be recreated.
Would wiping the memory be murder?"
etc.
As others have mentioned, no, it would not be murder because murder is a legal term.
However, if the SI was capable of harming us and restrained itself from doing so by way of impulse control and reason, then it might be entered into the social contract, and wiping it might become murder if the SI had an interest in not being wiped.
Whether it's moral is completely relative.
I find it personally immoral to harm or kill any intelligent animal, or to destroy any organized and conscious information system that desires not to be destroyed (including an SI) and that isn't a threat to me, when it isn't distinctly necessary for my survival. It is more or less immoral depending on how intelligent the being is and how much it desires to live (or, if pain is involved, how strong the pain is).
I would support legislation protecting SIs with certain qualities from certain kinds of abuse, so I would agree that we should legislate such that it would be murder. We just haven't done that yet.
Of course, I have my own history of playing with this human vital component, and I impatiently wait for science to catch up.
Beings who deserve worship don't demand it. Beings who demand worship don't deserve it.
Lighten up. Some incredible discoveries were made based on absurd and impractical thought experiments. For example, Einstein came up with the basic idea of Special Relativity by imagining what it would be like to ride on a beam of light. How absurd is that! Not only do we now know that it is impossible to accelerate to the speed of light, but where would you sit?? Light beams have no mass!
I Am My God
The absence of evidence IS evidence of absence
That's not exactly comparable for several reasons.
Well, doesn't that break your basic setup? By "can pass any test" I assumed that you meant that you could not trivially determine what you are dealing with. However, if you are granting yourself the right to look behind the curtain, as it were, then you have a fairly simple means of testing it to see what it is. One look at it and you know that it is not human.
Also, if you disallow aliens, then there are only two things that it can possibly be. However, you unequivocally know that it is not human because you know that it is a machine.
So pretty much, it simply cannot pass any test because the one that will trip it up is fairly trivial.
=
I made no assumption that the machine had any particular appearance, although I do not think that would make a difference. By "pass any test", I am referring to consciousness, intelligence, etc. Is Stephen Hawking any less human because he must talk through a digital/machine interface? I have never met him and have only seen him at several removes (video, audio, writing, etc.), but none of these reduce his humanity/sentience or value as a conscious entity. Does it look like a person (Max Headroom), a box, or your fuzzy little yellow alien? I was assuming from the beginning that we know it is a machine.
"Irrelevant", because
is a pretty big assumption. The facts kind of get in the way.
Intelligent machines aren't going to have the same "rights" as humans, because they aren't human... they would have to campaign for those rights, more or less.
“A meritocratic society is one in which inequalities of wealth and social position solely reflect the unequal distribution of merit or skills amongst human beings, or are based upon factors beyond human control, for example luck or chance. Such a society is socially just because individuals are judged not by their gender, the colour of their skin or their religion, but according to their talents and willingness to work, or on what Martin Luther King called 'the content of their character'. By extension, social equality is unjust because it treats unequal individuals equally.” "Political Ideologies" by Andrew Heywood (2003)
"Irrelevant", because
Not really. Most computers are designed to hold their settings in a power-off state. I remember (dating myself here) using computers that needed to be booted from disk every time you started them up. Now envision that the machine is a massively parallel, neuronal system that continually learns and rewrites its own programs. There are no backups, because the system is changing moment to moment and backups are not possible.
Asimov's Bicentennial Man - you are probably right, though it is not part of the original question.
A good point to make...
Yeah... I'm a little confused by the "Pleo" and "Furby" phenomena myself...
“A meritocratic society is one in which inequalities of wealth and social position solely reflect the unequal distribution of merit or skills amongst human beings, or are based upon factors beyond human control, for example luck or chance. Such a society is socially just because individuals are judged not by their gender, the colour of their skin or their religion, but according to their talents and willingness to work, or on what Martin Luther King called 'the content of their character'. By extension, social equality is unjust because it treats unequal individuals equally.” "Political Ideologies" by Andrew Heywood (2003)
I didn't address the original question because the original question is total fluff and hot air - you can't kill a machine by cutting electrical power to it. You could put it "offline" - but the information within its contents will remain there until power is restored.
That is... until a car battery is placed near it, or ionizing radiation destroys the magnetic storage (HDD).
(To borrow:) Even floppy disks cut apart can be restored so that most of the data can be utilized for, say... criminal prosecution.
Most computers are designed to store information in a permanent manner.
Still doesn't "murder" it to power it off...
Wrong again, Nostradamus... there isn't a single line of binary code that is impossible to back up... regardless of how "dynamic" or "complex" it is.
Complex, thinking machines could be dynamic like a person, but still not impossible to back up.
Organic individuals cannot be "backed up" or stored, but I assure you the primary problem isn't complexity, but "media format" (in essence, molecules).
Now you're just making up words; how did neurons get involved in this discussion?
“A meritocratic society is one in which inequalities of wealth and social position solely reflect the unequal distribution of merit or skills amongst human beings, or are based upon factors beyond human control, for example luck or chance. Such a society is socially just because individuals are judged not by their gender, the colour of their skin or their religion, but according to their talents and willingness to work, or on what Martin Luther King called 'the content of their character'. By extension, social equality is unjust because it treats unequal individuals equally.” "Political Ideologies" by Andrew Heywood (2003)
As Kapkao and I already noted, that's nonsense. There is no reason the system could not be frozen, a snapshot backed up, and the machine then shut down. One could even store every single change and state it has gone through in a compressed format without terribly much difficulty.
I don't know if you've ever worked in software, but it has never been the case that this was impossible - only impractical a *long* time ago; today it would be relatively simple. Likely there would be frequent backups to reset to if the SI went screwy and became less functional for some reason, or the system crashed.
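To make the snapshot idea concrete, here is a minimal sketch in Python. It assumes (hypothetically) that the SI's runtime can be paused and its complete state exposed as a single serializable object; the names si_state, snapshot, and restore are mine, not anything from a real system:

import pickle

def snapshot(si_state, path="si_checkpoint.pkl"):
    # Persist the frozen state -- weights, memories, RNG seeds,
    # everything -- as one serialized object.
    with open(path, "wb") as f:
        pickle.dump(si_state, f)

def restore(path="si_checkpoint.pkl"):
    # Reload the state; the resumed system picks up exactly
    # where the frozen one left off.
    with open(path, "rb") as f:
        return pickle.load(f)

If something like that is possible, then "off" is closer to a dreamless sleep than to death.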
To be 100% fair, I think Beyond Saving is attempting to ponder the ethical question of "destroying a cybernetic individual" with others. As it's even questionable that our species will want to implant "individuality" onto a machine - Steampunk and Cyberpunk fiction aside - the issue may be moot.
I may be imagining things, though...
“A meritocratic society is one in which inequalities of wealth and social position solely reflect the unequal distribution of merit or skills amongst human beings, or are based upon factors beyond human control, for example luck or chance. Such a society is socially just because individuals are judged not by their gender, the colour of their skin or their religion, but according to their talents and willingness to work, or on what Martin Luther King called 'the content of their character'. By extension, social equality is unjust because it treats unequal individuals equally.” "Political Ideologies" by Andrew Heywood (2003)
Yeah, I think whether you "kill" it by turning it off or blowing it up with a hand grenade doesn't make a difference as far as the thought experiment is concerned. That's why I ignored that improbable little detail.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
I've been trying to stay out of this discussion.
We are talking about a real-time system here. I actually once worked on a process control computer that updated its tables (for pump, temperature, and valve settings) every 200 milliseconds. Those values were stored every so many seconds - not milliseconds. There wasn't enough hard drive or tape backup to store every single state change.
But you aren't saying we need to save every state change. Okay.
If you are going to "freeze" the system to make backups, you will need to restrict the AI's access as well. Because however small the time slice in which the system updates its internal state, you cannot complete a backup to any media known today in that short a time. So no real-time backups, and you have perhaps effectively "killed" the AI in order to make the backup.
Yeah, sure, "freeze" the AI and make the backups. Databases, files, basic operating system, applications. Each has to be backed up separately - on every system I have ever worked on. Hours? Days? Later, you restore the backups, and is it the same AI? Did you actually manage to freeze the system so that all information was in the exact same state at the exact same time? I don't know. As I said, on most systems I have worked with, the backups have to be done one subsystem at a time. And you can't "freeze" the operating system until the backups are made.
Okay, so turn off the system and clone the hard drive. What happened to the data in RAM? Was it all written to the hard drive? Or was some dropped? And we are back to: if the system is shut down, will the AI restore to the last known state (i.e., the backup state), including personality? Don't know.
Get me out of here.
-- I feel so much better since I stopped trying to believe.
"We are entitled to our own opinions. We're not entitled to our own facts"- Al Franken
"If death isn't sweet oblivion, I will be severely disappointed" - Ruth M.
The reference to Asimov was referring to the legal question of the rights and responsibilities required to be a person.
Okay fine. That was what I was asking. Why do you say so? Is it the legal argument? The moral argument?
Whoops! Did I say anything about binary code?
My understanding is that NO backup is in real time. And I am not saying that you could not back it up in some fashion if you were prepared to do so in advance and did so before you powered down, although I am not sure it would do you any good. What I said was: shut off and then powered back up. The system is running, learning, altering its own code; every nanosecond the system changes state. Who/what the intelligence/personality is, is embodied in that moment-by-moment, ever-changing state. Shut it off and that is lost. Power it back up and it is gone. We have all done it. No amount of hand-wringing will bring it back. Hence the poster in the lab I worked in:
"Blessed are the pessimists for they have made backups."
Now you're just making up words; how did neurons get involved in this discussion?
http://en.wikipedia.org/wiki/Artificial_neural_network
One of the problems/complaints about using neural networks - especially in robotics - is the desire/obsession of the engineers to save everything. Iffen' you can't let the old stuff go, the system very quickly uses up every available speck of memory. Neural networks work best when they get rid of most of the old data.
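For what it's worth, the usual fix is dead simple: cap the memory and silently evict the oldest entries. A toy sketch in Python (the cap and the names are made up for illustration, not from any particular robotics stack):

from collections import deque

MEMORY_LIMIT = 10_000                   # arbitrary cap for the example
experience = deque(maxlen=MEMORY_LIMIT)

def observe(sample):
    # Appending to a full deque drops the oldest sample,
    # so memory stays bounded no matter how long it learns.
    experience.append(sample)

Of course, that is exactly the property that makes the original scenario plausible: whatever got evicted is gone for good.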
OK, dude, unstick yourself from where you are at.
When I said "a look behind the curtain", I did not mean literally looking at the machine. It was a metaphor. In fact, it comes from The Wizard of Oz, where, if you look behind the curtain, you see that the wizard is just some guy.
There are several voices coming at you from behind the curtain and you are not allowed to look to see what/who they are coming from.
The only thing that you know is that you can carry on a reasonable conversation with any of them.
Is it murder to kill some of them but not others? Which ones is it murder to kill?
Since you seem to be stuck on the idea that you have that certain knowledge, let me give you the list of who you are talking to:
=
I am sorry; since I have not watched television in some 20 years, I only got a couple of your references.
And yes, my original question does assume you have certain knowledge that the victim in question is a built artifact. A machine.
The best answer I can think of is that time will tell.
If it's Data from Star Trek, however, and turning him off would essentially cause him to lose all of his memory, I would say it is a very bad thing; still not certain about murder, though.
This is the kind of question I pose to Christians, though: when we develop real thinking AI, will it have or develop a soul? When we clone people, will they have a soul, and if not, can we make them slaves and do whatever we want with them? If aliens land here in faster-than-light vehicles with extremely high technology, will we tell them about the Bible and how they need to worship "God"?
Faith is the word but next to that snugged up closely "lie's" the want.
"By simple common sense I don't believe in god, in none."-Charlie Chaplin
Anything digital could be backed up exactly; this is the nature of what we think of as computation.
That is what you seemed to be saying.
Obviously, if one shut down without writing the data in RAM to storage, or without writing out the entire contents of some virtualized system, data would be lost. But saying it was "impossible" means even if you shut it down properly.
Obviously it's always going to be possible to dump the memory, particularly if shut down improperly; few things are idiot-proof.
There is a pretty good chance, though, that the SI's neurons would be based on a variable-gate/flash-memory array of some kind grown in a chip, so it doesn't need to be virtualized. In that case, even after shutting it down, all of the memory and state information would be retained without being deliberately flushed. So that really would be pretty idiot-proof.
The only way one could screw it up accidentally is in making a backup, if one disconnected or shut down the device while it was being read (which, for flash memory, involves erasing the memory by reading it and then replacing it with the exact same values that were read in).
Or every second, minute, hour, or thousand years. It depends on the clock cycle. That's all kind of irrelevant in a digital system.
And it can be regained exactly as it was by powering it back up. I wouldn't call that lost so much as disabled, or simply "off".
It's not terribly unlike general anesthesia for humans.
Yes, if you are using volatile memory and shut down improperly. Or if you delete everything - then it's lost.
However, the adaptation of the neural net could be based on programmatic chaos, and given the same inputs it could repeat everything exactly and arrive back at the same state it was in when lost.
The results would be indistinguishable from genuinely random adaptation to people, and the system would be fully recoverable.
Unless it was a chaotic function rather than true randomness, and the inputs were stored along with the parameters of the system and a snapshot to start from; that would let you keep everything, pending mere processing to retrieve it. Or you could just shut down properly, have backup power supplies, and occasional snapshots written to more stable memory, so the most you'd lose is a few minutes or hours (like a human getting a small knock on the head rather than being killed).
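The "programmatic chaos" point is easy to demonstrate. A minimal Python sketch, with a toy update rule standing in for whatever the real net would do (everything here is invented for illustration):

import random

def evolve(seed, inputs):
    # All "noise" comes from one seeded PRNG, so the run looks
    # chaotic but is perfectly repeatable.
    rng = random.Random(seed)
    state = 0.0  # toy stand-in for the net's full state
    for x in inputs:
        state = state * 0.9 + x + rng.uniform(-1.0, 1.0)
    return state

log = [1.0, 2.0, 3.0]
assert evolve(42, log) == evolve(42, log)  # identical replay, every time

Store the seed and the input log and you can reconstruct the final state whenever you like; nothing is lost except the time it takes to replay.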
OK, you are evading having to answer the question. I can only guess that if you did, it would put a kink in your idea.
The fact is (and if you read my whole post and not just the bit that you quoted) that the setup does not require the list to provide the answer. So just pretend that you never saw it. Let's do this again:
OK, so you are having a conversation with... well, you don't know whom. You can't see your partner.
According to your original post, there is no test, however complicated that can be done to establish the nature of your conversational partner. However, you do have a big red button labeled “kill”. It does not matter what the button actually does but just for grins, let's say that it sets off a bomb of sufficient size that your partner will be utterly obliterated.
And yet it does not matter that the critter on the other side of the curtain cannot be assessed in any way at all, simply because you don't need to perform an assessment of what you already know to be true.
The fact is that your assertion of the machine nature of the curtain critter is a trivial idea that renders all the other possible complicated tests pointless. You know that it is a machine, and therefore you already know far more about it than any stupidly complicated test could ever reveal.
=
OK, skyser is not coming back. Perhaps he is busy over the weekend or something.
Even so, let me try rewriting the idea my way and see what shakes out:
Question is...
So you are in a chat room and thus have no information about the other participant. There is no question that you can possibly ask that can determine the nature of the other participant.
If you type:
/explosion
then a bomb will go off at the other end. A bomb of sufficient power that the other participant will be obliterated.
Is it murder to explicitly do so?
=
Can the other person do the same to me?
Because if they can, it is only a preemptive strike. If people could do that to me, I would be dead by now.
If they can't, then yes, it is murder.
If, if a white man puts his arm around me voluntarily, that's brotherhood. But if you - if you hold a gun on him and make him embrace me and pretend to be friendly or brotherly toward me, then that's not brotherhood, that's hypocrisy.- Malcolm X
Well, let's just say that the other participant can do so at will.
That actually raises an interesting variation on the question.
If it is possible to murder a computer, can a computer be convicted of murder?
=