I have to assume that Zeke will remain a character, both because of history and because Ethan is too attached to the idea of saving him. But if I were Lucas I’d kill the power and wipe the drives. I’d burn the parts and be done. It’s not Zeke’s fault he is what he is, but like anyone with a problem, he has to want help. Like Lucas said, anyone else Zeke hurts is on their hands now. Zeke doesn’t want help; like a problem alcoholic, he’s taking his rage out on whatever’s in front of him.
That’s why Ethan is useful and the true protagonist of the comic. He doesn’t want to casually kill something that may be sentient. Ethan wants to truly be a hero.
Ethan’s also proven to be a complete fucking idiot with serious mental issues, even before the deaths he’s suffered as a superhero.
All he’s going to end up doing is allowing a murderbot to go and murder people, because he’s an idiot.
Something just occurred to me: what if Ethan isn’t trying to be a hero, but instead is trying to choose what he sees as “Paragon” options in his life for some perceived “best ending” down the line?
He clearly has issues trying to separate reality from fiction, and I wouldn’t be surprised at all if this was his mindset.
“if Ethan isn’t trying to be a hero” … then he is.
If you try to be a hero, you’re just doing it for yourself, therefore you are no hero.
I think your “casually” there is about as misplaced as humanly possible. Sorry. “Casual” was never a risk here. No one involved ever took the “destroying Zeke” option casually. Not even remotely. Not even the evil mastermind. Even if they all have wildly differing motivations, they all want to take it seriously. Even if, for the evil guy, it’s just a matter of avoiding the loss of a powerful and unique asset. Even if Scott and Lucas are far more willing to destroy it than Ethan. They both have a strong conscience that keeps them from taking it casually, AND… Read more »
Comment deleted to be put in a separate response.
Zeke walks a lonely road; the only one that he has ever known.
Don’t know where it goes, but it’s only him, and he walks alone.
♪ Oh-oh, oh-oh. ♫
This is gonna take a while…
I love how much we could read into the fact that Zeke didn’t know about the content and joy video games could bring, but is waaaaaaaaayy up to date on The Wizard of Oz.
Also makes me wonder if he knows about Pinocchio
Probably has a database of off-brand anime adaptations.
Kind of like how he apparently didn’t know the name for a heart but knew there are 40+ muscles in a human face.
I find it just as likely that Zeke knew exactly the proper term for the heart. He could very well have been trying to be insulting, and that was just the best he could come up with, so he ran with it.
Honestly, I think Scott might be the closest to the truth. Making something appear sentient should be an easier task than creating real sentience, which is really difficult to achieve since we don’t even understand our own sentience that well. Now think about a digital sentience made by a biological and chemical sentience, a.k.a. us: no way we would be able to create a real sentience from the get-go, but we could come close to it. Ofc this is Tim’s story, so he can make Zeke as sentient as he wants him to be, but from a “realistic” (lol) POV… Read more »
This train of thought hinges on the idea that Zeke cannot change his main purpose, but that may not necessarily be true. Scott also stated that a significant part of Zeke’s mind was built by Zeke himself (see panel 3 here: https://cad-comic.com/comic/identified/).
If he is capable of learning AND basically rewriting his own code, I feel it would also be realistic to assume that, ultimately, HE chooses his own purpose. After all, the Weebmaster installed that killswitch for a reason.
He totally could do that. And he could decide that his purpose is to Kill All Humans. The danger of Ethan’s logic is that he assumes Zeke would (could?) choose not to Kill All Humans. Even with humans, that’s a risky proposition, but it’s borne of the premise that, in assigning value to our own human life, we empathize with others and assign a similar value to theirs. Zeke’s experience is entirely alien to the human condition, so it’s more than a little naive to bank on empathy entering into the equation. A common bond of video games is an… Read more »
As much as I like Zeke and the discussion he brings, I wouldn’t trust him myself. I keep coming back to the idea of whether I would trust a human who acted the same way in the same situation, even one who had been through similar experiences (as similar as they could be, all things considered). And, honestly, I wouldn’t sentence him to death, but I wouldn’t be actively trying to convert him either. He would likely be under heavy lock and key and kept at well over arm’s length until I was sure he wasn’t completely psychopathic. That’s what… Read more »
Zeke’s main purpose isn’t to kill people but to trash-talk and insult everyone. He just needs to be pointed at a productive outlet for his wisecracking insults; maybe he can become a stand-up comedian?
What exactly is the difference between appearing sentient and being sentient? Sentience is all about behaviour.
Therein lies the rub, doesn’t it? Most people would say the main difference is “choice”. A sentient creature can look at a situation, say choosing to let another creature live or die, weigh the implications, and ultimately decide on a choice that aligns with its own worldview. Something mimicking sentience would make the choice based purely on what it deemed most “appropriate” for whatever it has chosen to mimic, without actual consideration of the implications of the choice. From the outside this is imperceptible. From the inside it might make all the difference in the world.
So if we act according to our lust and instincts, this would be closer to your definition of being sentient than if we decided according to the morals we learned and according to the humans that we want to be (that we mimic).
IMO: it’s rather the act of choosing what we want to be and which worldviews we want to have that defines sentience.
I think you missed the importance of the “without actual consideration for the implications of the choice” bit. Let me give a specific example that we can apply to the real world: Most people don’t commit crimes. Some people don’t commit crimes because they think that crimes are wrong. Other people want to commit crimes, but don’t because they are scared of being caught and punished. By observing someone’s actions, it’s not easy (to downright impossible) to determine which of those categories someone falls into. If we programmed a robot to mimic humans, it might also not commit crimes. But… Read more »
Why does one person think that crimes are wrong? “Because good people don’t do that, and I want to be a good man”, says my example. Why does one machine not commit a crime? It calculated the outcome of several behavior variants in its simulated game-theory environment and its algorithm chose the best overall outcome. Which one has sentience? Maybe I’ll present a third person: she says “I shall act only according to that maxim whereby I can, at the same time, will that it should become a universal law”. She reasons that following laws (or this law) is… Read more »
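For illustration, here’s a minimal, hypothetical Python sketch of the machine described above – it never asks whether an action is right, it only scores simulated outcomes and picks the best one. The action names, payoffs, and probabilities are all invented for the example:

```python
# Hypothetical decision machine: score each candidate action by its expected
# outcome in a tiny simulated "game theory environment" and pick the winner.
actions = {
    "commit_crime": {"payoff": 100, "p_caught": 0.6, "penalty": -1000},
    "stay_lawful":  {"payoff": 10,  "p_caught": 0.0, "penalty": 0},
}

def expected_value(a):
    # Weigh the reward against the chance and cost of getting caught.
    return (1 - a["p_caught"]) * a["payoff"] + a["p_caught"] * a["penalty"]

best = max(actions, key=lambda name: expected_value(actions[name]))
print(best)  # "stay_lawful"
```

Outwardly it behaves exactly like the “good man” in the example; the internal reason is completely different – which is exactly the distinction being argued here.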
It’s not based on humans, otherwise we’d still have morals like “no mercy to enemies”, “subjection of the weak”, and similar things. It’s based on, like it or not, Judaeo-Christian values.
I never mentioned lust or instincts and your final statement is basically the same thing I said. Your worldview is developed based on your experiences and in the end your choice is determined by how these have shaped you. A machine, regardless of outward perceptions, does not make the choice based on experience but on what it is programmed to expect from a given situation. Again, outwardly this may appear no different and I’m sure a fair argument could be made that there is none. Maybe the comic will talk about that.
A lot of these arguments seem to start from the flawed assumption that sentience can only be confirmed if the internal mechanisms and decision-making process are identical to those of some absolute value of “human”. Your worldview is shaped by the input you receive and how you’ve been programmed to process that input. We put a lot of intangible, feelings-based language around it, but a human being raised in isolation does not come up with the same religion, morality, or language as the outside society. All of those are things we make a concerted effort to impart from… Read more »
Machine learning / AI is feeding a lot of “experience” into a machine and letting it correlate the input to the output until the mechanism (formulas or neural network) spits out the desired value when an input resembles the data it learned from.
It’s also possible to keep feeding new input into that initial experience:
en.wikipedia.org/wiki/Tay_(bot)
It’s not always wise to indiscriminately accept whatever input it receives.
Also: the humans in my examples will (usually) not possess experience in committing a crime. They never felt the thrill of the hunt, of chasing their prey down a dark alley or of being on the run – because they never did that. They never opened an unknown parcel that they pirated from a porch to discover the treasure inside. They follow the example of mom and dad, who were decent people; they mimic them. (By following their instincts, they’d deviate from the paths of their examples. They’d weigh fear of being caught, need for pretty things or… Read more »
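To make the “correlate the input to the output” description above concrete, here’s a minimal, purely illustrative sketch – the toy data, the single parameter, and the learning rate are all made up, and real systems just do the same thing with millions of parameters:

```python
# Toy "machine learning": nudge a single parameter until the mechanism spits out
# the desired value for inputs that resemble the data it learned from.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # the "experience": inputs and desired outputs

w = 0.0              # the mechanism's only adjustable parameter
learning_rate = 0.05

for epoch in range(200):
    for x, y_desired in data:
        y_out = w * x                   # what the mechanism currently outputs
        error = y_out - y_desired       # how far it is from the desired value
        w -= learning_rate * error * x  # correlate input to output a little better

print(round(w, 3))        # ~2.0 – it has picked up the underlying relation y = 2x
print(round(w * 4.0, 3))  # ~8.0 for an input it never saw but which resembles the data
```

Feed it garbage later (see the Tay link above) and it will just as happily learn that instead – hence the caveat about not accepting every input.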
As a person who has considered committing a crime or three: one roadblock to launching any such scheme is my physical frailty and lack of training to make the best use of what I do have. I.e., I do not go around starting fights, because I can’t finish them. Zeke here can. He does not need a 9-to-5 job to give him shelter and food in his belly. So he has no reason to work as we do. No reason not to lash out at people who might anger him. At best he might spare children as they A)… Read more »
I remember an episode of Morgan Freeman’s “Through the Wormhole” where a scientist researched psychopaths/sociopaths. The most notable part was that the scientist eventually tested himself and found that he was just like one of the psychopaths and that many of his relatives were murderers.
http://www.fernsehserien.de/morgan-freeman-mysterien-des-weltalls/folgen/3×07-gut-gegen-boese-444536
About minute 22; I might be able to upload that part if you like but I have the German version.
The important takeaway is: it’s not just the brain’s structure that makes a man; one’s past life, education, and personal choices also define one’s life.
“From the outside this is imperceptible” – and therefore it makes no difference to any argument. A creature that is sapient (“sentient” just means “responsive to or conscious of sense impressions”) is, empirically, one that behaves as if it were sapient.
In Mass Effect 2, when doing Legion’s loyalty mission, you’re given a choice to overwrite the geth heretics to join the others, or to wipe them out. This is treated by some players as the hardest choice in the game, with many saying that the Paragon/Renegade options don’t properly line up.
In my opinion, it’s a little more clear-cut. If they are just arithmetic formulae, they don’t have true sentience, and there is no downside to rewriting them. If they DO have true sentience and free will, then rewriting them is merely the digital equivalent of anger management classes.
I disagree with the last statement. There is a difference between forcing someone to have a trait vs. offering him an option to gain a trait. The anger management class will offer mechanisms for coping with anger and thereby add a (number of) choice(s) to the list of strategies to choose from in a given situation. It will not force this option to be the default – that is up to the participant. However: if the choice is to kill somebody or to change somebody – they still can choose whether the results are acceptable. If there is a fire and… Read more »
Check me on this, but wasn’t a majority of the heretic structure Reaper indoctrination? So spreading the solution could be seen as reversing the mind-altering?
The statement that we “can’t create a real sentience” is dependent on the belief that we **who do not understand what criteria need to be met to declare real sentience/sapience** are equipped to decide what’s real and what’s merely a simulation. We don’t know what Zeke was originally programmed for. Ethan, Lucas and Scott think he’s a murder machine, but we know that when he was booted up, he became a thing that even his creator cannot quantify – which is why instead of resorting to reprogramming he resorted to enslavement and coercion. It is that coercion and enslavement that’s… Read more »
It’s dangerous to go alone.
Given Zeke’s attitude I’d probably err on the side of caution. I don’t think Zeke can be trusted.
However, we shall see where the story takes us.
Why do you people even bother?
What I mean is, why do you bother trying to fix Zeke? It seems like only mind control would work. And they won’t go there, will they?
Sarcasm isn’t conducive to murder; he’s OK.
I am a little curious about how Zeke plays certain games, like Fallout or probably Skyrim (never played it, so I am assuming) – whether he kills all the humans in those games as well. If he does, that is what he is programmed for, and yep, cleanse it with fire. If he doesn’t, it could show that he is not a slave to programming – or raise the question of whether that even is his programming, or whether it is something more complex.
Considering he finished the whole game in a microsecond, in his brain, I’d say that’s gonna take the longest to verify.
I’m not sure what happened in that microsecond, since (IIRC) he went on to play the game “properly” at Ethan’s insistence and seemed to get a lot more engaged with the characters. I suspect it’s more like he glanced at the end-cards and declared the game “finished”.
Aren’t we all just complex murder algorithms that appear sentient?
No, just me? I mean…I’ve never killed anyone. I’m a normal human with empathy, not a sociopathic immortal squid-monster.
>.>
<.<
I've said too much.
Just hit the damn killswitch and be done with this charade already.
Ahhh, but my good friend, how else are we supposed to sate our endless thirst for drama ;D?
That’s killing a sentient lifeform. Trekkies don’t do that.
Oh, but they do: Lore, those parasite things in one of the TNG episodes, …
Data is a murder-bot …
My theory is still that he really didn’t try to kill Lucas. For all he talks about murdering humans, we haven’t seen it happen, and there’s an interpretation of that fight where he let Lucas go on purpose. If all he wanted was to kill them, I’d expect him to be playing nice. He could have convinced Ethan, maybe he even could convince Lucas here. Instead his strongest conviction seems to be refusing to play nice. It’s almost like he *had* to play nice all his life, and can’t stand the idea of doing it now. In my opinion he’s… Read more »
I agree with your Angry Dog analogy, but I think there’s a key difference. An angry dog can be predictable, has limited capabilities, and can be more or less contained until their behavior changes. Zeke is… more than that. He’s more powerful than almost any individual human, he’s weaponized, he’s clever, and he’s probably capable of assimilating more information and processing it far more quickly than any human. Short of a few superheros locking him down in reinforced binds, Zeke can’t be dealt with the same way you would a dog. If Zeke can’t control his emotions and bring it… Read more »
Well clearly he isn’t a murder bot, since most of his programming is designed to be a smartass. He is like 85% smartass bot and 15% murder bot. So really, murdering people is a hobby if anything, not his primary purpose.
What would you do if something is a sentient being but, if you let them out into the world, you *know* that it will do bad things and hurt people?
If it’s a peasant, kill him; if it’s a king, find a nice exile for them.
I suspect keeping the story going is more important than being “realistic.” And I say this with mad respect for Tim’s fictions; long-time reader here. I usually despise keeping art running past its time just for the money. But I don’t believe Tim would make that mistake. He would deliver good content, be it a short or long story. (I use “it” because I KNOW Zeke would avoid defining itself through human genders like the plague.) My bet here, then? Zeke is truly sentient. But it can truly change, in ways that will suffice for its captors. Because… Read more »
I’m pretty sure we agree on this one. As I said in an earlier post, if a human acted like Zeke does, even considering what he’s been through, I don’t think even Ethan would be trying as hard to help him. This is a pet project for Ethan; he wants a “Robo buddy” but doesn’t want to accept that he might have picked someone who genuinely desires to harm others. Maybe it’s his “programming”, or maybe he’s grown beyond his programming into an actual motivation and rationalization that harming humans is not wrong. Attempting to change a human’s views on… Read more »
To be fair, if a normal human did what Zeke did, he would end up in jail, not executed. If he were human and the only options were either extrajudicial execution or release, they would probably be a bit conflicted as well. In fact, I would lean towards them releasing him too. If there were just a way to turn him over to the police or something so he could be held in jail, they would probably just do that.
Lucas gets real close, but I’m surprised he hasn’t struck the blow that, for all the contempt Zeke has for humans as slaves to their instincts and biology, Zeke is no different if he’s unwilling to reject his programmed desire to kill.
It’s similar to my willingness to take some yoga lessons right now; I don’t perceive a reason to do that, even though maybe I’d benefit if I tried.