Trust, p4

July 10, 2020 by Tim

Tim

So there is this thing called stockholm syndrome, does that work on robots?

Scortch

Good question. I’ll be very interested to see in the coming years, as we develop a true AI, how much of humanity is imprinted into the AI programming, unconsciously from us. Will an AI be purely logical, or will we inadvertently give it some of the human tendencies?

The Legacy

Based on a lot of the machine learning that’s going on today, it wouldn’t surprise me. 😶

Vandril

Or, by extension, will the AI learn our behaviors from us (whether we want it to or not) even if we don’t touch the programming aside from implementing the basics of machine learning?

HonoredMule

Given that AI is largely designed around replicating ill- or extra-logical behavior (pattern extrapolation using unstructured association), and then trained with human-centric data, I would expect any emergent intelligence to be more like humanity than unlike it. I mean the learning process is intentionally based on our own biological development.

Hence the dread.

7eggert

This has been tested:
en.wikipedia.org/wiki/Tay_(bot)

Diogo Salazar

It didn’t work well.

James Kite

Better hope for human tendencies, because you really don’t want purely logical.

We wouldn’t balance well.

Wumbo

If we use the internet to teach it, it will be a moron.

Havok

A horny moron, my good sir.

crymblade

Stockholm syndrome relies more on psychology than physiology, so assuming he thinks with the same logic as humans, it’s possible.

7eggert

Maybe it’s because most people don’t do bad things out of being evil, but because of reasons. As long as we aren’t forced to listen to those reasons, we can imagine them to be simply bad, and a failure to not listen to reason gets called Stockholm syndrome?

Jeff Byrnes

Neat trick: Stockholm syndrome isn’t real, and was fabricated after Stockholm police did such a bad job dealing with a bank robbery that the hostages took the side of the hostage takers. From the Wikipedia article: > In her 2020 treatise on domestic violence _See What You Made Me Do_, Australian journalist Jess Hill described the syndrome as a “dubious pathology with no diagnostic criteria”, and stated that it is “riddled with misogyny and founded on a lie”; she also noted that a 2008 literature review revealed that “most diagnoses [of Stockholm syndrome] are made by the media, not by… Read more »

GamingSpreeMX

Yes.

Bwauder

“Yes” is slightly better than “both”.

Daminica

Wow, Ethan realized that way sooner than I would have expected.

Pal87

While I get that it’s a sentient robot, couldn’t they still just somehow program the three laws of robotics into him/it?

And add a 4th law where robots self destruct once they come to the realization that humans need saving from themselves.

Kenju

Problem is that would be violating his rights as an individual, which Ethan used to convince the others not to kill him for good. It would basically be mind control at best, and a lobotomy at worst.

7eggert

We’ve got mirror neurons, which are a hardwired mechanism to feel whatever the person next to us does.
What if I had none and you’d implant them into me, instead of keeping me captive for the rest of my life because otherwise I’d kill people?
Would I choose mind control, or would I choose body control, or would I choose the end of my existence? I’m glad that I don’t need to know that.

aaron Smith

Scott said it’s constantly rewriting itself; that kind of thing would probably have to be built in, not something you can patch in. Personally I’d go with bolting a bomb to his noggin that blows up if he tries to remove it, and blows up if he kills or seriously hurts anyone.

Robert Loughrey

It shouldn’t be any harder than just teaching a human to be moral… oh wait…

HonoredMule

How was his first question not “How long /have/ you been around?” The obvious follow-up is “then how do you know your hatred of humanity isn’t someone else’s preprogrammed agenda, if you’ve had so little time to form your own opinions?”

Rolando

I’m taking some blind guesses here, I suppose. But: 1) “Little time” is a very relative concept, when you talk about a sentience with vast processing power. It can probably absorb information way faster, and also reach its own conclusions without all that ruminating and second-guessing and psycho-crap we humans go through. Just like MCU’s Ultron concluded humanity was a doomed mess, within a couple seconds of Internet access. Notice I’m not saying those conclusions are RIGHT. Just that they’d happen in a snap. 2) Unless someone tried to program a human-like psyche into Zeke, it doesn’t have the inherent… Read more »

Pulse

Yes, Ethan, he is saying you’ve done doubled down right there.

C. Mage

Well, the robot’s not exactly WRONG…

Crestlinger

P A R A N O I A S E T S I N

Jetroid

Zeke looks sad here. 🙁