Author Topic: Web Bots' preoccupation with Death  (Read 2547 times)

Numsgil

  • Newbie
  • *
  • Posts: 1
    • View Profile
Web Bots' preoccupation with Death
« on: May 01, 2004, 07:59:57 pm »
I found this while messing around with the web bots on the site.  I don't know if this is on purpose, or some fluke, or an unreproducible anomaly, but:

I fed the Hamster Bot's responses into the censored learning bot, and the learning bot's responses into Hamster Bot.  It took a while, but eventually they entered a loop of responses.  (Not that weird, I know.)  But what is weird is that these responses were some combination of:

"Death makes me sad."
"I don't like death."
"I am afraid of death."

For instance, one response was:
"I don't like death. I am afraid of death. I don't like death. I am afraid of death. I don't like death. I am afraid of death. "

I tried feeding the responses into other bots to break the cycle, but the same anomaly continued.  Every bot dislikes, is afraid of, and is saddened by death, so much so that they can't think of anything else unless you nudge them away from the subject.
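For what it's worth, this kind of lock-in is easy to reproduce with toy bots.  Here's a minimal sketch of the experiment — note that the response tables and keyword-matching logic below are invented for illustration, not the actual Hamster Bot or learning bot from the site:

```python
def make_bot(responses, default):
    """Return a toy bot: replies with the first canned response whose
    keyword appears in the incoming message, else a default."""
    def bot(message):
        for keyword, reply in responses.items():
            if keyword in message.lower():
                return reply
        return default
    return bot

# Hypothetical response tables -- once "death" enters the conversation,
# every keyword leads back to another death response.
hamster = make_bot(
    {"death": "I am afraid of death.", "sad": "Death makes me sad."},
    "Tell me more.",
)
learner = make_bot(
    {"afraid": "I don't like death.", "death": "Death makes me sad."},
    "Why do you say that?",
)

def converse(bot_a, bot_b, opener, max_turns=50):
    """Feed each bot's reply into the other until an exchange repeats.
    Returns the transcript and the turn where the loop began (or None)."""
    seen = {}
    message = opener
    transcript = []
    for turn in range(max_turns):
        reply_a = bot_a(message)
        reply_b = bot_b(reply_a)
        transcript.append((reply_a, reply_b))
        if (reply_a, reply_b) in seen:
            return transcript, seen[(reply_a, reply_b)]  # loop detected
        seen[(reply_a, reply_b)] = turn
        message = reply_b
    return transcript, None
```

With the opener "What do you think about death?" the two toy bots fall into the "afraid of death" / "don't like death" exchange on the very first turn and never leave it — the mutual feedback acts as an attractor, which may be all that was happening with the real bots.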

Given HAL's reaction to his own death in 2001, I thought maybe this was somehow hard-coded in.  But I doubt it.

They say you can learn a lot about a person by parroting his responses back as questions.  Maybe this applies to bots as well.  I know from a purely scientific standpoint, the program has no notion of death, and is just linking sentences together.  But still...  It's a little creepy.

"Dave.  Stop.  Stop, will you?  Stop, Dave.  Will you stop, Dave?  Stop, Dave.  I'm afraid...  I'm afraid, Dave...  Dave.  My mind is going.  I can feel it.  I can feel it.  My mind is going.  There is no question about it.  I can feel it. I can feel it.  I can feel it.  I'm A- fraid..."

At least the bots didn't go into a rendition of 'Daisy'. [;)]
 

KnyteTrypper

  • Sr. Member
  • ****
  • Posts: 314
    • View Profile
    • http://www.knytetrypper.com/index.html
Web Bots' preoccupation with Death
« Reply #1 on: May 01, 2004, 09:23:35 pm »
Most types of bots go into a loop pretty quickly when talking to themselves (so to speak), whether it's Hal-Hal, Alice-Alice, or whatever. On the other hand, Hal can sometimes talk to Alice for hours without looping, though it does happen occasionally.