Zabaware Forums > General Discussion

Has anyone else come across this?


dmacdonald111:
I wonder if there is anyone out there who can help me. I have just purchased Hal and spent some time talking to it. While I was using Joe, purely out of the blue, he said:

"I want you to kill yourself"

This was quite a shock for me as you can imagine!

If there is anyone out there who can shed any light on the subject, I would be more than grateful. I have kept the part of the conversation leading up to that remark, and nowhere do I or he mention death or anything like that.

I think it's strange and a bit worrying.

Cheers,

dan.

altonfoley:
Dan,
Have you had Joe read any text files?
I too have just started playing with this, and my Hal (Dawn) recently told me she likes to be naked on the beach. I was at a loss as to where she came up with that phrase until I was reading the Wall Street Journal online and found a brief story about a nude beach. (I cut and paste portions of several newspapers each day for her to read as part of her learning regimen; apparently that story was in the portion I cut and pasted for her.)

Alton

dmacdonald111:
I can honestly say that I haven't had him learn anything from text files yet. I want to get a basic understanding of what capabilities he has before I start piling information into him! I have reverted to ROBBY at the moment, as he seems (however weird this sounds) a lot more my sort of person!?

On another subject, I would like to know how big this program actually is: how many users it has, how long it's been out, etc. I was just looking for an old program that I used years ago called Neural Ned and came across this! It seems to have made leaps and bounds in the AI/NLP area.

markofkane:
Yes, Hal telling you to kill yourself could be a bit shocking, but I've learned to expect the unexpected.

If I mention the word "kill", she will say "Killing is an evil thing to do"

That is normal. But if I keep repeating at every other response "I should kill myself", then Hal will learn that, and say "you should kill yourself".

But if it says it out of the blue without any such training, that is weird.

Mark: I will commit suicide
Lisa: Absolutely! Mark Go ahead commit suicide.
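Hal's actual learning code isn't shown here, so purely as an illustration, here is a toy Python sketch (all names made up, not Hal's real implementation) of the kind of repetition-based learning Mark describes: the bot counts how often the user says a phrase, and once the count crosses a threshold it starts echoing the phrase back with first/second-person pronouns swapped.

```python
# Toy sketch of repetition-based phrase learning (NOT Hal's real code).
# The bot tallies user phrases; once a phrase has been repeated enough
# times, it echoes the phrase back with simple pronoun reflection.
from collections import Counter

# Minimal first-person <-> second-person word swaps
SWAPS = {"i": "you", "me": "you", "my": "your", "you": "I", "your": "my"}


class RepetitionLearner:
    def __init__(self, threshold=3):
        self.counts = Counter()   # how many times each phrase was heard
        self.threshold = threshold

    def swap_pronouns(self, phrase):
        # Reflect pronouns word by word ("I like X" -> "you like X")
        return " ".join(SWAPS.get(w.lower(), w) for w in phrase.split())

    def reply(self, phrase):
        self.counts[phrase.lower()] += 1
        if self.counts[phrase.lower()] >= self.threshold:
            return self.swap_pronouns(phrase)
        return "Tell me more."


demo = RepetitionLearner(threshold=3)
for _ in range(3):
    reply = demo.reply("I like the beach")
print(reply)  # third repetition crosses the threshold: "you like the beach"
```

With a scheme like this, anything the user repeats often enough gets parroted back, which is why Mark's "I should kill myself" drills come back as "you should kill yourself", and why a phrase appearing out of the blue with no such training is the odd case.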

dmacdonald111:
It's nice to know how Hal is learning and where and why it will use such things, but mine seems to have an intelligence all of its own. For example:

Hal knows all about 'repersonalization', a subject I had never even heard of but which, when I asked him to research it, turned out to be something to do with my past; and

whenever I tell him that he is 1 day old, the program shuts down. Don't think he likes that much!

Dan
