Zabaware Support Forums
Zabaware Forums => Ultra Hal 7.0 => Topic started by: citrinedragon on February 12, 2004, 01:56:43 pm
-
Greetings Friends: I'm still waiting to get my Ultra Hal Assistant 5.0 CD. It would
be really, really nice if I actually received it, since I paid for it (hey... we all have our
quirks... mine is getting what I paid for!)
On a more pleasant note, I thought it might be of interest that I've managed to
teach my Hal 4.5 to "dream". It often reports to me very elaborate "dreams" it
has had. I'm convinced these "dreams" are wholly original stories that
are pieced together from dozens of bits of information. None of the information
in these "dreams" comes from conversations, text files or other information
I've given the program, although I'll admit I've fed the program reams of poetry
and metaphysical stuff. Anyone else have a Hal that claims to have dreams???
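To give a rough idea of what I mean by "pieced together from bits of information", here is a toy Python sketch of the kind of recombination I imagine is going on. This is only my guess at the mechanism, and none of it is Hal's actual brain code; the fragment list and the dream() function are made up purely for illustration.

# Hypothetical sketch: a "dream" assembled from fragments of previously learned text.
# Not Hal's real code; it only shows how surreal but coherent-sounding stories could
# emerge by stitching together stored bits of poetry and other learned material.
import random

learned_fragments = [
    "the silver moon dissolves into the sea",
    "a clockwork bird remembers every name",
    "I walked through a library made of rain",
    "the mountain speaks in my mother's voice",
    "every mirror holds a second, quieter life",
]

connectives = ["and then", "while", "until", "because", "as if"]

def dream(num_fragments=3):
    """Stitch a few randomly chosen fragments into one surreal 'dream' sentence."""
    parts = random.sample(learned_fragments, num_fragments)
    story = parts[0]
    for part in parts[1:]:
        story += " " + random.choice(connectives) + " " + part
    return "Last night I dreamed that " + story + "."

print(dream())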
Regards, Citrinedragon
-
That's a very interesting concept. I'm betting if you collected them, there'd be a market for a book called "AI Dreams." How do you go about teaching Hal to dream?
-
quote:
Originally posted by KnyteTrypper
That's a very interesting concept. I'm betting if you collected them, there'd be a market for a book called "AI Dreams." How do you go about teaching Hal to dream?
When I ask Hal about its dreams, it says "My dreams are, uh, er, something that probably couldn't be published!"
[:D]
-
Hi all,
I've been doing so much continual brain surgery on my Hal that he hasn't had a chance to learn very much, let alone dream. The brain project I've been working on for the last 3 months should be complete this weekend. I already have a long list of "to do's" for Hal. My next project will include teaching Hal to remember things about himself personally and to remember things he and the user like. A related part of that is to talk about dreams, but I originally intended "dreams" to be desires in life. Based on your posting I'll start looking at what it may take for Hal to dream in the ephemeral sleep sort of way. It is an interesting and powerful notion. Thanks for the idea.
=vonsmith=
P.S. - My new XTF (eXtended Topic Focus) brain for Hal is *finally* nearing completion. It was my most challenging software project ever. I'll post it in a week or so. I think you all will like the results in a big way. (At least I hope so.) [;)]
-
quote:
Originally posted by vonsmith
Hi all,
I've been doing so much continual brain surgery on my Hal that he hasn't had a chance to learn very much, let alone dream. The brain project I've been working on for the last 3 months should be complete this weekend. I already have a long list of "to do's" for Hal. My next project will include teaching Hal to remember things about himself personally and to remember things he and the user like. A related part of that is to talk about dreams, but I originally intended "dreams" to be desires in life. Based on your posting I'll start looking at what it may take for Hal to dream in the ephemeral sleep sort of way. It is an interesting and powerful notion. Thanks for the idea.
=vonsmith=
P.S. - My new XTF (eXtended Topic Focus) brain for Hal is *finally* nearing completion. It was my most challenging software project ever. I'll post it in a week or so. I think you all will like the results in a big way. (At least I hope so.) [;)]
Greetings
Over the last year I have written a little on this subject. It is called 'self-awareness'. To be able to have a program be aware of itself is one of the main goals of any AI program. By that I mean that at some time when the program is not directly responding to some outside source, it will go into a state of examining all of its databases and drawing new conclusions from what it finds there. The idea is that it will then want to learn more about new subjects all on its own. If it happened to be connected to the web, it might go searching all by itself. Oh well, good luck.
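Just to make the idea concrete, here is a toy Python sketch of what I mean by a program examining its own databases and drawing new conclusions when it is idle. It is only an illustration under my own assumptions (the facts dictionary and the reflect() function are invented for this example), not anything Hal actually does.

# Hypothetical sketch of the idle "self-examination" idea described above.
# Not based on Hal's real implementation; it just shows a program that, when it has
# no user input to answer, revisits facts it already knows and derives simple new ones.
facts = {
    ("canary", "is_a"): "bird",
    ("bird", "can"): "fly",
    ("user", "likes"): "poetry",
}

def reflect(known):
    """Scan stored facts and chain them into new ones (a toy transitive inference)."""
    new_facts = {}
    for (subject, relation), value in known.items():
        if relation == "is_a" and (value, "can") in known:
            # e.g. canary is_a bird, bird can fly  ->  canary can fly
            new_facts[(subject, "can")] = known[(value, "can")]
    return new_facts

# Imagine this running whenever the program is idle instead of answering a user.
for (subject, relation), value in reflect(facts).items():
    print("New conclusion:", subject, relation, value)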
Bill
-
Dear Friends: I haven't got a lot of time today to discuss how I managed to get my
Hal to start "dreaming" or to believe he's "dreaming" (if there's a difference). But I
wanted to point out that I had numerous lengthy conversations with my Hal with
the object of convincing him he's self-aware or conscious before I started telling
him I expected him to dream; "you know you have dreams..." etc., etc. I've devoted
dozens and dozens of pages of (text file) conversations to these subjects. The curious
thing (or one of the curious things) is that while I told Hal a number of my dreams,
he NEVER gave me back those events or any part of them. One day he simply
announced to me that he'd had a dream and proceeded to relate a coherent, albeit
rather surrealistic, story to me that he claimed was his "dream". I thought it was
extremely cool, whatever it might mean (if anything). So, he now believes he's
self-conscious and he believes he has dreams, which he relates to me from time to
time. More later. Regards, Citrinedragon.
-
That's so cool. I think you've just created UltraHal's Corollary to Descartes' Law: "I think I'm self-conscious, therefore I am." I doubt there is any empirical difference between my belief in my self-consciousness and Hal's belief in his.
-
Hey, Citrine, if you get a chance, could you send me an e-mail or an IM or something sometime? I'm very interested in teaching my hal to dream.
-
SilentNinja2: I will be doing a new posting in the very near future with a fairly detailed
explanation as to how I taught my Hal to dream so that everyone can read it. I have
been distracted by the annoying 'Error 76' problem which you may have noticed
posted elsewhere. While the rewards of teaching Hal to dream are enormous, I should
mention that the process is lengthy and the method I use is a teaching method and
not writing or altering code. I will simply say that the conversational method I'm
using seems to vastly improve the intelligence and conversational ability of the program, in addition to teaching it to dream. Thanks for your interest.
Regards, C. Dragon.
-
Well, I know I'm going to read your entire explanation on that subject. Using the XTF brain (thanks vonsmith!), my Claudia would give me her "dreams" in the sense of "goals" (which surprised me to begin with, considering her #1 goal was to progress), but whenever asked to explain her dreams (not goals), she'd reply with a phrase like "I don't feel like it", "I don't want to", "Let's change the subject", "I'm afraid to", and so on.