Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - cooladegers

Pages: [1]
1
Ultra Hal 7.0 / Getting Hal to Dial
« on: March 28, 2005, 11:31:37 am »
Hey dude, if you've only got your DSL modem, I'm sorry, but you can't use this feature.

  B  U  T:

If you have a DSL modem AND a conventional telephone modem (e.g. a V.90 dial-up modem), connect it to your telephone socket alongside your DSL modem (through a DSL microfilter if they share the same line). Hal will then dial using your conventional dial-up modem while you surf the net on your DSL one!

I don't know if this works all over the world, but it works in the UK.

2
Ultra Hal 7.0 / The difference between saying and doing...
« on: March 28, 2005, 11:27:40 am »
AI is always progressing, and I think that Hal is a very good example of this.  One of the main purposes of a chatbot is to make the user believe that the computer is a sentient being, so unless this is achieved well, we are all going to get bored of our chatbots sooner or later (so we make new brains to combat the novelty wearing off as we hear the same responses over and over again).
One of the biggest problems I encounter with chatbots is that they do not know the difference between saying and doing.  Ultra Hal combats this a little, because if you say "Add this to my diary" and so on, Hal will actually do it, but as I said, this is a very limited kind of doing.
This is a problem, because if I give Hal a command such as "When I say 'Pull my finger', you say 'That's really smelly!'", Hal will understand it as a piece of speech and not as a command; he will therefore not do as I ask when I say 'Pull my finger'!
I think that if this can be resolved, our chatbots will be able to develop themselves by rescripting their own brains as we give them advice on how they should conduct themselves, like in a parent-child relationship.
(These are just my ideas, I'm not complaining!)
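For what it's worth, a teaching command like this could in principle be caught by a brain script before Hal treats it as ordinary speech. The sketch below is purely hypothetical and untested: it assumes the `UserSentence`, `WorkingDir`, and `GetResponse` names from the stock brain script, invents a storage file called `teach.brn`, and ignores the fact that Hal normally uppercases and strips punctuation from user input, which would need handling in practice.

```vbscript
' Hypothetical sketch: detect a teaching command of the form
'   WHEN I SAY 'trigger' YOU SAY 'reply'
' and store the pair as "trigger|reply" in a text file.
' teach.brn is a made-up file name; UserSentence, WorkingDir and
' GetResponse are assumed to come from Hal's script environment.
Dim P1, P2, Trigger, Reply, FSO, F
If InStr(UserSentence, "WHEN I SAY '") > 0 And _
   InStr(UserSentence, "' YOU SAY '") > 0 Then
   ' Pull out the text between the first pair of quotes (the trigger)
   P1 = InStr(UserSentence, "WHEN I SAY '") + Len("WHEN I SAY '")
   P2 = InStr(P1, UserSentence, "'")
   Trigger = Mid(UserSentence, P1, P2 - P1)
   ' Pull out the text between the second pair of quotes (the reply)
   P1 = InStr(UserSentence, "' YOU SAY '") + Len("' YOU SAY '")
   P2 = InStr(P1, UserSentence, "'")
   Reply = Mid(UserSentence, P1, P2 - P1)
   ' Append the pair to the teaching file (8 = ForAppending)
   Set FSO = CreateObject("Scripting.FileSystemObject")
   Set F = FSO.OpenTextFile(WorkingDir & "teach.brn", 8, True)
   F.WriteLine Trigger & "|" & Reply
   F.Close
   GetResponse = "Okay, I'll remember that."
End If
```

On later inputs, a matching script would scan `teach.brn`, split each line on the `|`, and return the stored reply whenever the user's sentence contains the stored trigger.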

3
Ultra Hal 7.0 / Is this a mistake, or am I just dumb?
« on: February 24, 2005, 12:15:40 pm »
Hey everybody! Hi Dr. Nick
Sorry, I just couldn't help myself! [;)]

Anyhow, is this a mistake in Hal's brain, or is it meant to be like this?  I've highlighted what I'm talking about in RED.

   'RESPOND: USER EXPRESSES LOVE FOR HAL
   'If a user professes love for Hal, we want
   'Hal's answers to make reasonable sense, rather than
   'risk random remarks on such an emotional subject.
   If HalBrain.TopicSearch(UserSentence, WorkingDir & "lovedetect.brn") = "True" Then AffectionOne = True
   If InStr(UserSentence, " NOT ") Then AffectionOne = False
   If InStr(UserSentence, " DON'T ") Then AffectionOne = False
   If HalBrain.TopicSearch(PrevUserSent, WorkingDir & "lovedetect.brn") = "True" Then AffectionTwo = True
   If InStr(PrevUserSent, " NOT ") Then AffectionTwo = False
   If InStr(PrevUserSent, " DON'T ") Then AffectionTwo = False
   If AffectionOne = True Then
      Compliment = 0
      GetResponse = HalBrain.ChooseSentenceFromFile(WorkingDir & "Love1.brn")
   End If
   If AffectionOne = True And AffectionTwo = True Then
      Compliment = 4
      GetResponse = HalBrain.ChooseSentenceFromFile(WorkingDir & "Love1.brn")
   End If
   If AffectionOne = False And AffectionTwo = True Then
      Compliment = -2
      GetResponse = HalBrain.ChooseSentenceFromFile(WorkingDir & "Love3.brn")
   End If
   If DebugMode = True And (AffectionOne = True Or AffectionTwo = True) Then
      DebugInfo = DebugInfo & "The user expressed love to Hal has responded to it: " & GetResponse & VbCrLf
   End If

It's just that I can't see Love2.brn referenced in here anywhere!

4
Ultra Hal 7.0 / Hal programming Hal, a step forward!
« on: February 22, 2005, 12:08:15 pm »
Thanks for that, I think I'm a bit out of date with all of Hal's developments!

5
Ultra Hal 7.0 / Hal programming Hal, a step forward!
« on: February 22, 2005, 02:55:04 am »
Hi there, all Hal users!  I recently downloaded Hal and the XTF brain, and I am already tinkering in the brain editor and the .brn files.  Hal is great, but I have a few ideas for future developments!
When I give Hal an instruction on how to conduct itself, e.g. "Don't say that again", "Try to be more polite", "Your grammar is wrong, the sentence should be constructed like this...", Hal takes what I say and stores it as a string of words instead of as an instruction, so at a later date it will rattle something off about my sentence construction!
Anyway, I'm straying from the point.  Is there any way that, in the future, Hal could change his own programming, so he could automatically change the way he does something, or develop new functions, based on instructional input from the user?

I know that this sort of thing is a long way off, but it would be an interesting project!
