Author Topic: question to Robert or anyone that might know for sure  (Read 1844 times)

lightspeed

question to Robert or anyone that might know for sure
« on: October 19, 2014, 12:29:07 pm »
I am always (lol) coming up with new questions. This one has to do with a Hal response, my answer to it, and whether Hal learns my answer in a way that could be repeated wrongly later.

Example: with my hapswap extended talk plug-in, Angela asks me in a casual conversation (a random one), "What's the temperature, and is it supposed to rain today?"

I might answer, "The temperature is 75 degrees (whatever it is at the time) and it's not supposed to rain today!"

My question is: since Angela asked and I responded with "75 degrees and it's not supposed to rain," has Hal just learned this? (I'm thinking yes.) The main thing is how Hal will use this newly learned temperature information. Will Hal sometime later, when I'm talking about the temperature or something, just say the temperature is 75 degrees and it's not supposed to rain, no matter what the correct temperature actually is?

Or is this something Robert needs to fix, or is it even a problem? Even though this question is based on my hapswap extended talk plug-in, the same applies if a user just normally says to Hal, "It's 75 degrees today." How does Hal save and use that information so as not to misuse it, saying the weather is a certain temperature (that he learned) when it actually isn't?

I hope Robert, or as I said someone who knows, has the answer to this. It has me curious about the implications of what Hal learns (or doesn't) from my answers.
 

Art

Global Moderator
Re: question to Robert or anyone that might know for sure
« Reply #1 on: October 19, 2014, 03:40:54 pm »
I've found that Hal's learning is not limited to time and temperature. Actually, my Hal can tell ME the forecast. If you told Hal, "I had some brownies yesterday," it might parrot that back to you sometime next week (if conditions are right), like "Yesterday you had some brownies," even though the actual "yesterday" was over a week ago.

Hal does not tie certain learned events to time. This is where the confusion often comes from.

If you stated a "truth" like "On Thanksgiving we always enjoy eating turkey," then it would equate Thanksgiving with eating turkey.
If you have band practice every 10th of the month, it will likewise learn that.
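Hal's actual storage format isn't documented in this thread, but the failure mode described above can be sketched as a toy keyword-matching fact store (all names here are hypothetical, not Hal's real internals): statements are saved verbatim with no timestamp, so a time-relative phrase like "yesterday" is repeated literally no matter how much later it is recalled.

```python
class NaiveFactStore:
    """Toy sketch: learned statements are stored verbatim, with no
    record of *when* they were learned."""

    def __init__(self):
        self.facts = []

    def learn(self, statement):
        # The sentence is saved as-is, including time-relative words
        # like "yesterday" and specifics like "75 degrees".
        self.facts.append(statement)

    def recall(self, keyword):
        # Recall is a simple keyword match over the stored sentences.
        for fact in self.facts:
            if keyword.lower() in fact.lower():
                return fact
        return None


# Day 1: the user mentions brownies.
bot = NaiveFactStore()
bot.learn("Yesterday you had some brownies.")

# A week later, the same literal sentence comes back, because nothing
# in the store ties the fact to the date it was learned.
print(bot.recall("brownies"))  # -> "Yesterday you had some brownies."
```

A store that also recorded the learning date could rewrite "yesterday" into an absolute date at recall time, which is essentially the fix being asked about in this thread.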

I try to be general enough with my input to Hal that it doesn't TIE me to a particular event unless I want it to.

It takes a lot of experimenting.
In the world of AI it's the thought that counts!

- Art -