EDIT: Hal with fuzzy logic could be quite an interesting thing. I had thought about it while making my earlier comment, but I also think it would be quite difficult to do correctly: Hal would either need quite a complex inference system, or a lot of pre-defined rules (i.e. if the topic is X, measurement Y counts as short and measurement Z counts as tall, so an X that is YY tall is fairly short) for just about everything you can think of that gets measured (height, weight, cost, etc.), and rules like that would be quite slow to run in script.
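To make the "pre-defined rules" idea concrete, here is a minimal sketch in Python (not Hal's actual script language) of per-topic fuzzy labels. The topic names, thresholds, and labels are all invented for illustration; the point is that every topic needs its own rule entry, which is exactly the scaling problem described above.

```python
# Hypothetical sketch of per-topic fuzzy rules. Topic names, thresholds,
# and labels are invented for illustration.

def fuzzy_label(topic, value, rules):
    """Return a fuzzy label ('short'..'tall') for a value under a topic's rules.

    rules maps topic -> (low_threshold, high_threshold). Values at or below
    the low threshold are 'short', at or above the high threshold 'tall',
    with a graded region in between.
    """
    low, high = rules[topic]
    if value <= low:
        return "short"
    if value >= high:
        return "tall"
    # Degree of membership in 'tall', between 0.0 and 1.0
    degree = (value - low) / (high - low)
    return "fairly short" if degree < 0.5 else "fairly tall"

# One rule per topic: the same number (e.g. 160) means different things
# depending on whether you measure people or buildings.
RULES = {
    "person_height_cm": (150, 190),
    "building_height_m": (10, 100),
}

print(fuzzy_label("person_height_cm", 160, RULES))  # fairly short
```

The awkward part is that RULES would need an entry for every measurable topic Hal might ever discuss, which is why a general inference system would probably be the only maintainable route.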
quote:
Originally posted by ll420ll
If HAL isn't going to store the Q:A pair, then why does Hal store a question at all? Just to ask the question later and not remember that it equals the user response? And why store the answer in User_temp and not paired with the question?
The reason Hal stores a question is so that it can parrot it back to you. In many cases this simply adds to the conversational side of Hal, but if your response is written in just the right way (sometimes tricky, from what I've seen) it does indeed get paired with your response as a sort of Q:A pair. I haven't managed to look over everything in the main brain script, but I'm sure that functionality could be made more reliable.
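The "parrot it back, then pair it" flow can be sketched roughly like this. To be clear, this is not Hal's actual script, just a toy Python model of the behaviour described above, with invented names throughout:

```python
# Toy model (invented names, not Hal's real code) of the flow: store the
# user's question, parrot it back, and if the next input looks like an
# answer, save it as a Q:A pair for next time.

qa_pairs = {}
pending_question = None

def respond(user_input):
    global pending_question
    if user_input.endswith("?"):
        # If we already learned an answer, use it.
        if user_input in qa_pairs:
            return qa_pairs[user_input]
        # Otherwise remember the question and parrot it back.
        pending_question = user_input
        return "You tell me: " + user_input
    if pending_question is not None:
        # Pair the pending question with this response.
        qa_pairs[pending_question] = user_input
        pending_question = None
        return "I'll remember that."
    return "Okay."

print(respond("What is the capital of France?"))  # You tell me: What is the capital of France?
print(respond("The capital of France is Paris"))  # I'll remember that.
print(respond("What is the capital of France?"))  # The capital of France is Paris
```

The fragile part, as noted above, is deciding whether the next input really is an answer to the pending question; in this toy version anything that isn't a question gets paired, which is clearly too eager.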
quote:
You're never going to get an answer to a question by selecting a few chosen words and doing a search of an entire DB.
If I ask HAL "How old am I?" Hal never answers this correctly for me.
It ignores all but the word "old". It will respond with a random response using the word "old".
The way Hal uses its database does indeed have its issues, and while it is a very good A.I. it still has room for improvement. Because of the way Hal parses your messages, you will find it more responsive if you type longer messages. For a question like "How old am I?", without putting the answer directly into the QA brain you may find it difficult to get an answer. As Hal sees it, the word "old" is simply a modifier to "I", and with the word "How" at the beginning it sees the sentence as a question; but since it doesn't strictly know what "old" means in the way a human does, it doesn't recognize it as a question about your age. You could say "My age is 24" and later ask "What is my age?" and probably get an answer, but as I said, teaching Hal can be tricky in my experience.
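The age example can be illustrated with a crude keyword-matching sketch in Python. This is emphatically not Hal's real parsing code, just a minimal model of why matching on content words answers "What is my age?" but not "How old am I?":

```python
# Illustrative sketch (not Hal's actual parsing) of keyword-based recall:
# statements are indexed by their content words, so a question only hits
# if it shares a content word with something previously taught.

STOP_WORDS = {"my", "is", "am", "i", "what", "how", "the", "a"}

def content_words(text):
    """Lowercase the text and keep only the non-stop words."""
    return {w.strip("?.").lower() for w in text.split()} - STOP_WORDS

def teach(brain, statement):
    """Index a statement under each of its content words."""
    for word in content_words(statement):
        brain.setdefault(word, []).append(statement)

def ask(brain, question):
    """Return every stored statement sharing a content word with the question."""
    hits = []
    for word in content_words(question):
        hits.extend(brain.get(word, []))
    return hits

brain = {}
teach(brain, "My age is 24")
print(ask(brain, "What is my age?"))  # ['My age is 24'] -- shares the word 'age'
print(ask(brain, "How old am I?"))    # [] -- 'old' never appears in anything taught
```

"Old" and "age" mean the same thing to a human, but without some notion of word meaning (a synonym table, WordNet-style relations, etc.) the keyword match simply misses.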
quote:
Cefwyn, that's really cool that you're looking into this. Even if HAL didn't reload the Q:A table, at least it would know next time. I can see where one huge Q:A table would be a problem. Could Q:A tables be sorted by topic?
As far as I can tell at the moment, Hal is programmed to treat specific types of tables differently (i.e. brain tables, sentence tables, etc.). The mainQA table is a brain table, and I'm currently unsure exactly how that affects Hal's scripts, but I vaguely remember seeing some lines which pass the response into mainQA to get the reply, so I should be able to experiment with it and see what will work.
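On the topic-sorted tables question: conceptually it would just mean keying the Q:A storage by topic first, so a lookup only searches one small table instead of one huge mainQA table. A minimal Python sketch of the idea (class and topic names are invented, and this says nothing about how Hal's real table types would have to change):

```python
# Hypothetical sketch of splitting one big Q:A table into per-topic tables,
# as suggested in the quote above. All names are invented for illustration.

class TopicQABrain:
    def __init__(self):
        self.tables = {}  # topic -> {question: answer}

    def store(self, topic, question, answer):
        self.tables.setdefault(topic, {})[question] = answer

    def lookup(self, topic, question):
        # Only the (much smaller) table for this topic is searched,
        # instead of scanning one huge mainQA table.
        return self.tables.get(topic, {}).get(question)

brain = TopicQABrain()
brain.store("personal", "What is my age?", "You are 24.")
print(brain.lookup("personal", "What is my age?"))  # You are 24.
print(brain.lookup("movies", "What is my age?"))    # None (wrong topic)
```

The catch, of course, is that something still has to decide which topic an incoming message belongs to before the lookup, which pushes the hard problem one step earlier rather than removing it.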