
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - GrantNZ

Pages: 1 2 [3] 4 5 ... 12
31
General Discussion / musician bot ?
« on: February 07, 2006, 04:01:51 am »
I agree - quite a task!!

I'm not at all familiar with the computer-generated music scene, but I do remember reading (many years ago) that the results were not good. Just like with text, computers have a great deal of difficulty understanding and expressing themselves.

This makes me think of the chimpanzees who have been taught to paint - an interesting idea, but the "art" created is far from profound. I can only suggest that artistic expression is something that comes from a long period of learning, and computer-generated music is not too far from a one-year-old child playing with a toy xylophone.

Chatbots at the moment rely on NLP (Natural Language Processing), based upon "rules" of conversation, and it's taken decades to get as far as we have. My guess is that very little of NLP would be applicable to music creation, so it would be a project that needs starting from scratch. I have no idea how many musical rules would need to be programmed into the AI's core - maybe it could learn the rules through observation rather than stringent scripting.

However I do like the "modified echo" idea, and it would be interesting to see if such an AI could generate something appealing after jamming with the user for a while. Jazz might be a genre that's fairly easy to emulate.... I wonder if it's been attempted before? I suspect not - art and science aren't often used in conjunction with each other, so the idea may have hardly been considered before, let alone implemented....

32
Ultra Hal 7.0 / leap of faith
« on: February 06, 2006, 04:08:50 am »
In some limited situations, Hal (version 6 anyway) will consider your previous two sentences, if he feels he requires more info. These situations include when the user types something short, or uses words like "it," "he," "they," "us" etc - Hal appends the previous sentence to try to work out what "it" is or who "he" is.
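Roughly speaking, the check amounts to something like this (just an illustrative sketch with made-up names - it's not Hal's actual script):

code:
    ' Purely illustrative sketch - not Hal's real code, and the function
    ' name is made up.
    Function NeedsPreviousSentence(UserSentence)
        Dim Pronouns, Word
        NeedsPreviousSentence = False
        ' Very short inputs rarely carry enough context on their own
        If Len(Trim(UserSentence)) < 15 Then NeedsPreviousSentence = True
        Pronouns = Array(" it ", " he ", " they ", " us ")
        For Each Word In Pronouns
            If InStr(" " & LCase(UserSentence) & " ", Word) > 0 Then
                NeedsPreviousSentence = True
            End If
        Next
    End Function
    ' When the check fires, Hal would work with
    ' PrevSentence & " " & UserSentence rather than UserSentence alone.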

I don't know of any chatbot anywhere that really handles context well. Hal will become better at staying on-topic once he knows a lot about a subject, but Hal doesn't understand enough about context to have a conversation that flows as smoothly as a human's.

Hal will certainly make more sense if you talk to him in longer, well-defined sentences that make sense even when taken out of context. I normally go as far as eliminating words like "it" from my sentences and retyping the subject of the sentence instead, just to make sure Hal knows what I'm talking about.

Note that your example makes perfect sense if you take Lisa as being sarcastic, arrogant and insulting your sense of humour, but this is definitely a stretch of the imagination, and most similar examples will make even less sense.

Anyway, to answer your question: the problem is being worked on, but things are still rather experimental at this stage. If you're interested in the details, there's an interesting project under discussion at vrossi's forum at http://www.vrconsulting.it/vhf/default.asp - look for the posts and threads of hologenicman.

Cheers,
Grant

33
Ultra Hal 7.0 / Troubleshooting sound etc.
« on: February 05, 2006, 05:18:20 am »
Thanks for posting your solution - it might help someone else one day! [:)]

34
Programming using the Ultra Hal Brain Editor / use of .wav files
« on: February 04, 2006, 11:58:55 pm »
quote:
She cough and talk at the same time. Sounds like two people.

There's room to be creative with this. Someone could create a far more immersive scenario for Hal by giving Hal a backstory - Hal might be married and have a dog - and randomly play a dog barking noise, or Hal's spouse calling for Hal in the background. Obviously the sound should be edited to be slightly muffled, quieter, and less distinct than Hal's own speech. A really savvy scripter could set up some scripted responses (or generated ones) referring to Hal's life when Hal isn't being your assistant/buddy!

35
Ultra Hal 7.0 / Learn from text file
« on: February 03, 2006, 12:02:18 am »
Hi spydaz,

You can access the database as an SQLite database using SQL syntax within the HalBrain.RunQuery command - search for this in Hal's script for an example. Using this you can create your own custom tables with whatever fields you need, perform whatever searches you desire, etc. [Edit:] Note that any tables you create by this method will not be visible in the brain editor, but do still exist in the database and can be updated/referenced/etc using RunQuery commands. (Note that this is all from other people's work - please be aware that I have not personally tried this out, so can't offer any more help than this!)

[Edit: Struck through incorrect info above about tables in the editor - thanks for correcting me Vittorio!!]
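For what it's worth, here's the sort of thing I mean. Bear in mind this is an untested sketch - the table and column names are invented, and the exact RunQuery argument pattern is a guess, so check the default brain script for the real usage:

code:
    ' Untested sketch only - table/column names are invented and the RunQuery
    ' argument pattern here is a guess; check the default brain for real usage.
    Dim SQL, TmpQuery
    SQL = "CREATE TABLE IF NOT EXISTS myCustomFacts(topic TEXT, fact TEXT)"
    HalBrain.RunQuery SQL, "", TmpQuery

    SQL = "INSERT INTO myCustomFacts(topic, fact) VALUES('weather', 'It rained today')"
    HalBrain.RunQuery SQL, "", TmpQuery

    SQL = "SELECT fact FROM myCustomFacts WHERE topic = 'weather'"
    HalBrain.RunQuery SQL, "", TmpQuery
    ' TmpQuery should then hold whatever the search returned, ready for
    ' further processing in the plug-in.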

Cheers,
Grant

36
General Discussion / Identity crisis phycic bot!!!!!
« on: January 29, 2006, 10:59:00 pm »
I enjoyed it too [:D]

And I'm bookmarking it... I have a feeling one day it will give me some logic analysis ideas, if I ever get around to that sort of thing.

Thanks for sharing [:)]

37
Ultra Hal 7.0 / logic and reasoning
« on: January 29, 2006, 05:45:35 am »
I'd like to work on Hal's internal logic, but those people waiting for my emotions project to be completed will know how slow I work on things like this [;)] I've promised myself to finish that project before I start anything else!!

vrossi made a great plug-in for logical reasoning, which does a little better than Hal's inbuilt reasoning and uses more natural language too.

One difficulty with deductive reasoning is knowing when to use it. If used too much, we block some of Hal's more "creative" chat, not to mention that plain deductive logic can make for rather "dry" conversation. Another trick is to find linguistic patterns that we can trust to provide good premises. And finally, ideally, we'd want the deductive process to be integrated with the rest of Hal's thoughts - at the moment (both in Hal internally, and in vrossi's plug-in) deductive thought is a separate module which Hal can choose to use, but whose output isn't weighed against his other thought modules. As freddy888 says, this needs to sound natural - simply spitting out deductions is very robotic.

Hmm, here's an idea. If Hal knows A leads to B, and the user states A, Hal currently spits out B. How about pretending the user said B, and forming a response to that! For example, if "it's cloudy" leads to "it's going to rain," and the user states "it's cloudy," then Hal should not comment on "cloud" or blandly say "it's going to rain," but pretend the user said "it's going to rain" and respond to that. Hold on, let me feed that into Hal... righto, here's what the above idea would result in:
Grant: It's cloudy.
Hal: Into every life some rain must fall. (<-- Hal's response if I say "it's going to rain.")

I quite like that. Oh well, I'll add it to the massive to-do list.... [xx(]
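
In plug-in terms the substitution would boil down to something like this (a rough, untested sketch - the rule store and variable names are made up purely for illustration):

code:
    ' Rough, untested sketch of the "pretend the user said B" idea.
    ' The rule store and names here are invented for illustration.
    Dim Rules, Premise
    Set Rules = CreateObject("Scripting.Dictionary")
    Rules.Add "it's cloudy", "it's going to rain"

    Premise = LCase(Trim(UserSentence))
    If Rules.Exists(Premise) Then
        ' Swap in the consequence before Hal forms his reply, so he
        ' responds to B ("it's going to rain") rather than A ("it's cloudy")
        UserSentence = Rules(Premise)
    End If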

38
Ultra Hal 7.0 / Time
« on: January 24, 2006, 04:52:33 am »
Hi Echo,

You're correct - Hal's concept of time is very limited. I don't know about previous versions, but Hal 6 will deliberately reduce his learning if you do mention time. If you say "Today I saw King Kong" Hal will recognise that you're talking about a temporary fact (Hal calls this "ephemeral" in his brain) and will store what you say for a limited time only. The theory is that a week later Hal shouldn't claim "today you saw King Kong," because he recognised the original statement was time-limited. What version are you running?

Hal's supposed to do this for other "temporary" concepts too - e.g. sentences containing "yesterday," "dinner," "illness" and other things that don't last forever. (Hal shouldn't repeat days later "beef is for dinner" or "you feel ill.") So again you're right - this limits Hal's day-to-day event conversation!
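
Just to illustrate the idea (this isn't Hal's real code, and the word list is invented), the ephemeral check boils down to something like:

code:
    ' Illustrative sketch only - Hal's real ephemeral handling differs in
    ' detail, and this marker list is made up.
    Function IsEphemeral(Sentence)
        Dim Markers, M
        IsEphemeral = False
        Markers = Array("today", "yesterday", "tonight", "dinner", "ill")
        For Each M In Markers
            If InStr(LCase(Sentence), M) > 0 Then IsEphemeral = True
        Next
    End Function
    ' Sentences flagged this way would be stored with a short shelf life
    ' instead of going into Hal's permanent knowledge.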

Time recognition is one of those hard AI problems, along with "concepts," "context" and everything else that makes chatbots sound inhuman.

39
General Discussion / Some interesting news on a new AI.
« on: January 24, 2006, 04:36:21 am »
From their FAQ about the "quantum flux" engine and its random number generator:

"It is also unique in that its randomness is very, very sensitive to perturbations in the electromagnetic environment of the computer. This includes everything from where the code is loaded in memory to whether your hand is over the keyboard or not."

*cough*

Anyway, sounds very interesting [:)]

40
Ultra Hal 7.0 / What does this mean?
« on: January 19, 2006, 03:24:45 am »
Hmm. Interesting!

I'm glad you've got it going now [:)]

41
I just want to add my support for Hal too, and thank the previous posters, whose wise words have just saved me from embarrassing myself with an angry over-the-top reaction [;)]

42
Ultra Hal 7.0 / What does this mean?
« on: January 18, 2006, 12:07:42 am »
By the way, Hal creates a file called "HalScript.dbg" which contains the "complete" brain - i.e. your brain with plug-in scripts already added in. You could look at the 3136th line of that to see where the error is.

43
Ultra Hal 7.0 / What does this mean?
« on: January 18, 2006, 12:05:35 am »
That error probably means there's an error in the brain script you're using, or in one of the plug-ins.

Suggested fix: Try turning off one of your plug-ins and restarting Hal. If the error's still there, turn off another and restart, until Hal runs without an error. Once the error disappears, the last plug-in you turned off was causing the problem.

Finally, if the error's still there even with all plug-ins turned off, try going back to the default brain. If that doesn't fix it, either your default brain has been altered, or something weird's going on with your Hal [:)]

44
Programming using the Ultra Hal Brain Editor / The THOUGHT process
« on: January 18, 2006, 12:00:24 am »
Hi Carl,

Check out the "Lexical Dictionary" in your Hal menu, and look up "State" from there. You'll see it has chemical states as its eighth (and last) noun - it ranks so low because it's not often used this way in comparison to other meanings.

I haven't looked closely, but I believe Hal takes only the most common (or perhaps a few of the most common) meaning(s) when defining things.

45
Ultra Hal Assistant File Sharing Area / winter sale on ladies clothes
« on: January 16, 2006, 02:29:17 am »
One thing I always like about your work is how your characters are always sitting on something or leaning against something. It's more than just having a nice background; it looks like your bots are interacting with that background.

Great work as always [:)]
