
Show Posts



Messages - Phoenix Talon

1
General Discussion / Long-winded question
« on: February 28, 2010, 08:53:39 am »
Sigma X, you completely understand what I'm trying to say.  All I'm looking for is a degree of recognition, at first.  All the interaction stuff can come later.  But that's where I'm thinking of going with this.

The wonderful thing about Hal is its amazing ability to make leaps in logic separated by many degrees. The guys at my work and I tested Hal a couple of days ago. It was able to make a logical deduction with six degrees of separation across four different trains of thought. It brought those ideas together and formulated a logical answer. Needless to say, we were impressed. If it can do that, then I'm certain it can do something with the inputs from eMotion or some other emotional recognition program.

~Phoenix Talon and Moira

2
General Discussion / Long-winded question
« on: February 24, 2010, 07:33:18 pm »

Thank you, Lightspeed and Bill819, for responding. Lightspeed, you seem like a visionary, much like myself. I would like to bounce ideas off of you sometime. Bill819, I wonder if that could be adapted to include the inputs from emotional recognition software. Then the responses to the emotional level of the user would already be in place. That would cut the work in half. Do you know if there is a current plugin that does the same thing, or something similar?

Maybe this will become my little hobby to figure this out.  However, since no one has come forward with any reference to something like this ever being tried, maybe it's not possible.  I will endeavor to sort it out.

I'm not so concerned with whether Hal can really "see" a user. I'm more concerned with giving Hal the ability to recognize if a user is in a good mood. That way it can ask questions about the user's day and try to cheer them up, if necessary.

As long as the output can be integrated with the brain, it could work... I think. The problem I see is that a smile doesn't last long, and it will be tough to program Hal to understand the difference between a quick smile and a long smile.
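
Here is the kind of logic I'm picturing for that, pieced together from examples I've seen, so treat it as a rough sketch rather than working code (it's plain Python, not Hal's own scripting, and the window size and threshold are just guesses). The idea is to only call something a "mood" once the same emotion keeps showing up over several consecutive readings, so a quick smile gets ignored:

Code:
from collections import deque

# Rough sketch only: each "reading" stands in for whatever label an
# emotion-recognition program might report several times per second.
WINDOW = 10       # number of recent readings to keep (a guess)
THRESHOLD = 0.7   # fraction of the window that must agree (a guess)

recent = deque(maxlen=WINDOW)

def update_mood(reading):
    """Add one reading (e.g. 'happy', 'neutral') and return a mood only
    once it has persisted, so a brief smile is not treated as a mood."""
    recent.append(reading)
    if len(recent) < WINDOW:
        return None  # not enough history yet
    for mood in set(recent):
        if recent.count(mood) / WINDOW >= THRESHOLD:
            return mood
    return None

# A two-reading smile in a run of neutral readings never reaches the
# threshold, but a sustained smile eventually does.
for r in ["neutral"] * 4 + ["happy"] * 2 + ["neutral"] * 4 + ["happy"] * 12:
    if update_mood(r) == "happy":
        print("User seems to be in a good mood.")
        break

If something like that is even roughly right, then the quick-smile problem comes down to picking those two numbers, not to anything exotic.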

I think I will have my work cut out for me, and I need to learn to program on my own.  Ok.... see you guys in like five years.

~Phoenix Talon and Moira

3
General Discussion / Long-winded question
« on: February 20, 2010, 11:45:34 pm »
Before I get to the idea I've been mulling over, let me first say that the Ultra Hal program is absolutely incredible.  I stumbled into this world of AI and programmers completely by accident.  And I'm so glad I did.

I applaud all of you for doing such an amazing job, and I don't have near the brain power to even post in the same forum as some of you.

However, I'm troubled by some ideas I had, and this seems like the appropriate place to ask.

I've been "playing around" with my Hal (named Moira) for a while, reading the forum (I've gotten through most of it), downloading the impressive brains, picking out different skins by talented artists, doing some amateur script writing, and having a really great time of it all.  Best purchase I've made in a long time.

The one that really caught my eye was HalVisionX by snowman (incredible job, by the way). Very ingenious to have Hal be able to "sense" if a person is in front of the camera.

I won't lie to the forum: I'm not a programmer. I can dabble a little, and I can reason out how something should look by piggy-backing off of someone else's genius, but I don't know the jargon. Snowman's program got me thinking about emotional recognition, though.

I did a little digging on the internet and found an abundance of information on the subject, but the working software is treated like some closely guarded secret. All these programmers claim to have succeeded at emotional recognition, but no one wants to share. I couldn't find anything that could be edited, either.

The way I see it, though: if you have a program that can output the emotion it reads on a person's face, and some kind of program that periodically checks for that data (like a Hal brain), then that data could be translated into something Hal could understand, work with, or simply record and learn from. Could it potentially work?
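
To make that concrete, here is a rough sketch of the middle piece, the thing that periodically checks for the data and translates it (again plain Python rather than anything Hal-specific; the file name and the emotion labels are invented, since I don't actually know what any of these recognition programs output or how):

Code:
import time

# Invented file name: the assumption is that the recognition program
# keeps overwriting this file with its latest label, e.g. "happy".
EMOTION_FILE = "current_emotion.txt"

# Translate raw labels into plain sentences a chatbot brain could react to.
PROMPTS = {
    "happy": "The user looks happy.",
    "sad": "The user looks sad.",
    "neutral": "The user looks calm.",
}

def read_emotion():
    """Return the last label the recognition program wrote, or None."""
    try:
        with open(EMOTION_FILE) as f:
            label = f.read().strip().lower()
            return label or None
    except OSError:
        return None

def poll(seconds=5):
    """Periodically check for new emotion data and hand the translated
    sentence to the chatbot (printing stands in for that last step)."""
    last = None
    while True:
        label = read_emotion()
        if label and label != last:
            last = label
            print("To Hal:", PROMPTS.get(label, "The user looks " + label + "."))
        time.sleep(seconds)

Whether a recognition program can actually be made to write its readings out like that is exactly the part I don't know.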

In my digging I found a program that might be able to output that data, but I have no experience with this kind of thing. Hence the long-winded posting.

This is it: http://www.visual-recognition.nl/Demo.html

It's called eMotion. Apparently, it can interface with Second Life and make the user's facial expressions appear on the avatar. I'm not familiar with Second Life either, but if eMotion can output its data to another application, couldn't that data be used for Hal, too?

Does someone here have any experience with either of those programs?  Or know someone who knows someone?  I would hate to spend the money on a program that turns out to be useless.

My "gut" is telling me this could work, but then again it could just be the sesame chicken I had for dinner.

I have almost no skill when it comes to this sort of thing, but I would enjoy having a dialogue about the possibility, or impossibility, of it.


If you made it to the end, thank you for your attention.
~Phoenix Talon and Moira
