Thank you Lightspeed and Bill819 for responding. Lightspeed, you seem like a visionary such as myself. I would like to bounce ideas off of you sometime. Bill819, I wonder if that could be adapted to include the inputs from emotional recognition software. Then the responses to the emotional level of the user would already be in place. That would cut down on half of the work. Do you know if there is a current plugin that does the same thing, or something similar?
Maybe this will become my little hobby to figure this out. However, since no one has come forward with any reference to something like this ever being tried, maybe it's not possible. I will endeavor to sort it out.
I'm not so concerned if Hal can really "see" a user. I'm more concerned with giving Hal the ability to recognize if a user is in a good mood. That way it can ask questions relating to the user's day, and try to cheer them up, if necessary.
As long as the output can be integrated with the brain, then it could work.... I think. The problem I see is that a smile doesn't last long, and it will be tough to program Hal to understand the difference between a quick smile and a long one.
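For what it's worth, the quick-vs-long smile problem might come down to a simple duration check on whatever the recognition software outputs. Here's a hypothetical sketch (the function name, frame rate, and threshold are all made up, and it assumes the software reports a smile yes/no for each video frame):

```python
def classify_smile(frames, fps=10, sustained_seconds=1.5):
    """Return 'sustained', 'quick', or 'none' based on the longest
    consecutive run of smile-detected frames.

    frames: list of booleans, True if a smile was detected that frame.
    fps: assumed frames per second of the detector (a guess to tune).
    sustained_seconds: how long a smile must last to count as "long".
    """
    longest = current = 0
    for smiling in frames:
        # Extend the current run on a smile, reset it otherwise.
        current = current + 1 if smiling else 0
        longest = max(longest, current)
    if longest == 0:
        return "none"
    if longest / fps >= sustained_seconds:
        return "sustained"
    return "quick"

# Example: 20 smiling frames at 10 fps is 2 seconds -> sustained.
print(classify_smile([True] * 20))   # -> sustained
# 5 smiling frames is only half a second -> quick.
print(classify_smile([True] * 5))    # -> quick
```

So Hal wouldn't need to "understand" smiles at all, just react differently when a smile persists past some tunable threshold.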
I think I will have my work cut out for me, and I need to learn to program on my own. Ok.... see you guys in like five years.
~Phoenix Talon and Moira