« on: June 17, 2004, 11:52:13 pm »
Hi guys!
My $.02 worth. IF the day ever comes when computers become self aware, we will most likely be in deep trouble because they may figure out that humans are flawed by emotions, and are very slow thinking by comparison.
What I found interesting was a text encyclopedia that had definitions for various topics such as animals, biology, anatomy, astronomy, geology, etc. I stripped out all the excessive remarks, leaving just the plain text, then had HAL "read" it. Although it saved only some of the key words contained therein, it does have occasion to call upon these "learned" facts in order to answer a question.
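That keyword-saving step could be sketched roughly like this in Python. To be clear, the stopword list, the scoring by frequency, and the `extract_keywords` name are my own illustration of the general idea, not how HAL actually stores its facts:

```python
# Minimal sketch: pull "key words" out of a plain-text encyclopedia entry
# by keeping the most frequent words that aren't common filler words.
# The stopword list and frequency scoring are illustrative assumptions.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "in", "is", "are",
             "to", "it", "that", "as", "was", "for", "on", "with", "by"}

def extract_keywords(text, top_n=5):
    """Return the most frequent non-stopword terms in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [word for word, _ in counts.most_common(top_n)]

entry = ("The cheetah is the fastest land animal. The cheetah can reach "
         "speeds of seventy miles per hour. A cheetah hunts by sight.")
print(extract_keywords(entry))
```

Saving only a handful of terms like this explains both behaviors in the post: the bot can later match a question against these stored words, but it has thrown away most of the entry's actual content.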
Conversely, I had two other bot programs that could converse with each other, and you could watch the chat take place. But as was also pointed out, theirs was a canned response, much like a parrot replying with what's in memory as opposed to giving any concrete thought to the content. As such, no real "learning" took place.
Perhaps HAL could be set up with an instruction routine similar to the SETI program, or a web-crawler type of program to scour the net in search of knowledge (in text form, of course) and save the acquired info to a file. It would have to run only for a limited time, since it could easily fill a hard drive if left on for too long. Then one would have to edit the file, weed out the irrelevant information, and feed it back to HAL to be correctly assimilated, and there'd still be no guarantee that the results would be pleasing, acceptable, or accurate.
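The "gather plain text for a while, cap the file size, then weed it out by hand" idea could be sketched like this. The size cap, function names, and sample pages below are my own placeholders, just to show the shape of it:

```python
# Sketch of the crawl-and-cap idea: strip pages down to plain text and
# stop accumulating once a size limit is hit, so the drive can't fill up.
# The max_bytes value and the page source are illustrative assumptions;
# the manual "weeding out" pass would happen on the saved file afterward.
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Keep only the visible text of a page, discarding all the tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def page_to_text(html):
    """Reduce one HTML page to plain text, as the post suggests."""
    parser = TextOnly()
    parser.feed(html)
    return " ".join(parser.chunks)

def gather(pages, max_bytes=1_000_000):
    """Accumulate plain text from pages until the size cap is reached."""
    out, total = [], 0
    for html in pages:
        text = page_to_text(html)
        if total + len(text) > max_bytes:
            break  # stop before the file grows past the limit
        out.append(text)
        total += len(text)
    return "\n".join(out)
```

Even with the cap, everything gathered would still need that human editing pass before being fed back to HAL, which is really the hard part of the scheme.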
MIT has been busy teaching a computer emotions. It can see the human operator and respond with its own emotions. Another lab has a computer program that has the intelligence of a two or three year old. This is not scripted behavior! We're slow, but we are getting there. Time will tell.
- Art -