Ultra Hal 7.0 / Let Hal Learn to its fullest concept!
« on: August 26, 2004, 03:12:50 pm »
Hmm, going over what you said vonsmith, I see what you're saying, but it seems a bit sketchy to me to call that intelligence, however high a computer so equipped might score on an IQ test.
Now, it's important to admit that the only sure thing we can know is that we, ourselves, exist. However, even so, there are generally things we recognize en masse. For example, you can tell a computer that "Horses are fun," and the computer can tell you that horses are equines, have four legs, trot, canter, and gallop, etc. etc., and still be no closer to knowing what a horse is than it was before it gathered that information. It would not recognize one if it saw one.
When you hear that France is a country, France may not actually exist (can't tell you for sure, after all, never been there), but you have a frame of reference for what a country is. You live in a country. Even if you've never seen all of your country, if, for example, you're in the US, you've probably seen most of your state or county. By recognizing what a county physically is you can imagine, somewhat successfully, what a country would be like.
An AI has no frame of reference for *anything*. For the most part, it exists solely within itself. It may be very, very good with patterns, but in the end all it is doing is mixing and matching bits of code. Without being able to interact with the outside world, it isn't really learning or adapting at all; after a certain point, it's just doing more complex repetitions of the same patterns over and over again. Now, one could say that's the same thing we do with our wetware, and that would be true, but I guess an AI is just like a fractured component of consciousness, not a real intellect. Now, if you took an AI that can hear things and combined it with one of those little automatic vacuums, you might be able to start getting somewhere.
If the AI can say "Horses are supposed to be fun" and recognize, at least, through some method, the defining features of a horse any time it runs across one, it is no longer just repeating the same patterns over and over in its mind but rather learning to identify with objects in the world around it. It still might not have a real sense of what fun is, but at least it could count the horse's legs and recognize, perhaps, the texture of its coat. Such an integral part of what we consider learning is being able to identify with objects beyond ourselves, and without that ability, I just don't see any way in which an AI is going to have real intellect.
It's just like the way I used to do math. I knew one plus one equaled two and so on and so on, more and more complex, but once I got into algebra I forgot that math had any practical application. It wasn't until I learned to relate math to physical things again that I was able to do more with it than just go through the motions. Of course, it's important that complex logic CAN be applied to real life, so it's worth learning, but... I think it's a better goal to get an AI to be able to independently apply its knowledge than to just get it really smart. I think those goals go hand in hand, and that maybe you can't really have one without the other, either.