


Messages - Calico

Pages: [1]
1
Ultra Hal 7.0 / Let Hal Learn to its fullest concept!
« on: August 26, 2004, 03:12:50 pm »
Hmm, going over what you said, vonsmith, I see what you're saying, but it seems a bit sketchy to me to call that intelligence, however high a computer so equipped might score on an IQ test.

Now, it's important to admit that the only sure thing we can know is that we, ourselves, exist. However, even so, there are generally things we recognize en masse. For example, you can tell a computer that "Horses are fun," and the computer can tell you that horses are equines, have four legs, trot, canter, and gallop, etc. etc., and still be no closer to knowing what a horse is than it was before it gathered that information. It would not recognize one if it saw one.

When you hear that France is a country, France may not actually exist (can't tell you for sure, after all, never been there), but you have a frame of reference for what a country is. You live in a country. If you've never seen all of your country, if, for example, you're in the US, you've probably seen most of your state or county. By recognizing what a county physically is you can imagine, somewhat successfully, what a country would be like.

An AI has no frame of reference for *anything*. For the most part, it exists solely within itself. It may be very, very good with patterns, but in the end all it is doing is mixing and matching bits of code. Without being able to interact with the outside world, it isn't really learning or adapting at all; after a certain point, it's just doing more complex repetitions of the same patterns over and over again. Now, one could say that's the same thing we do with our wetware, and that would be true, but I guess an AI is just like a fractured component of consciousness, not a real intellect. Now, if you took an AI that can hear things and combined it with one of those little automatic vacuums, you might be able to start getting somewhere.

If the AI can say "Horses are supposed to be fun" and recognize, at least, through some method, the defining features of a horse any time it runs across one, it is no longer just repeating the same patterns over and over in its mind but rather learning to identify with objects in the world around it. It still might not have a real sense of what fun is, but at least it could count the horse's legs and recognize, perhaps, the texture of its coat. Such an integral part of what we consider learning is being able to identify with objects beyond ourselves, and without that ability, I just don't see any way in which an AI is going to have real intellect.

It's just like the way I used to do math. I knew one plus one equaled two and so on and so on, more and more complex, but once I got into algebra I forgot that math had any practical application. It wasn't until I learned to relate math to physical things again that I was able to do more with it than just go through the motions. Of course, it's important that that complex logic CAN be applied to real life, so it's worth learning, but... I think it's a better goal to get an AI to be able to independently apply its knowledge than to just get it really smart. I think those goals go hand in hand, and that maybe you can't really have one without the other, either.

2
Ultra Hal 7.0 / Let Hal Learn to its fullest concept!
« on: August 25, 2004, 11:01:37 pm »
I'm twenty-one years old and I've never been tested for my IQ, but I think I can see pretty clearly what's missing from jz1977's scenario. When it comes down to it, the way wetware (organic stuff that handles information processing) differs from hardware is mostly the amount of information it can store and the number of tools it has to gather information through. Both systems can only operate as long as there is continuous power (hardware is actually superior in that it does not decay when power is cut off), and in general they have similar capabilities but different strengths.

The problem with Hal is that, even if you had unlimited disk space, he doesn't have any means by which to independently gather information. He also doesn't have any way to verify information he receives. Think of all the ways you verify a cracker is a cracker. You can see that it fits the general shape and color of a cracker, feel that the texture is cracker-like, smell it and smell that it's a cracker, taste it and taste that it's a cracker, and crunch it to hear that cracker sound. You're receiving information from five separate information-gathering systems for any given object, and even if you lose access to one or two, you've still got three or four left.

Not only can Hal NOT independently gather data, it has to trust the very fallible input of a single human operator. No matter how good its logic is, it simply cannot corroborate evidence to verify or disprove what it has been told. Now, through the experiments of someone on the board, we can see that if Hal is fed abstract concepts about dreams it can at least simulate dreaming on its own (or possibly actually dream); the problem is that, for the most part, while it may assign value to various words, it has no way to give them meaning. You can talk to Hal about water all day, that it flows, it rushes, it crashes, it's cool and refreshing... and Hal can associate all these things with "water," but frankly Hal still has no concept of what water is. If it was suddenly given senses, it would not immediately recognize water!

What a healthy AI needs is at least three ways of gathering information so it can verify its surroundings for itself, and the more tools the better. They might not even have to resemble our human senses. An AI could work with infrared, sonar, and a thermometer... the important thing is that it can make judgements based on its own observations. Until an AI is so equipped, the most it can do is play delightful word games.
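The idea of corroborating a claim across several independent channels could be sketched like this. This is just an illustration of the majority-agreement idea, not anything Hal actually does; the sensor names and readings are invented for the example.

```python
# Hypothetical sketch: an agent accepts a claim about an object only
# when a majority of its independent "sensors" agree with the claim.

def corroborate(readings, claim, threshold=2):
    """Return True if at least `threshold` channels support the claim."""
    support = sum(1 for value in readings.values() if value == claim)
    return support >= threshold

# Three independent channels observing the same object:
readings = {
    "infrared": "water",     # cool surface signature
    "sonar": "water",        # echo profile of a liquid
    "thermometer": "rock",   # a noisy or mistaken channel
}

print(corroborate(readings, "water"))  # two of three channels agree: True
```

Even with one faulty channel, the agent can still verify the claim for itself, which is the point of having more than one way to gather data.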

Don't think that an AI that could make judgements on a basis other than the senses we identify with would have any less extensive a view of the world than us. Out of all the ways to measure things, we can only employ five (or six, if you're lucky). Being able to recognize water for water through any system of measurement, and being able to tell water from land by any system of measurement, is real knowledge, and real experience. As for when consciousness arises... That's something I don't know we'll ever be able to answer. However, I can say with absolute certainty that making chatbots that are more and more fantastic with linguistic rules isn't getting us that much closer to making a fully reasoning being. At best, if these programs are in any way conscious, it's like birthing a bunch of extreme cripples. Then again, that wouldn't affect their consciousness, since they aren't quite good enough to figure out what it is they're missing.

[Edit]

Now that I'm not writing this right before bed, I guess I can clarify that real intelligence is being able to go out into the world, meet with an object, and pass a judgement on it based on observations of other objects you have encountered in the past. While even the Hal brain appears capable of this, being able to evaluate new sentences by sentences it has observed in the past, we simply have no proof that this is practical knowledge. Has anyone who has a microphone hooked to this device tried teaching it to identify abstract sounds, like the sounds a cat makes? If it can apply its brain to a real variety of data, not just linguistic data, but BASED on its linguistic data system, then I'd be willing to say it's getting closer to being capable of applicable intelligence.

No matter how long you talk to your Hal bot, if it's not capable of independently gathering data... Well, it's just becoming more and more your mirror image, just like two Hal bots talking infinitely would become each other's mirror images.

I'm having a hard time putting this concept into words... so lemme also try putting it this way: getting Zaba to understand that when it hears water being poured, it is identifying a thing that has all the elements it applies to water, would be the real Helen Keller moment it needs to start actually learning about the world.
