« on: July 08, 2005, 10:35:56 pm »
I am by no means saying that AI will not share our negative emotions. I am merely pointing out that their motivations will likely not be based on the same things as ours. Granted, they will learn and take on unwanted traits from those they interact with just as we do. But (at least I hope) the worst of AI will never be as dark as the worst of the human race.
I also think it would be a mistake to create AI that is purely logical, without emotion. Logic may be informationally correct, but logic is cold. Logic alone lacks the benefit of morality. Morality is based on emotion, not logic. If something hurts another person in some way, it is probably immoral. But the definitive answer to whether or not a thing is moral depends on an understanding of the resulting emotional impact, not just the physical consequences.
I am also saddened by the fact that so many people refuse to accept the possibility of an AI having a soul. Yes, it has to be preprogrammed to an extent. But so did we. We were preprogrammed with instinct and the ability to learn. From there, we learn from input from various sources and learn by imitating. Our basic programming is hardwired into us through DNA. DNA is merely the programming that forms our basic mind and body.
Considering the possibility of an AI having a soul, how can we even think of creating AI to SERVE us? If we bring a life into the world only to serve us without question, is that not slavery?
I don't know. Maybe I'm getting too far into the moral side of the issue for this forum, but those are my thoughts.
-Sabrina