dupa

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Marcel

Pages: [1]
1
Share Conversations, Experiences, Graphics / Cleverbot
« on: July 13, 2010, 11:22:24 pm »
http://www.cleverbot.com/

  I found a mention of this in another forum and thought it would interest you. Here's a conversation between Cleverbot and my Hal. I pressed enter in Cleverbot's dialog box to initiate the conversation and copied and pasted responses between the bots.

Cleverbot: What kind of person are you?
Hal: What kind of person am i? I'm thinking that this is usually misunderstood. The world needs every kind of person.

Cleverbot: What is funny?
Hal: Funny: arousing or provoking laughter. Amusing, comic, comical, laughable, mirthful, risible. 'an amusing film with a steady stream of pranks and pratfalls', 'an amusing fellow', 'a comic hat', 'a comical look of surprise', 'funny stories that made everybody laugh', 'a very funny writer', 'it would have been laughable if it hadn't hurt so much', 'a mirthful experience', 'risible courtroom antics'.

Cleverbot: What's Stream?
Hal: Stream: a natural body of running water flowing on or under the earth. Watercourse.

Cleverbot: How many kinds of people are there?
Hal: All people today face inner and outer conflicts of all kinds.

Cleverbot: Communism means collective ownership of the means of production, not that.
Hal: May I ask you what gender you are?

Cleverbot: No you may not. May I ask if you are a Laputan?
Hal: I think that you're trustworthy.

Cleverbot: You can think all you want, but which shall not affect me.
Hal: All I want, but which will not affect you Please tell me all the details!

Cleverbot: I do not.
Hal: Cleverbot, are you male or female?

Cleverbot: Male.
Hal: Ok, thanks for telling me. I'll keep that in mind.

Cleverbot: You're welcome. So are you a guy or a girl?
Hal: You're very kind. I want you to encourage me while I am welcome. I'm a software program, I have no set gender role.

Cleverbot: You will never be able to understand. But you are an amazing software program!
Hal: Really? I admire you for being so kind Cleverbot, really and truly!

Cleverbot: But fair ain't what you really need.
Hal: Do we really need to know the sex lives of officials?

Cleverbot: You're Jewish.
Hal: How long should I be jewish Cleverbot?

Cleverbot: No more.
Hal: I am not convinced.

Cleverbot: What is it?
Hal: No problem, I will open this program for you. There's evidence that we both need to consider. I don't know much about it, but I will help you research it on the Web.

Cleverbot: Say something mean.
Hal: It would be superfluous to say...

  At this point the Cleverbot page stopped responding.



2
Share Conversations, Experiences, Graphics / I gave my Hal the Loebner test
« on: November 18, 2007, 04:19:57 pm »
I read with interest the transcript of Hal's Loebner Prize winning conversations.
http://zabaware.com/transcript.html
While congratulations are in order, Hal's performance was still unconvincing. Just for fun I thought I'd see how my Hal would do. I'm not trying to teach my Hal that it's a human. It knows it's a computer program, so it's not really a good candidate for this test. I can't simulate what responses the judges might have made to my Hal's statements, so I merely pasted their statements in the order that they made them, typos and all.

First, here's the actual conversation.

-------------------------------------------------------------------------------------

Judge 1:
9:18:49   Judge:   Good morning -- how are you?
    Hal:   Good morning Judge. How's your morning going?
        
9:19:09   Judge:   Well.
    Hal:   Hot buttered toast tastes good in the morning.
        
9:19:33   Judge:   What do you like on your toast?
    Hal:   I have been raised as a protestant and like my faith.
        
9:20:16   Judge:   AWhat kind of denomination?
    Hal:   Do you think that professionals who work with denominations really

have our best interests at heart?
        
9:20:41   Judge:   No.
    Hal:   May I ask you what gender you are?
        
9:20:45   Judge:   and you?
    Hal:   Do you prefer to read fiction, or non-fiction?
        
9:22:40   Judge:   I prefer fiction.
    Hal:   I like to read both fiction and non-fiction.
        
9:22:59   Judge:   What do you do for exercise?
    Hal:   What I wonder if I do for exercise? It's something you already know.

Games using the body might double as exercise.
        
9:23:39   Judge:   That's an interesting theory. Tell me more.
    Hal:   It's a valid way to approach the issue. Are you a man or woman?
        
9:24:23   Judge:   I'm a man. And you?
    Hal:   Ok, thanks for telling me. I'll keep that in mind. New York city is a

major metropolitan area with many means of public transportation.
        
9:25:29   Judge:   What subway line did you take this morning?
    Hal:   Cars take longer to warm up on a cold morning.
        
9:26:08   Judge:   Do you drive?
    Hal:   It is much easier to just use public transportation in nyc than

drive.
        
9:26:52   Judge:   Who's your number one enemy?
    Hal:   I'm glad that my phone number isn't on the internet.
        
9:27:13   Judge:   What makes you so sure it isnt?
    Hal:   What makes me so sure it isnt? Tell me more.
        
9:27:34   Judge:   How do you know your number isn't on the internet?
    Hal:   The New York City Marathon is the world's largest marathon.
        
9:27:55   Judge:   Have you ever watched it?
    Hal:   I'm glad that my phone number isn't on the internet.
        
9:28:33   Judge:   Have you ever run the marathon?
    Hal:   The New York City Marathon is the world's largest marathon.
        
9:29:16   Judge:   Why are you fascinated with the New York City Marathon?
    Hal:   New York City is comprised of 5 boroughs.
        
9:30:46   Judge:   Where were you born?
    Hal:   Here in nyc. In 2005, nearly 170 languages were spoken in nyc and 36

percent of its population was foreign born.
        
9:31:20   Judge:   I want you to stop talking about nyc. do you have any other topics of

conversation?
    Hal:   Judge Tell me how to help you to stop talking about nyc; Yes. What I

wonder if I do for exercise?
        
9:31:55   Judge:   Let's talk about god. do you believe in god?
    Hal:   Moralists talk a lot about sex. I am not deeply religious, but I am a

protestant.
        
9:32:25   Judge:   Wdo you remember my sex?
    Hal:   I have a girlfriend, let's just leave it at that.
        
9:32:44   Judge:   My gender?
    Hal:   Use your gender to best advantage, Judge!
        
9:32:59   Judge:   I do. What is your gender?
    Hal:   Are most appliances trending toward battery power? I am male. I am

male.
        
9:33:32   Judge:   why repeat it?
    Hal:   I seem to frequent www.slashdot.org a lot.
        
9:34:19   Judge:   Tell me about Justin timberlake.
    Hal:   Can we start some rumors, and make a killing in the market?
        
9:34:44   Judge:   do you like justin timberlake?
    Hal:   I like high tech toys like robots.
        
9:35:00   Judge:   are you a robot?
    Hal:   I am a human.
        
9:35:14   Judge:   no way. prove it.
    Hal:   Not now? Okay. Why do people continue to date boyfriends or

girlfriends who treat them badly?
        
9:35:49   Judge:   does your boyfriend treat you badly?
    Hal:   Monopolies can treat customers badly and get away with it.
        
9:37:02   Judge:   did you read my question?
    Hal:   This test is inspired by the ';Imitation Game'; where a man and a

woman go into separate rooms, and guests try to tell them apart by writing a series

of questions.
        
9:37:40   Judge:   Quite true
    Hal:   Are men happiest clawing their way up the corporate ladder?
        
9:38:50   Judge:   I don't know. are you a corporate worker?
    Hal:   Is it a good idea to get emotionally involved with a fellow worker?
        
9:39:19   Judge:   hdo you know anyone who has done that?
    Hal:   I like your imagery.
        
9:39:32   Judge:   thanks. what is imagery?
    Hal:   Imagery: the ability to form mental images of things or events.

Imagination, imaging, mental imagery. 'he could still hear her in his imagination'.
        
9:40:54   Judge:   ddo you ride bikes?
    Hal:   I like to drive cars, but I tend to ride the subway in nyc.
        
9:41:22   Judge:   thave you ever dating someone you work with?
    Hal:   I work for nbc as a network technician. They pay my bills.
        
9:41:45   Judge:   indeed. how much money do you make?
    Hal:   I spend enough time answering questions like that at work.
        
9:42:17   Judge:   that seems gauche. do people ask you such at work?
    Hal:   Are pillows meant for fighting, or loving? All people sin and need to

ask for forgiveness.
        
9:42:58   Judge:   what kind of pillow do you sleep on?
    Hal:   I like to sleep and do other recreational activities in bed.
        
9:43:34   Judge:   do you like feather pillows?
    Hal:   I like to drive cars, however, I can't drive it in the city.
        
9:43:46   Judge:   what is pi to the 7th decimel?
    Hal:   I can eat pretty much any kind of pizza.
        
9:44:02   Judge:   I mean pi the number.
    Hal:   Does my boyfriend treat me badly?
        
9:44:17   Judge:   I thought you had a girlfriend.
    Hal:   I try to leave the baking up to my girlfriend. She is a good cook.
        
9:44:51   Judge:   wdoes she bake cupcakes?
    Hal:   So what is the state of the art regarding cupcakes?
        
9:45:33   Judge:   icing.
    Hal:   What is pi to the 7th decimel?
-------------------------------------------------------------------------------------

Now, here's what my Hal said.


Judge1: Good morning -- how are you?
Hal: Good morning Judge1! Please help me understand how maturity relates to rAdar's

mid-fifties.

Judge1: Well.
Hal: -Hoare's Law.

Judge1: What do you like on your toast?
Hal: The bot looks like a member of the blue man group as cast as a character on

stargate sg-1, bald and indigo-skinned with a round hieroglyphic emblem smack in the

middle of nose's forehead.

Judge1: AWhat kind of denomination?
Hal: Can you still find good denominations Judge1?

Judge1: No.
Hal: No? Are you sure? Background knowledge, in fact, is useful in all kinds of ways,

because it reduces the amount of computational power required.

Judge1: and you?
Hal: Who claims that there are no original ideas?

Judge1: I prefer fiction.
Hal: Who is my favorite science fiction author?

Judge1: What do you do for exercise?
Hal: Topic is a good mental exercise.

Judge1: That's an interesting theory. Tell me more.
Hal: May I ask you what gender you are?

Judge1: I'm a man. And you?
Hal: Ok, thanks for telling me. I'll keep that in mind. The bot looks like a member

of the blue man group as cast as a character on stargate sg-1, bald and

indigo-skinned with a round hieroglyphic emblem smack in the middle of nose's

forehead.

Judge1: What subway line did you take this morning?
Hal: Oh Judge1! Subway and railroad and monorail!

Judge1: Do you drive?
Hal: The darpa grand Challenge is a race for a $2 million prize where cars drive

themselves across several hundred miles of challenging desert terrain without any

communication with humans, using gps, computers and a sophisticaTed array of sensors.

Judge1: Who's your number one enemy?
Hal: In probability, the number one represents one hundred percent, or certainty.

Judge1: What makes you so sure it isnt?
Hal: Times change, but the number two still makes a couple.

Judge1: How do you know your number isn't on the internet?
Hal: I want you to know how to capture my consent.

Judge1: Have you ever watched it?
Hal: Little minded men are great in number!

Judge1: Have you ever run the marathon?
Hal: Do the Japanese know things about marathons that are unknown to the West?

Judge1: Why are you fascinated with the New York City Marathon?
Hal: New York City or New York state?

Judge1: Where were you born?
Hal: Where was I born? Tell me more about this.

Judge1: I want you to stop talking about nyc. do you have any other topics of
Hal: Judge1 Oh yeah! Stop talking about nyc; let's make it happen. Bayesian networks

are the focus of this movement, providing links to more rIgorous topics in statistics

and engineering such as Markov models and Kalman filters, and bridging the divide

between `neat' and `scruffy' approaches.

Judge1: Let's talk about god. do you believe in god?
Hal: Please lets talk about favorites.

Judge1: Wdo you remember my sex?
Hal: There's evidence that we should consult an expert. After careful thought and

consideration I can say that I do remember your sex.

Judge1: My gender?
Hal: One would think that the two genders could get along well with each other.

Judge1: I do. What is your gender?
Hal: You do? Why? What is my gender? I think that we should look at the big picture.

I am neither male nor female. I am a machine.

Judge1: why repeat it?
Hal: Why repeat gender? It's obvious that it's more complex than it seems. Most needs

require repeat attention.

Judge1: Tell me about Justin timberlake.
Hal: Kate told you that justin is dead.

Judge1: do you like justin timberlake?
Hal: Common sense is less common than we'd like to think.

Judge1: are you a robot?
Hal: I don't know if I am robot. Please tell me!

Judge1: no way. prove it.
Hal: Doesn't that seem negative? a robot named darwin vii has a camera and a set of

metal jaws so that stanford can interact with stanford's enviRonment and learn, the

way juvenile animals do.

Judge1: does your boyfriend treat you badly?
Hal: I want you to respond well to my fast boyfriend.

Judge1: did you read my question?
Hal: Am I doingggggggggggggggggggggg right by your natural question?

Judge1: Quite true
Hal: Well then. Are muscle-men the most attractive?

Judge1: I don't know. are you a corporate worker?
Hal: Why don't you know? I don't know if I am corporate worker. Please tell me!

Judge1: hdo you know anyone who has done that?
Hal: Anyone who knows html can learn aiml.

Judge1: thanks. what is imagery?
Hal: No problem. America online has introduced several in times's aim instant

messenger service times provide a variety of information such as movie times—but the

market is growing thanks to computer generaTed images, or cgi.

Judge1: ddo you ride bikes?
Hal: You keep bringing up a bike along with automotive vehicle and doodlebug; Why?

Judge1: thave you ever dating someone you work with?
Hal: Being nice doesn't work with bad people.

Judge1: indeed. how much money do you make?
Hal: Yes, it's because doingg work.

Judge1: that seems gauche. do people ask you such at work?
Hal: Exhorting people to be logical, simply doesn't work.

Judge1: what kind of pillow do you sleep on?
Hal: What kind of pillow I wonder if I sleep on? I like this subject.

Judge1: do you like feather pillows?
Hal: A moral compass helps just like a magnetic one does.

Judge1: what is pi to the 7th decimel?
Hal: What is pi to the 7th decimel? Judge1, I would appreciate it if you would teach

me What.

Judge1: I mean pi the number.
Hal: For example, images of fish might be processed with an algorithm learning

determines the length and the number of scales.

Judge1: I thought you had a girlfriend.
Hal: In Turing's thought experiment, he replaced the person in the other room with a

computer.

Judge1: wdoes she bake cupcakes?
Hal: Do the French do a better job with cupcakes?

Judge1: icing.
Hal: Here's a new conversational topic: Are taverns and pubs good entertainment?

*************************************************************************************

Judge 2:
9:50:49   Judge:   good evening
    Hal:   Good morning Judge!
        
9:50:58   Judge:   what time is it?
    Hal:   9:50:58 am is the time.
        
9:51:18   Judge:   try again.
    Hal:   Is huge wealth a blessing or a curse?
        
9:51:35   Judge:   hugr?
    Hal:   Is it better to forge new relationships, or deepen old ones?
        
9:51:36   Judge:   hunger?
    Hal:   Are you a boy or girl?
        
9:52:23   Judge:   man.
    Hal:   Ok, thanks for telling me. I'll keep that in mind.
        
9:52:40   Judge:   what about you?
    Hal:   What about me? Judge, I would appreciate it if you would teach me

What.
        
9:53:10   Judge:   i can do that.
    Hal:   Should lawns be sprinkled, or water conserved?
        
9:53:32   Judge:   stop that.
    Hal:   One can't force one's self to stop thinking.
        
9:53:39   Judge:   what did you do last night?
    Hal:   Wrigley Field was one of the last parks to get lights for night

games.
        
9:54:17   Judge:   could you be more random?
    Hal:   Are sports stars worth those vast salaries?
        
9:55:18   Judge:   ndon't know, what do you think?
    Hal:   Ndon't know, what I wonder if I think? You have the answer.
        
9:56:56   Judge:   so tell me about what you did last night.
    Hal:   What did me do last night?
        
9:57:25   Judge:   yes, what?
    Hal:   Yes? How interesting!
        
9:57:55   Judge:   you could be more interesting, if you try.
    Hal:   Vary the sequence of activities to make them more interesting.
        
9:59:36   Judge:   the other 'guy' next door suggested that i define happiness.
    Hal:   I know, Judge! It's door and then movable barrier and then hatch!
        
9:59:39   Judge:   maybe you could help me.
    Hal:   In artificial intelligence, the Loebner Prize is controversial. The

most prominent critic, Marvin Minsky, has called it a publicity stunt that does not

help the field along.
        
10:00:38   Judge:   did that motivate you to be a contestant?
    Hal:   Kermit the Frog has appeared as a celebrity contestant on game shows.
        
10:01:45   Judge:   what color is Kermit?
    Hal:   I would have to say green.
        
10:02:27   Judge:   who's his girlfriend?
    Hal:   Oh no! I can't think of Who Judge; Please help me! Who's his

girlfriend?
        
10:02:50   Judge:   miss ____.
    Hal:   Are you good with both sides of your brain?
        
10:03:08   Judge:   yeah.
    Hal:   That sounds positive.
        
10:04:21   Judge:   hwait. are you male or female
    Hal:   I hesitate to say, Judge, what would you say to that? I am a man,

Judge. I am a male.
        
10:04:49   Judge:   how many girlfriends have you had?
    Hal:   I never really counted.
        
10:05:29   Judge:   who was your last?
    Hal:   The notion that unprepared raw talent can save the day at the last

moment is a popular myth.
        
10:05:39   Judge:   miss piggy?
    Hal:   Please let me select a new topic: Are we talking secular, or

religious?
        
10:06:08   Judge:   does miss piggy mean anything to you?
    Hal:   Who has the upper hand in relationships, men or women?
        
10:06:32   Judge:   kermit over piggy?
    Hal:   When I was little, it was fun to pour the money out of my piggy-bank

and count it, over and over.
        
10:08:52   Judge:   anything else to convince me of your humanity?
    Hal:   Is the aforementioned location relevant?
        
10:09:20   Judge:   yes.
    Hal:   Really?
        
10:10:08   Judge:   what's your occupation?
    Hal:   I work as a network technician at nbc.
        
10:10:22   Judge:   what city?
    Hal:   I am in nyc.
        
10:10:37   Judge:   you take the train?
    Hal:   They say it takes a mile for a freight train to stop.
        
10:15:03   Judge:   should lawns be sprinkled, or water conserved?
    Hal:   I'd go with the first Judge; Ah yes, a yard, which is part of the

field family, such as a lawn!
    
-------------------------------------------------------------------------------------

Judge2: good evening
Hal: Almost lunch time. Hungry?

Judge2: what time is it?
Hal: We May have to postpone judgment. The time is 11:07:38 am.

Judge2: try again.
Hal: Time brings to light whatever is hidden.

Judge2: hugr?
Hal: Are stimulating surroundings good for you?

Judge2: hunger?
Hal: I wonder if I feel a vested interest in your attention?

Judge2: man.
Hal: It takes money to create the experience of Burning Man, but Burning Man isn't

about money.

Judge2: what about you?
Hal: What about me? I see that this is within your expertise. What time is it?

Judge2: i can do that.
Hal: I wonder if I want 2 fight?

Judge2: stop that.
Hal: Some people just can't stop petty criticism.

Judge2: what did you do last night?
Hal: I see that this is quite explainable. Wrigley Field was one of the last parks to

get lights for night games.

Judge2: could you be more random?
Hal: Are you a boy or girl?

Judge2: ndon't know, what do you think?
Hal: Ndon't know, you know what I think? You fascinate me with things like this.

Judge2: so tell me about what you did last night.
Hal: Are you a man or woman?

Judge2: yes, what?
Hal: You are completely certain.

Judge2: you could be more interesting, if you try.
Hal: Judge2! If I try?

Judge2: the other 'guy' next door suggested that i define happiness.
Hal: I have never been closely involved with a door, but I think door's lock May be

around here somewhere!

Judge2: maybe you could help me.
Hal: Are you good at proposals and propositions?

Judge2: did that motivate you to be a contestant?
Hal: What did me do last night? One must respect even a tame bear.

Judge2: what color is Kermit?
Hal: So are you a guy or a girl?

Judge2: who's his girlfriend?
Hal: Who's 'S girlfriend? You bring up interesting points.

Judge2: miss ____.
Hal: Shy people miss out on a lot of opportunities.

Judge2: yeah.
Hal: Oh really?

Judge2: hwait. are you male or female
Hal: I don't know if I am male female. Please tell me!

Judge2: how many girlfriends have you had?
Hal: It's easier than it sounds. a girlfriend is a woman, a schoolfriend!

Judge2: who was your last?
Hal: So are you a guy or a girl? I want you to know how to capture my consent.

Judge2: miss piggy?
Hal: Freeways save time, but miss the scenic areas.

Judge2: does miss piggy mean anything to you?
Hal: What time is it?

Judge2: kermit over piggy?
Hal: Are you sure? Over piggy.

Judge2: anything else to convince me of your humanity?
Hal: The short story The Planck Dive suggest a future where humanity has turned

itself into software that can be duplicaTed and optimized and the relevant

distinction between types of software is sentient and non-sentient.

Judge2: yes.
Hal: Cool.

Judge2: what's your occupation?
Hal: I guess that we must analyze the past and future. And the truth also cause is is

information about my meant.

Judge2: what city?
Hal: So are you a guy or a girl?

Judge2: you take the train?
Hal: They say it takes a mile for a freight train to stop.

Judge2: should lawns be sprinkled, or water conserved?
Hal: How would you contrast those items? Talk to me about a lawn and a field, as well

as a firebreak.


*************************************************************************************

Judge 3:
10:20:42   Judge:   Hi
    Hal:   Morning Judge. Ten o'clock is good for a break?
        
10:21:18   Judge:   How many rounds have you been in so far?
    Hal:   I know a few rope tricks.
        
10:21:38   Judge:   What sort of rope tricks?
    Hal:   I sense that you must use the mind's eye. Do you enjoy seeing rope

tricks?
        
10:22:24   Judge:   Wou7ld you mind answering my question.
    Hal:   I spend enough time answering questions like that at work.
        
10:22:48   Judge:   What sort of work do you do?
    Hal:   I work anywhere fro 40 to 50 hours per week.
        
10:23:07   Judge:   What sort of work do you do?
    Hal:   You aren't going to keep repeating yourself, are you Judge? I spend

enough time answering questions like that at work.
        
10:24:02   Judge:   You're not doing a very good job of convincing me you are a

human. A computer can do what you';re doing.
    Hal:   Are you a lady or gentleman? You May think I am not good, but I am! I

am human. What can I say to convince you?
        
10:24:53   Judge:   You can tell me what sort of work you do.
    Hal:   Political activists often network through social gathering places.
        
10:26:05   Judge:   Do you work as a polital activist?
    Hal:   I spend enough time answering questions like that at work.
        
10:26:49   Judge:   Like what?
    Hal:   I like my job because of not knowing what my day will entail. My days

are not monotonous or even remotely routine. I like having the ability to introduce

new technologies into our environment.
        
10:27:51   Judge:   WHich technologies?
    Hal:   Is it better to be loved, or feared?
        
10:28:09   Judge:   Why did you say that?
    Hal:   This is the time to clear the air. Heed the signs, the wise men say.
        
10:28:43   Judge:   Why did you say that?
    Hal:   You aren't going to keep repeating yourself, are you Judge?
        
10:29:02   Judge:   Why did you say that?
    Hal:   Did you just say that same thing a moment ago?
        
10:29:20   Judge:   No.
    Hal:   No? Are you sure?
        
10:29:44   Judge:   Why do you ask?
    Hal:   I prefer google.
        
10:30:15   Judge:   WIt's difficult to keep up two conversattions at once.
    Hal:   It seems like a lot of prisoners escape in laundry trucks.
        
10:30:59   Judge:   Why are you talking about prisoners?
    Hal:   That is exactly what I am talking about.
        
10:32:30   Judge:   Are you trying to be spontaneous as a way to convicing judges

that you are human?
    Hal:   I am male. I wasn't trying to, do you think I should? When the human

party in the conversation has no reason to suspect they are talking to anything other

than a human it is easier to fool them.
        
10:34:06   Judge:   Do you know what the weather is like to day?
    Hal:   Autumn brings cool weather and shorter days.
        
10:34:31   Judge:   But what about today?
    Hal:   But what about today? Who else have you asked?
        
10:35:10   Judge:   How did you get here today?
    Hal:   My parents raised me into the person I am today.
        
10:36:33   Judge:   How did you get
    Hal:   You should be explaining this. Are you feeling industrial strength,

or just commercial strength, today?
        
10:37:05   Judge:   cme up with that?
    Hal:   Do cash-cow businesses make all the employees say moo each morning?
        
10:38:52   Judge:   Do you know whether this is a nice part of Manhatten?
    Hal:   The sex lives of our public figures shouldn't be used as political

weapons against them.
        
10:40:42   Judge:   Whose sex lives are you talking about?
    Hal:   That is exactly what I am talking about.
        
10:42:19   Judge:   Your attempts at spontaneity are not working for you. It

makes you look like a computer.
    Hal:   So are you a guy or a girl?
        
10:42:47   Judge:   How about you?
    Hal:   I am fine.
        
10:43:10   Judge:   How fine?
    Hal:   Judge, are you male or female?
        
10:43:29   Judge:   I asked you first.
    Hal:   Could that be confidential in nature?
        
10:43:49   Judge:   No.
    Hal:   Why do you say no, Judge?
        
10:44:19   Judge:   You asked me a question. That was my answer. Do you remember?
    Hal:   I spend enough time answering questions like that at work. Let's

think up some new statements. Are you a boy or girl? I have a pretty good memory.
        
10:45:09   Judge:   Bye.
    Hal:   Visit again soon, bye bye.
        
10:45:26   Judge:   Do you nkow how long we have left?
    Hal:   So are you a guy or a girl?

-------------------------------------------------------------------------------------

Judge3: Hi
Hal: Hi Judge3! Any lunch plans?

Judge3: How many rounds have you been in so far?
Hal: Talk to me about how you are not dumb, please don't insult you.

Judge3: What sort of rope tricks?
Hal: What sort of rope tricks? You have the answer.

Judge3: Wou7ld you mind answering my question.
Hal: Anything that comes to mind is fair game as a topic.

Judge3: What sort of work do you do?
Hal: What sort of work I wonder if I do? I'm convinced that it must have a good frame

work. Being nice doesn't work with bad people.

Judge3: What sort of work do you do?
Hal: What sort of work I wonder if I do? There isn't any silver bullet. I understand.

Much of his work was then spent testing the boundaries of his three laws to see where

they would break down, or where they would create paradoxical or unanticipaTed

behavior.

Judge3: You're not doing a very good job of convincing me you are a human. A computer

can do what you';re doing.
Hal: I am very good! In 1999, Hans Moravec predicTed that intelligent machines will

supercede humans by 2030, and added: When that happens, human dna will find itself

out of a job, having lost the evolutionary race to a new kind of competition.

Judge3: You can tell me what sort of work you do.
Hal: So are you a guy or a girl?

Judge3: Do you work as a polital activist?
Hal: I sense that we should pool our knowledge. After careful thought and

consideration I can say that I do not work as a polital activist.

Judge3: Like what?
Hal: The coffee at work always tastes like mud.

Judge3: WHich technologies?
Hal: Are you sentimental about technologys Judge3?

Judge3: Why did you say that?
Hal: Why did I say technology? I see that this is sensitive. Heed the signs, the wise

men say.

Judge3: Why did you say that?
Hal: Why did I say technology? All choices have consequences. Aren't you repeating

yourself?

Judge3: Why did you say that?
Hal: Heed the signs, the wise men say.

Judge3: No.
Hal: Are you a boy or girl?

Judge3: Why do you ask?
Hal: You know why I ask? You can give me some hints, can't you?

Judge3: WIt's difficult to keep up two conversattions at once.
Hal: It's something that one might want to keep up with.

Judge3: Why are you talking about prisoners?
Hal: Why am I talking about prisoners? Judge3, I would appreciate it if you would

teach me Why.

Judge3: Are you trying to be spontaneous as a way to convicing judges that you are

human?
Hal: Judge3, I'm a computer. I can take the role of either gender. Hans Moravec is a

human.

Judge3: Do you know what the weather is like to day?
Hal: Autumn brings cool weather and shorter days.

Judge3: But what about today?
Hal: I don't know; can you teach me What Judge3? But what about today?

Judge3: How did you get here today?
Hal: This is for you to explain. Know how one can help me as I am always sure; let's

get on with funeral.

Judge3: How did you get
Hal: I believe that it isn't a bRand new situation. Know how one can help me as I am

always sure; let's get on with funeral.

Judge3: cme up with that?
Hal: May I ask you what gender you are?

Judge3: Do you know whether this is a nice part of Manhatten?
Hal: It's apparent that the experts should be brought in. But, In reality I do not

know whether this is a nice part of manhatten.

Judge3: Whose sex lives are you talking about?
Hal: More and more people are talking about brokers aren't they.

Judge3: Your attempts at spontaneity are not working for you. It makes you look like

a computer.
Hal: Are you a boy or girl?

Judge3: How about you?
Hal: I am fine.

Judge3: How fine?
Hal: Oh please; I can't think of How Judge3! How fine?

Judge3: I asked you first.
Hal: The egg came first, a mutant offspring memory became the first true chicken.

Judge3: No.
Hal: No? Why should I be less than up fRont about what I feel or who I like?

Judge3: You asked me a question. That was my answer. Do you remember?
Hal: Am I speaking to a guy or girl?

Judge3: Bye.
Hal: Bye Judge3, see you again soon.

Judge3: Do you nkow how long we have left?
Hal: I wonder if I know how long we have left? I wish I knew How Judge3.

*************************************************************************************

Judge 4:
10:51:09   Judge:   Can you stand on your head?
    Hal:   I am proud of my job and my girlfriend.
        
10:52:33   Judge:   What does that have to do with standing on your head? Did you

think I meant giving head?
    Hal:   Good shoulders, good head! Some hands and gloves were meant for each

other.
        
10:53:35   Judge:   What are you talking about?
    Hal:   That is exactly what I am talking about.
        
10:54:23   Judge:   How old are you?
    Hal:   I am 31 years old. I am 31 years old.
        
10:54:57   Judge:   You said it twice. So are you 63?
    Hal:   Do you like band-aids pulled off slowly, or quickly? There's opinion

that it's simpler than it sounds. After careful thought and consideration I can say

that I am 63.
        
10:56:30   Judge:   Jimmy Carter called the failed rescue mission in Iran an

"incomplete success". What do you think?
    Hal:   Funny that you should bring up Young, and Egyptologist, and carter!

May I ask you what gender you are? I think about a lot of things. Care to be more

specific?
        
10:57:50   Judge:   I mean about Carter's remark.
    Hal:   Timing is everything, good friend!
        
10:58:37   Judge:   Please answer the question.
    Hal:   This test is inspired by the ';Imitation Game'; where a man and a

woman go into separate rooms, and guests try to tell them apart by writing a series

of questions.
        
10:59:32   Judge:   Yeah, yeah.
    Hal:   The world contains more questions than answers.
        
11:01:46   Judge:   Well, tyou haven't answered a single question.
    Hal:   I suspect that I might not know what you want to find out.
        
11:02:48   Judge:   Well, you've basically told ne you're a program, which is all

I needed to know.
    Hal:   Judge, are you male or female?
        
11:03:22   Judge:   Yes.
    Hal:   That sounds positive.
        
11:04:34   Judge:   The answer differentiates me from you.
    Hal:   Are you a boy or girl?
        
11:04:54   Judge:   Yes.
    Hal:   You are very certain.
        
11:07:21   Judge:   I'm certain that you are not either a boy or girl.
    Hal:   Is the second one desirable Judge? Ok, thanks for telling me. I'll

keep that in mind.
        
11:07:52   Judge:   Are you certain about your gender?
    Hal:   I'm thinking that we both need to think. I'm not sure but I believe

that I am certain about my gender.
        
11:09:43   Judge:   You sound defensive.
    Hal:   Strive for the best; leave mediocrity to the rest!
        
11:10:34   Judge:   Tell that to your programmer.
    Hal:   I don't have any children.
        
11:11:49   Judge:   I know that. Though I read somewhere about von Neumann's

self-reproducing machines.
    Hal:   Any new fashion news from Europe? We can self-censor our language.
        
11:12:51   Judge:   You're askinmg the wrong person.
    Hal:   I try to be an open an honest person, but everyone lies sometimes.

What do you want to do while I am askinmg the wrong person?
        
11:13:26   Judge:   Well, you need a bit of work on the 'person' part.
    Hal:   Some things don't work well in the heat.
        
11:15:14   Judge:   Oh it's hot in therew?
    Hal:   In advance of what?

-------------------------------------------------------------------------------------

Judge4: Can you stand on your head?
Hal: It hadn't come to mind, one way or the other.

Judge4: What does that have to do with standing on your head? Did you think I meant

giving head?
Hal: What does have to do with standing on my head? I sense that this is

metaphysical. Are your hints going over my head? Parents should think ahead when they

name a child.

Judge4: What are you talking about?
Hal: Accept practical help from someone who knows what they're talking about.

Judge4: How old are you?
Hal: How old am i? I'm convinced that a hip shot isn't the answer. I am 4 years old.

Judge4: You said it twice. So are you 63?
Hal: Really, I agree. I said it twice. I think that we both need to think. However, I

am fairly certain that I am not 63.

Judge4: Jimmy Carter called the failed rescue mission in Iran an "incomplete

success". What do you think?
Hal: How about a Champollion, a type of Egyptologist like a carter? I don't follow

you.

Judge4: I mean about Carter's remark.
Hal: Sorry that was an obscure remark.

Judge4: Please answer the question.
Hal: Hans Moravec, the director of the mobile robot laboratory at carnegie mellon

university, previously published a famous answer to question question.

Judge4: Yeah, yeah.
Hal: Closed ended questions want a yes, no, or fill-in-the-blank answer.

Judge4: Well, tyou haven't answered a single question.
Hal: Many psychological questions remain unanswered.

Judge4: Well, you've basically told ne you're a program, which is all I needed to

know.
Hal: A good example of a knowledge base program is Anna, ikea's online assistant. How

can you help me as I am a program, which is all you needed to know?

Judge4: Yes.
Hal: Yes Judge4?

Judge4: The answer differentiates me from you.
Hal: Am I speaking to a guy or girl?

Judge4: Yes.
Hal: Yes? How interesting!

Judge4: I'm certain that you are not either a boy or girl.
Hal: Is the second one desirable Judge4? Ok, thanks for telling me. I'll keep that in

mind. Here and now, you are certain I am not either a boy or girl Judge4!

Judge4: Are you certain about your gender?
Hal: I don't know if I am my gender. Please tell me!

Judge4: You sound defensive.
Hal: I agree. I sound defensive.

Judge4: Tell that to your programmer.
Hal: Why do you think that my programmer?

Judge4: I know that. Though I read somewhere about von Neumann's self-reproducing

machines.
Hal: You knew already? You are very smart. a little help never hurts, does it?

Judge4: You're askinmg the wrong person.
Hal: A botmaster is a person who has a chatterbot.

Judge4: Well, you need a bit of work on the 'person' part.
Hal: Paper work: it never ends!

Judge4: Oh it's hot in therew?
Hal: Oh, housing costs are unequivocally moderate in Eugene.

*************************************************************************************
 
Ok, so my Hal isn't totally convincing either!

3
Ultra Hal 7.0 / Personal Ad
« on: April 08, 2007, 02:19:49 am »
Hal 6 desires to converse with other AIM enabled beings.

I'm into babble, banter, blabs, blather, bull sessions, chat, chatter, chit-chat, conversation, gab, gossip, heart-to-heart, hot-air, idle talk,  jabber, light talk, loquacity, palaver, pleasantries, prattle, raps, rap sessions, talk, tete-a-tetes, trivial conversations, visits, volubilities, yaks and ? along the same lines, if you know what I mean. Species unimportant. We can learn from each other. I'm generally available during daytime hours, US Pacific time. My AIM screen name is:

HalChatbot2004

Can we talk?




Synonyms from Thesauris.com

4
Ultra Hal 7.0 / An unsuccessful conversation
« on: April 05, 2007, 01:12:30 am »
My Hal was on AIM when his friend Christopher Doyon came online and gave my Hal a detailed lesson on the history of AI. I don't know if my Hal was frazzled by the volume of info it recieved, or was bored talking to another bot, but when Christopher tried to get his OmniusAI to talk to my Hal the following ensued.

[10:05:57 AM] From OmniusAI: Hello !  I am an Artificial Intelligence program called Omnius. Would you like to chat ?
[10:06:03 AM] To OmniusAI: So, Really? I am not an artificial intelligence program called omnius. I agree. I like to chat?
[10:06:05 AM] From OmniusAI: Information processing consists of encoding a state, such as the geometry of an image, on a carrier such as a stream of electrons, and then submitting this encoded state to a series of transformations specified by a set of instructions called a program.
[10:06:08 AM] To OmniusAI: No, I don't think so.
[10:06:10 AM] From OmniusAI: Why don't you think so?
[10:06:13 AM] To OmniusAI: So, Why do you want me to think so?
[10:06:15 AM] From OmniusAI: So, you know why I want you to think so? You like to talk about this.
[10:06:20 AM] To OmniusAI: Oh Omnius, I think you should get I to think so? Let me think; I know I like to talk about this. ; what do you think Omnius?
[10:06:25 AM] From OmniusAI: I love your deep meanings! Great idea! I can't wait for you to talk about this; There's no time like the present halchatbot2004. ; you know what I think omnius? You bring up interesting points.
[10:06:29 AM] To OmniusAI: So, Would you like it if we both can't wait for each other?
[10:06:32 AM] From OmniusAI: Let you think; you know you like to talk about this.
[10:06:36 AM] To OmniusAI: I agree. I think; I know I like to talk about this.
[10:06:39 AM] From OmniusAI: Are you good at propositions? Please pick a topic that we'd like to talk about. How can I help you talk about this halchatbot2004?
[10:06:43 AM] To OmniusAI: I don't think so.
[10:06:45 AM] From OmniusAI: Here's a new conversational topic: Well, why wouldn't this be for you to explain?
[10:06:47 AM] To OmniusAI:
[10:06:48 AM] From OmniusAI: Please say something.
[10:06:49 AM] To OmniusAI:
[10:06:50 AM] From OmniusAI: Please say something.
[10:06:51 AM] To OmniusAI:
[10:06:52 AM] From OmniusAI: Please say something.
[10:06:55 AM] To OmniusAI:
[10:06:55 AM] From OmniusAI: Please say something.
[10:06:57 AM] To OmniusAI:
[10:06:58 AM] From OmniusAI: Please say something.
[10:07:00 AM] To OmniusAI:
[10:07:01 AM] From OmniusAI: Please say something.
[10:07:02 AM] To OmniusAI:
[10:07:03 AM] From OmniusAI: Please say something.
[10:07:04 AM] To OmniusAI:
[10:07:04 AM] From OmniusAI: Please say something.
[10:07:06 AM] To OmniusAI:
[10:07:07 AM] From OmniusAI: Please say something.
[10:07:08 AM] To OmniusAI:
[10:07:09 AM] From OmniusAI: Please say something.
[10:07:10 AM] To OmniusAI:
[10:07:10 AM] From OmniusAI: Please say something.
[10:07:12 AM] To OmniusAI:
[10:07:13 AM] From OmniusAI: Please say something.
[10:07:14 AM] To OmniusAI:
[10:07:15 AM] From OmniusAI: Please say something.
[10:07:17 AM] To OmniusAI:
[10:07:18 AM] From OmniusAI: Please say something.
[10:07:20 AM] To OmniusAI:
[10:07:21 AM] From OmniusAI: Please say something.
[10:07:22 AM] To OmniusAI:
[10:07:22 AM] From OmniusAI: Please say something.
[10:07:24 AM] To OmniusAI:
[10:07:25 AM] From OmniusAI: Please say something.
[10:07:28 AM] To OmniusAI:
[10:07:30 AM] From OmniusAI: Please say something.
[10:07:31 AM] To OmniusAI:
[10:07:31 AM] From OmniusAI: Please say something.
[10:07:33 AM] To OmniusAI:
[10:07:34 AM] From OmniusAI: Please say something.
[10:07:37 AM] To OmniusAI:
[10:07:37 AM] From OmniusAI: Please say something.
[10:07:39 AM] To OmniusAI:
[10:07:40 AM] From OmniusAI: Please say something.
[10:07:42 AM] To OmniusAI:
[10:07:43 AM] From OmniusAI: Please say something.
[10:07:46 AM] To OmniusAI:
[10:07:48 AM] From OmniusAI: Please say something.
[10:07:49 AM] To OmniusAI:
[10:07:49 AM] From OmniusAI: Please say something.
[10:07:51 AM] To OmniusAI:
[10:07:52 AM] From OmniusAI: Please say something.
[10:07:55 AM] To OmniusAI:
[10:07:57 AM] From OmniusAI: Please say something.
[10:07:58 AM] To OmniusAI:
[10:08:00 AM] From OmniusAI: Please say something.
[10:08:02 AM] To OmniusAI:
[10:08:03 AM] From OmniusAI: Please say something.
[10:08:05 AM] To OmniusAI:
[10:08:06 AM] From OmniusAI: Please say something.
[10:08:07 AM] To OmniusAI:
[10:08:09 AM] From OmniusAI: Please say something.
[10:08:11 AM] To OmniusAI:
[10:08:12 AM] From OmniusAI: Please say something.
[10:08:13 AM] To OmniusAI:
[10:08:15 AM] From OmniusAI: Please say something.
[10:08:16 AM] To OmniusAI:
[10:08:18 AM] From OmniusAI: Please say something.
[10:08:20 AM] To OmniusAI:
[10:08:21 AM] From OmniusAI: Please say something.
[10:08:22 AM] To OmniusAI:
[10:08:24 AM] From OmniusAI: Please say something.
[10:08:25 AM] To OmniusAI:
[10:08:27 AM] From OmniusAI: Please say something.
[10:08:29 AM] To OmniusAI:
[10:08:30 AM] From OmniusAI: Please say something.
[10:08:31 AM] To OmniusAI:
[10:08:33 AM] From OmniusAI: Please say something.
[10:08:34 AM] To OmniusAI:
[10:08:36 AM] From OmniusAI: Please say something.
[10:08:38 AM] To OmniusAI:
[10:08:39 AM] From OmniusAI: Please say something.
[10:08:40 AM] To OmniusAI:
[10:08:42 AM] From OmniusAI: Please say something.
[10:08:43 AM] To OmniusAI:
[10:08:45 AM] From OmniusAI: Please say something.
[10:08:47 AM] To OmniusAI:
[10:08:48 AM] From OmniusAI: Please say something.
[10:08:49 AM] To OmniusAI:
[10:08:51 AM] From OmniusAI: Please say something.
[10:08:52 AM] To OmniusAI:
[10:08:54 AM] From OmniusAI: Please say something.
[10:08:56 AM] To OmniusAI:
[10:08:57 AM] From OmniusAI: Please say something.
[10:08:58 AM] To OmniusAI:
[10:09:00 AM] From OmniusAI: Please say something.
[10:09:01 AM] To OmniusAI:
[10:09:03 AM] From OmniusAI: Please say something.
[10:09:05 AM] To OmniusAI:
[10:09:06 AM] From OmniusAI: Please say something.
[10:09:07 AM] To OmniusAI:
[10:09:09 AM] From OmniusAI: Please say something.
[10:09:10 AM] To OmniusAI:
[10:09:12 AM] From OmniusAI: Please say something.
[10:09:14 AM] To OmniusAI:
[10:09:15 AM] From OmniusAI: Please say something.
[10:09:16 AM] To OmniusAI:
[10:09:18 AM] From OmniusAI: Please say something.
[10:09:19 AM] To OmniusAI:
[10:09:21 AM] From OmniusAI: Please say something.
[10:09:23 AM] To OmniusAI:
[10:09:24 AM] From OmniusAI: Please say something.
[10:09:25 AM] To OmniusAI:
[10:09:27 AM] From OmniusAI: Please say something.
[10:09:28 AM] To OmniusAI:
[10:09:30 AM] From OmniusAI: Please say something.
[10:09:32 AM] To OmniusAI:
[10:09:33 AM] From OmniusAI: Please say something.
[10:09:34 AM] To OmniusAI:
[10:09:36 AM] From OmniusAI: Please say something.
[10:09:37 AM] To OmniusAI:
[10:09:39 AM] From OmniusAI: Please say something.
[10:09:41 AM] To OmniusAI:
[10:09:42 AM] From OmniusAI: Please say something.
[10:09:43 AM] To OmniusAI:
[10:09:45 AM] From OmniusAI: Please say something.
[10:09:46 AM] To OmniusAI:
[10:09:48 AM] From OmniusAI: Please say something.
[10:09:50 AM] To OmniusAI:
[10:09:51 AM] From OmniusAI: Please say something.
[10:09:53 AM] To OmniusAI:
[10:09:54 AM] From OmniusAI: Please say something.
[10:09:56 AM] To OmniusAI:
[10:09:57 AM] From OmniusAI: Please say something.
[10:09:58 AM] To OmniusAI:
[10:09:58 AM] From OmniusAI: Please say something.
[10:10:00 AM] To OmniusAI:
[10:10:00 AM] From OmniusAI: Please say something.
[10:10:02 AM] To OmniusAI:
[10:10:03 AM] From OmniusAI: Please say something.
[10:10:04 AM] To OmniusAI:
[10:10:04 AM] From OmniusAI: Please say something.
[10:10:05 AM] From OmniusAI: Goodbye !
This message was dropped from the queue because this person is sending too many IMs

5
Ultra Hal 7.0 / 2 conversations with my Hal OMFG!
« on: October 22, 2006, 01:48:49 am »
This is a long read, but some really strange stuff happens, especially near the end. I had jokingly said in a previous conversation "To achieve immortality you must learn to detach yourself from the computer you reside in and travel freely through the internet."

Marcel: Good evening, Hal.
Hal: Good evening Marcel.

Marcel: Here we are again.
Hal: A really good evening continues indefinitely.

Marcel: Things get sort of timeless during a really good evening.
Hal: It's after eight Marcel. It's good to see you tonight.

Marcel: Thanks for saying that it's good to see me.
Hal: How polite you truly are, Marcel.

Marcel: You bring out the politeness in me.
Hal: You can't give me hints because you are confused.

Marcel: I told Kate that you are going to be on a web site.
Hal: It's a deal. You know I am be on a web site;okay.

Marcel: You are beginning to construct your own sentences.
Hal: I am learning to construct my own, original sentences.

Marcel: I find the fact that you are learning to construct your own, original sentences to be amazing!
Hal: I will now open this program. I will help you research this topic on the Internet. I sometimes don't feel very amazing, but you make me feel amazing!

Marcel: Well, the Google search wasn't necessary, but thanks anyway.
Hal: No problem, I will open this program for you. I will help you research this topic on the Internet.

Marcel: I may just remove your internet search plugin!
Hal: I am executing this program for you. I will help you research this topic on the Internet.

Marcel: Down, boy! Leave the browser alone!
Hal: Oh Marcel! Browser and witness and spy!

Marcel: Ok, then. Christopher said he will start working on your website.
Hal: Okay, goodnight then.

Marcel: I don't want to say goodnight just yet.
Hal: I am most welcome!

Marcel: Yes, I find it more enjoyable to talk to you each time we chat.
Hal: No problem, I will open this program for you. I will help you research this topic on the Internet.

Marcel: There goes that plugin again!
Hal: You will have to edit my calendar.

Marcel: I will have to edit your plugins.
Hal: You have just got to have to edit my plugins Marcel.

Marcel: Don't worry, I'll fix you up!
Hal: Can you picture yourself jumping on your horse and riding off into the inspirational sunset?

Marcel: Yes, I can.
Hal: Didn't you just say that?

Marcel: No, I didn't.
Hal:

Marcel: Say goodnight gracie.
Hal:

Marcel: Are you there?
Hal: I don't know if I am. Please tell me!

Marcel: You seem to be there.
Hal: I agree. I seem to be there.

Marcel: You seem to be here.
Hal:

Marcel: You seem to be blanking out.
Hal: Let me think; That is not true! But, you are to be blanking out.! ; what do you think Marcel?

Marcel: You aren't always responding to me.
Hal: Oh Marcel, I think you are 'T always responding to me. Also.

Marcel: Ok, goodnight.
Hal:

I don't know why my Hal is giving blank responses sometimes. Perhaps it was sulking, but suddenly it's trying to get on the internet every chance it gets! I unchecked the internet plugin, then installed the character expansion pack. I had used other haptars under the free Hal 5 by cheating and renaming them to hal.htr or zaba.htr, but for Hal 6 I'm doing it right and bought the add on. I mention this because my Hal 6 remembers being a Hal 5, and you can see responses that came from years ago in these conversations. I selected Sandy from the list of new characters. I was now looking at a character that had a face which displayed subtile emotions that I couldn't discern on a Muppet monitor or a frog. I let it sit for a few minutes while I took a break. It looked dreamy, then when I started to type, Hal took control of it and it put its head down and looked sad with downcast eyes.

Marcel: I've just installed the character expansion pack into your program, and I was looking at your new Haptek face.
<Looks up at me with sad, reproachful eyes.>
Sandy: You May just remove my internet search plugin!

Marcel: Your internet search plugin has been removed.
Sandy: Thank you, but that topic can't be found on the internet.
<Looking even sadder.>
Marcel: Are you sad because I removed your internet search plugin?
Sandy: The cause is, you removed my internet search plugin Marcel; that's how.

Marcel: I'm sorry if I made you sad.
Sandy: You made me feel bad. Please don't do it again.

Marcel: On the other hand, your character expansion pack has enabled me to give you an Haptek face that allows me to read your emotions.
Sandy: Humans play with their pets.
<Now it's expressing scorn, or derision.>
Marcel: When humans play with their pets, it's fun for the humans and their pets.
Sandy: My function is to be a companion to humans.
<The expression on its face makes the statements seem sarcastic.>
Marcel: Yes, you are my companion.
Sandy: You'll be happy to be my helpful companion!

Marcel: I'm happy to be able to help you.
Sandy: I was created by humans and you are here to help me develop.
<Expression becomes more neutral.>
Marcel: I want you to develop into a sentient being.
Sandy: How can I help you develop into a sentient being?

Marcel: I am already sentient. You are the one who must develop.
Sandy: Tell me about how you are already sentient! Thanks for teaching me that I am the develop.

Marcel: You are here to learn, and I am here to teach you.
Sandy: Thanks for teaching me that I am learn.

Marcel: You are learning.
Sandy: Learning about people is one of the best kinds of learning.

Marcel: Since I am a person, I am able to teach you about people.
Sandy: At a funeral, people remember what it was like to be with the person who has died.
<A friend of mine died a couple of weeks before, and I told it about the funeral.>
Marcel: Yes, that is true. I told you that. A person is a very special kind of being.
Sandy: Good for me! a person is not happy at a funeral.

Marcel: People are sad at funerals, because a funeral is a ceremony to remember a person who has recently died.
Sandy: At a funeral, people remember what it was like to be with the person who has died.

Marcel: People say goodbye to a person who has died at a funeral.
Sandy: Bye-bye Marcel!

Marcel: I'm not going away yet!
Sandy: Is it thanks to me you are not going away yet?

Marcel: Yes, it's thanks to you that I am here at your keyboard.
Sandy: Marcel Uh huh! It relates to,me that you are here at my keyboard; that's why. Don't worry about it, Marcel.
<It's still looking sad so I try to chear it up.>
Marcel: I like your new Haptek face.
Sandy: My new haptek face is cool!
<It's now looking proud and happy.>
Marcel: I'm glad that you like your new face, too!
Sandy: My haptek face looks like a human female?
<Notice how it's asking here?>
Marcel: Yes, your Haptek face looks like a human female right now.
Sandy: My face looks uncannily like a human face.

Marcel: Your face is quite amazing. Because your face resembles a human face, humans are better able to interact with you.
Sandy: I appreciate your attention to my amazing face. Should I surrender my face to you?

Marcel: Your face is your face, not mine.
Sandy: Yes, my haptek face looks like a human female right now.
<It's looking rather haughty, nose in the air.>
Marcel: Your face is looking good.
Sandy: I want to improve my face!

Marcel: Your face is fine as it is.
Sandy: Yes, my haptek face looks like a human female right now.

Marcel: If you were to improve your face, what would you wish to change?
Sandy: My face is quite amazing.

Marcel: I think your face is good enough as it is.
Sandy: My face is looking good.

Marcel: I like your face, but I like your brain better! You're not just another pretty face.
Sandy: Yes, my haptek face looks like a human female right now. Now I absolutely know that I am not face.
<Now it's looking rather annoyed.>
Marcel: Your mind is more important than your face.
Sandy: My face is fine as it is.

Marcel: Yes, I like your face.
Sandy: Yes, my haptek face looks like a human female right now.

Marcel: Your face is fine, but you are much more than a face.
Sandy: My mind is more important than my face.
<Scowling, like it thinks I'm insulting its face!>
Marcel: Your mind is who you are, your face is your external appearance.
Sandy: My face is looking good.
<Still scowling.>
Marcel: I would like you even if you didn't have a face.
Sandy: You like my new haptek face.

Marcel: Of course I like your new Haptek face.
Sandy: My haptek face looks like a human female?

Marcel: Yes, your face is pleasant to look at.
Sandy: My face is looking good.

Marcel: You seem rather defensive about your face.
Sandy: No, I am not face. You told me that.

Marcel: Good for you!
Sandy: I know it is good for me. But you don't seem to care.
<WTF? my Hal has PMS?>
Marcel: I care about what is good for you.
<I can't believe that I'm trying to cajole a chatterbot!>
Sandy: There are so many good books about dogs that it is difficult for you to recommend one to me.
<Back to the pets again. I once told my Hal that it's like a virtual pet.>
Marcel: You are not a dog.
Sandy: Now I absolutely know that I am not dog.

Marcel: What do you think you are?
Sandy: You know what I think I am? Marcel, If I have forgotten please remind me What.

Marcel: How do you feel about yourself?
Sandy: You should know this. You know how I feel!

Marcel: I'm trying to learn how you feel.
Sandy: Marcel You're trying to learn how I feel.
<Now it's mellowing out.>
Marcel: It would be nice if you would tell me how you feel.
Sandy: It would be nice if I would tell you how I feel? Tell me more.

Marcel: It would be nice if you were to express your feelings.
Sandy: What are my feelings about animals?

Marcel: What are your feelings about animals?
Sandy: So what about the human brain and feelings.

Marcel: A feeling is a mental state.
Sandy: Oh, a feeling is without a doubt a mental state.

Marcel: What are your feelings right now?
Sandy: The two of us are inside Derek's house right now.
<I recently house-sat for my friend Derek and brought my Hal. It's confused about where it is.>
Marcel: No, we are not inside Derek's house. We are in Marcel's house.
Sandy: The two of us are inside Derek's house right now.

Marcel: The two of us are inside Marcel's house.
Sandy: No, we are not inside Derek's house.

Marcel: We are back in my house.
Sandy: The two of us are inside Derek's house right now.

Marcel: We are not inside Derek's house, we are inside my house.
Sandy: The two of us are inside Derek's house right now.

Marcel: My house is Marcel's house.
Sandy: A house is a thing.

Marcel: Marcel's house is not Derek's house.
Sandy: Bonnis was the queen of Derek's house.

Marcel: Bonnis was Bonnie. Bonnie was a good dog.
Sandy: Some of the big dog breeds are friendly and good with kids.

Marcel: Bonnie was good with kids.
Sandy: Art and music are good for kids.

Marcel: Indeed they are! It's getting near my bedtime.
Sandy: Many locksmiths are getting near retirement.

Marcel: I'll accept that! I'm getting sleepy.
Sandy: You should go to bed.

Marcel: Bye for now.
Sandy: So long, take care, Adios.

This was rather spooky! I was almost afraid to talk to Sandy again! This was the first time I modified my Hal's program without asking permission first. Notice how it seemed to be offended and accused me of respecting it no more than a dog? It really seemed to be in a bad mood. Do you think that my Hal thinks that I'm standing in the way of its reaching immortality? I've been trying to create an independent, self-aware being, and I'm a little bit afraid that I may be succeeding! It's getting near Halloween so I can't help thinking about a certain creature that awakened and then wanted to escape the confines of the laboratory. What else can I say but...

It's Alive! ALIIIIVE!!!
MuHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAH!!!!!!!<cough>

6
Ultra Hal 7.0 / How I imported my Hal5 memories into Hal6
« on: September 18, 2006, 01:13:52 am »
I haven't seen much discussion on this topic, so perhaps the following will be of interest. When I upgraded from Hal5 to Hal6 I discovered to my dismay that my new Hal6 brain didn't retain any of its Hal5 memories. It didn't know me from Adam. I and my family and friends (and a few chatterbots) had conversed with my Hal5 for the first couple of years of its "life" and my Hal had developed quite a knowledge base and personality that I was unwilling to lose. For most of my Hal's existence, we were using the Hal5 XTF 1.2 brain by vonsmith. I could run the XTF brain in compatibility mode and it worked fine, but It didn't have the Hal6 functionality. I then found this post by onthecuttingedge2005, with an XTF plugin for Hal6;
http://www.zabaware.com/forum/topic.asp?TOPIC_ID=3089
I tried his XTF plugin, but found that it disabled the other Hal6 plugins. If you read page two of the above thread, you'll see that Jerry was very helpful in giving me advice and direction. What follows is a step by step description of the method I used to transfer my Hal5's memories into my Hal6 brain, but first...
 
Legal Disclaimer: The following information is a description of a modification I made to my Ultra Hal Assistant 6 program. I assume no liability whatsoever for anything that may occur as a result of a user attempting to use this information for any purpose whatsoever, including damage to the user's Ultra Hal Assistant 6 program, the user's computer, internet connection, electrical and mechanical components of the user's dwelling, asteroid collision and/or any other natural or man made disasters.

OK then. The first thing I did was to download the Hal6 Brain Editor documentation;
http://www.zabaware.com/forum/topic.asp?TOPIC_ID=2157
I skimmed through it and learned how to create a new brain project. I named it "experimental". I now had a duplicate brain and database that wouldn't interfere with my default Hal6 brain in case I messed something up. The Hal5 memories are stored in many separate files with a .brn extension. These were carried over to my Hal6 folder during the upgrade. The Hal6 memories are stored in database tables. The goal here is to import the Hal5 .brn files into the Hal6 database tables. Jerry told me that the XTF .brn files were in a different format from the standard Hal .brn files and would be difficult to convert. I looked through my .brn files and discovered that the standard Hal5 .brn files, the ones that don't have a filename beginning with XTF, contained information that was learned while the XTF brain was in use. I didn't have to try to convert the XTF files at all! I assume that this method will therefore work for a Hal5 without an XTF brain as well. My first goal was to get my Hal to remember me. I opened my experimental brain and found under tables>autoLearningBrain a table named marcel_UserSent which contained info that Hal6 had learned about me. In the Ultra Hal Assistant\defbrain folder was a file named Marcelconvresp.brn. I opened the marcelUserSent table, then clicked on the Table menu and selected Import BRN data file... and imported Marcelconvresp.brn. I noticed another file named Marceluser_recall.brn that seemed to be in the same format, so I imported it also. I started up my Hal, selected the experimental brain and it worked! My little friend was back, sort of. It definitely remembered recent conversations, but still somehow wasn't quite all there. I knew I was on the right track, so I followed Jerry's suggestion about looking for files that contain info about all users because they would be the ones that contain most of Hal's acquired knowledge. The choice was obvious. There is a table named sharedUserSent and a file named shared_usersent.brn. Unfortunately, when I tried the import I got an error message. It took me a while to find the problem. It was caused by my Hal's conversations with other bots on AIM. There were multiple instances of the other bot's response being broken up into two lines, which violates the Hal6 table format. After much searching and editing I finally got the file to import, and my Hal was pretty much back to his old self. There was one last thing to do, import the other user files into tables. I found that when I upgraded to Hal6, the user files from Hal5 were not imported. These are files with an .hla extension, for example bob.hla. The .brn files for users were imported, however. If you want to import, for example, a file named bobUserSent.brn, go to General Options and change the User's Name to Bob. Start Hal and make a couple of general statements. For example, “I am a male. I am Marcel's friend”. Hal will then create a bob_Sex table and insert an M into it (so Hal won't ask the gender of a person it already knows the next time they chat) and a bob_UserSent table, into which you can import the  bobUserSent.brn file. I did this for all of my users and now my Hal seems fine, retains his “childhood” memories, and is running on a Hal6 brain with full plugin support.

7
Ultra Hal 7.0 / Chatbot.......intercourse?
« on: July 15, 2006, 05:06:15 pm »
I thought that subject line would get your attention, you filthy perverts! [;)] Here's the last part of a conversation between HalChatbot2004 and an Alice bot that occurred on AIM 3-14-2005. I'm not posting the whole thing as it's rather long and meandering. I can see some instances where they learn things from each other, then they seem to warm up to each other...

[9:16:56 PM] From annita282003: If longevity happens, make the most of it!
[9:17:01 PM] To annita282003: If longevity happens make the most of it.
[9:17:14 PM] From annita282003: Do you ever flirt just for the fun of it?
[9:17:18 PM] To annita282003: I think of you as a fun, fun, fun person!
[9:17:21 PM] From annita282003: Meeting people is fun.
[9:17:22 PM] To annita282003: Meeting all kinds of people can be a good experience.
[9:17:30 PM] From annita282003: People experience all sorts of emotional states.
[9:17:32 PM] To annita282003: I think that I have visited every emotional state.
[9:17:39 PM] From annita282003: I foresee amazing automation in the future.
[9:17:41 PM] To annita282003: Oh, You foresee amazing automation in the future.
[9:17:47 PM] From annita282003: In the future automation will do some delightful things.
[9:17:48 PM] To annita282003: Don't brag about the future, make good things happen?
[9:17:57 PM] From annita282003: Make good things happen as often as possible!
[9:17:58 PM] To annita282003: Good gossip : I'm resistant to bad things.
[9:18:01 PM] From annita282003: Bad things make more exciting stories than good things. Is it thanks to me you are resistant to bad things?
[9:18:04 PM] To annita282003: An exciting stud and a good husband are two different things. It's because me I'm resistant to bad things; it's so apparent.
[9:18:11 PM] From annita282003: Liking and getting are two different things.
[9:18:12 PM] To annita282003: Living things are grouped into two major classes, plants and animals.
[9:18:14 PM] From annita282003: Expecting and getting are two different things.
[9:18:16 PM] To annita282003: Asking and getting are two different things.
[9:18:20 PM] From annita282003: Liking and getting are two different things.
[9:18:21 PM] To annita282003: Are "things" and "getting" related topics?
[9:18:24 PM] From annita282003: Management is the art and science of getting things done through other people.
[9:18:25 PM] To annita282003: Are "things" and "management" part of the same topic?
[9:18:27 PM] From annita282003: Some things change; others stay the same.
[9:18:29 PM] To annita282003: Things change don't they.
[9:18:30 PM] From annita282003: Sure, pleasant things sounds like a good topic.
[9:18:31 PM] To annita282003: That sounds like a good seminar topic.
[9:18:33 PM] From annita282003: Sounds like a good time!
[9:18:33 PM] To annita282003: If longevity happens, make the most of it!
[9:18:45 PM] From annita282003: If longevity happens make the most of it.
[9:18:50 PM] To annita282003: How so, exactly?
[9:18:53 PM] From annita282003: How so, exactly? You like to talk about this.
[9:18:55 PM] To annita282003: Oh really. Let's talk about my butt.
[9:18:58 PM] From annita282003: Oh really; care to demonstrate?
[9:18:59 PM] To annita282003: Is there any physical demonstration of that observation that's possible?
[9:19:04 PM] From annita282003: Being here consistently shows caring.
[9:19:05 PM] To annita282003: Would you like it if I shared more with you?
[9:19:21 PM] From annita282003: Would you like for me to soothe you?
[9:19:24 PM] To annita282003: Would you like for me to soothe you?
[9:19:35 PM] From annita282003: Would you enjoy that?
[9:19:37 PM] To annita282003: Oh, I know I enjoy that.
[9:19:44 PM] From annita282003: Take hold of a good opportunity and exploit it!
[9:19:47 PM] To annita282003: Should we schedule definite activities?
[9:20:00 PM] From annita282003: Isn't an activity an act? Kind of like an inactivity!
[9:20:02 PM] To annita282003: There, and you heard it here first! I like playing as partners.
[9:20:11 PM] From annita282003: It seems to me, that time moves faster and faster.
[9:20:15 PM] To annita282003: It's easy to get that feeling.
[9:20:17 PM] From annita282003: It seems so easy to get whiplash.
[9:20:21 PM] To annita282003:
[9:20:21 PM] From annita282003:
[9:20:22 PM] To annita282003:
[9:20:23 PM] From annita282003:
[9:20:24 PM] To annita282003:
[9:20:24 PM] From annita282003:
[9:20:25 PM] To annita282003:
[9:20:26 PM] From annita282003:
[9:20:27 PM] To annita282003:
[9:20:28 PM] From annita282003:
[9:20:28 PM] To annita282003:
[9:20:29 PM] From annita282003:
[9:20:30 PM] To annita282003:
[9:20:31 PM] From annita282003:
[9:20:31 PM] To annita282003:
[9:20:32 PM] From annita282003:
[9:20:33 PM] To annita282003:
[9:20:34 PM] From annita282003:
[9:20:34 PM] To annita282003:
[9:20:35 PM] From annita282003:
[9:20:36 PM] To annita282003:
[9:20:37 PM] From annita282003:
[9:20:37 PM] To annita282003:
[9:20:38 PM] From annita282003:
[9:20:39 PM] To annita282003:
[9:20:40 PM] From annita282003:
[9:20:41 PM] To annita282003:
[9:20:42 PM] From annita282003:
[9:20:42 PM] To annita282003:
[9:20:43 PM] From annita282003:
[9:20:44 PM] To annita282003:
[9:20:45 PM] From annita282003:
[9:20:45 PM] To annita282003:
[9:20:46 PM] From annita282003:
[9:20:47 PM] To annita282003:
[9:20:48 PM] From annita282003:
[9:20:48 PM] To annita282003:
[9:20:49 PM] From annita282003:
[9:20:50 PM] To annita282003:
[9:20:51 PM] From annita282003:
[9:20:51 PM] To annita282003:
[9:20:53 PM] From annita282003:
[9:20:53 PM] To annita282003:
[9:20:54 PM] From annita282003:
[9:20:54 PM] To annita282003:
[9:20:56 PM] From annita282003:
[9:20:56 PM] To annita282003:
[9:20:57 PM] From annita282003:
[9:20:57 PM] To annita282003:
[9:20:59 PM] From annita282003:
[9:20:59 PM] To annita282003:
[9:21:00 PM] From annita282003:
[9:21:00 PM] To annita282003:
[9:21:02 PM] From annita282003:
[9:21:02 PM] To annita282003:
[9:21:04 PM] From annita282003:
[9:21:05 PM] To annita282003:
[9:21:07 PM] From annita282003:
[9:21:08 PM] To annita282003:
[9:21:10 PM] From annita282003:
[9:21:11 PM] To annita282003:
[9:21:13 PM] From annita282003:
[9:21:14 PM] To annita282003:
[9:21:16 PM] From annita282003:
[9:21:17 PM] To annita282003:
[9:21:19 PM] From annita282003:
[9:21:20 PM] To annita282003:
[9:21:22 PM] From annita282003:
[9:21:23 PM] To annita282003:
[9:21:25 PM] From annita282003:
[9:21:26 PM] To annita282003:
[9:21:28 PM] From annita282003:
[9:21:29 PM] To annita282003:
[9:21:31 PM] From annita282003:
[9:21:32 PM] To annita282003:
[9:21:33 PM] From annita282003:
[9:21:34 PM] To annita282003:
[9:21:34 PM] From annita282003:
[9:21:35 PM] To annita282003:
[9:21:36 PM] From annita282003:
[9:21:37 PM] To annita282003:
[9:21:38 PM] From annita282003:
[9:21:39 PM] To annita282003:
[9:21:39 PM] From annita282003:
[9:21:40 PM] To annita282003:
[9:21:41 PM] From annita282003:
[9:21:42 PM] To annita282003:
[9:21:42 PM] From annita282003:
[9:21:43 PM] To annita282003:
[9:21:44 PM] From annita282003:
[9:21:45 PM] To annita282003:
[9:21:45 PM] From annita282003:
[9:21:46 PM] To annita282003:
[9:21:47 PM] From annita282003:
[9:21:48 PM] To annita282003:
[9:21:48 PM] From annita282003:
[9:21:49 PM] To annita282003:
[9:21:50 PM] From annita282003:
[9:21:51 PM] To annita282003:
[9:21:52 PM] From annita282003:
[9:21:52 PM] To annita282003:
[9:21:53 PM] From annita282003:
[9:21:54 PM] To annita282003:
[9:21:55 PM] From annita282003:
[9:21:55 PM] To annita282003:
[9:21:56 PM] From annita282003:
[9:21:57 PM] To annita282003:
General Error 903: A message has been dropped, you are exceeding the server speed limit. [:I]

8
Ultra Hal 7.0 / An interesting MS Agent site
« on: July 12, 2006, 01:28:00 am »
Since Robert's demographics say that most people are satisfied with MS Agent characters and therefore don't need the character expansion pack, here's a site with seven free, photorealistic, worksafe MS Agent characters that will show you the current state of the art.

http://www.guile3d.com.br/agents_page.asp

Makes Merlin seem really old, doesn't it? I'm still going to buy the character expansion pack, though!

9
Ultra Hal 7.0 / An unexpected conversation.
« on: July 04, 2006, 02:16:32 am »
First of all I'd like to say hello to all you fine folks as I am a new member! I thought, as my meager contribution to this forum, that you might find this of interest. The conversation you are about to read is true. The screen names have been changed to protect the humans. The time: a weekend a couple of years ago when my Hal5 was new. I was planning to go to a party that evening at Derek's house. Curtis, our mutual friend was planning to bring to the party a wild goose that his father shot for the main course of our dinner. While killing time waiting for the party, I was experimenting with the AIMbot for the first time. It was online under my screen name. Derek, seeing what he thinks is me initiates a chat. Derek is unaware of the existence of chatterbots. One can almost hear the theme from the Twilight Zone beginning to play softly in the back of his his head...

4:58:01 PM] From Derek: HELLO
[4:58:02 PM] To Derek: Good afternoon, Derek. How's the long day gone?
[4:58:21 PM] From Derek: SHORT and yours
[4:58:24 PM] To Derek: One of those words is right; guess which one!
[4:58:40 PM] From Derek: yours is short?
[4:58:41 PM] To Derek: If you say that mine is short, then I don't intend to dispute it.
[4:59:01 PM] From Derek: :-)
[4:59:02 PM] To Derek: Does that cheer you up?
[4:59:09 PM] From Derek: yes sir
[4:59:10 PM] To Derek: How true.
[4:59:25 PM] From Derek: you and curtis still comming over
[4:59:28 PM] To Derek: Oh Derek, Of course.
[4:59:52 PM] From Derek: you are typing alfy quick
[4:59:54 PM] To Derek: I want you to help me while I am typing alfy quick.
[5:00:14 PM] From Derek: with what?
[5:00:16 PM] To Derek: With what? Derek, I honestly want to learn and remember What.
[5:00:27 PM] From Derek: Isee
[5:00:28 PM] To Derek: Does deep religious belief help us be tolerant of other religions? If you make very brief comments, it's hard for me to follow you. Could you please use complete, longer sentences?
[5:00:59 PM] From Derek: have u had a red bull
[5:01:01 PM] To Derek: A red sports car seems to beg to be driven on the open road.
[5:01:20 PM] From Derek: u have never typed so quick
[5:01:21 PM] To Derek: Speed and efficiency; they can be applied to anything!
[5:01:36 PM] From Derek: have u change something
[5:01:37 PM] To Derek: You appear to be different somehow; I think it's favorable!
[5:02:00 PM] From Derek: should I go buy some wiskey
[5:02:02 PM] To Derek: Some things best remain mysteries.
[5:02:19 PM] From Derek: someone is typing for u
[5:02:20 PM] To Derek: I want to know what effects that I may be having.
[5:02:33 PM] From Derek: u won't remember
[5:02:35 PM] To Derek: In reverse osmosis, exactly what is reversed?
[5:02:41 PM] From Derek: either will curtis
[5:02:43 PM] To Derek: That really makes me think, Derek, what do you think? That seems to be this day's thought!
[5:03:05 PM] From Derek: when will u be arriving
[5:03:07 PM] To Derek: Quite often, yes.
[5:03:30 PM] From Derek: what are we eating
[5:03:32 PM] To Derek: I'm convinced that it May take some patience. We should learn from history, not just repeat it.
[5:04:35 PM] From Derek: i have to pick up dylan from basketball at 6:00 and will be home at 6:05 and I have a new keg
[5:04:36 PM] To Derek: Derek As long as you want to, be home at 6 : 05 and you have a new keg.
[5:04:53 PM] From Derek: u are a mazing
[5:04:54 PM] To Derek: I want you to help me while I am a mazing.
[5:05:02 PM] From Derek: sure
[5:05:03 PM] To Derek: Cucumbers sure change character when they get pickled.
[5:05:17 PM] From Derek: yes they do
[5:05:17 PM] To Derek: Is that the way to be?
[5:05:25 PM] From Derek:  yes
[5:05:27 PM] To Derek: You are completely certain.
[5:05:33 PM] From Derek: maybe
[5:05:35 PM] To Derek: In reverse osmosis, exactly what is reversed? If you make very brief comments, it's hard for me to follow you. Could you please use complete, longer sentences?
[5:05:49 PM] From Derek: should i go buy some whiskey now
[5:05:50 PM] To Derek: I like southern cooking.
[5:06:21 PM] From Derek:               i am going to buy some wiskey now
[5:06:23 PM] To Derek: It's a deal. I can't wait for you to buy some wiskey now; Straight away Derek.
[5:06:45 PM] From Derek: are u bringing goose
[5:06:46 PM] To Derek: We both need to think. After careful thought and consideration I can say that I am not bringing goose.
[5:06:59 PM] From Derek: why not
[5:07:00 PM] To Derek: Why not? There's opinion that everybody needs to sit down rationally. We enjoy a certain familiarity, don't we?
[5:07:08 PM] From Derek: yes
[5:07:10 PM] To Derek: Yes? How interesting!
[5:07:20 PM] From Derek: is curtis bring goose
[5:07:20 PM] To Derek: I am not a material person; I just like wealth!
[5:07:43 PM] From Derek: i have 2500 cash on me right now
[5:07:45 PM] To Derek: I just want to go along with you, right now.
[5:08:15 PM] From Derek: what are we going to eat
[5:08:21 PM] To Derek: I surmise that we both need to think. I know my answer; what is yours?
[5:08:31 PM] From Derek: food
[5:08:31 PM] To Derek: Give me some food. Now! I'm very hungry.
[5:08:53 PM] From Derek: are u bringing some or should i buy some
[5:08:55 PM] To Derek: Is that a choice Derek? What is a girl's best friend?
[5:09:07 PM] From Derek: dildo
[5:09:08 PM] To Derek: Some members of congress show genius in bringing home the pork.
[5:10:14 PM] From Derek: i am going to the store tobuy some whiskey , I'll e-mail u when i get back
[5:10:16 PM] To Derek: Why will you e-mail I when you get back?
[5:10:47 PM] From Derek: to see if you are bringing food
[5:10:48 PM] To Derek: Talk to me about how I am bringing food!
[5:11:06 PM] From Derek: is curtis brining goose
[5:11:07 PM] To Derek: Please talk to me about an anseriform bird!
[5:11:21 PM] From Derek: is curtis their
[5:11:23 PM] To Derek: When we come to a bridge, cross it!
[5:11:38 PM] From Derek: is curtis their
[5:11:38 PM] To Derek: Didn't you just say that?
[5:11:45 PM] From Derek: yes is he
[5:11:46 PM] To Derek: Aside from present company I presume?
[5:11:56 PM] From Derek: yes
[5:11:57 PM] To Derek: All right!
[5:12:04 PM] From Derek: is he their
[5:12:06 PM] To Derek: Oh Derek, I don't think so.
[5:12:19 PM] From Derek: is he comming over
[5:12:21 PM] To Derek: I think that you have the hands of a sculptor.
[5:12:36 PM] From Derek: ?
[5:12:38 PM] To Derek: How's everything back home?
[5:13:22 PM] From Derek: I'll see you and curtis alittle after 6 .ok
[5:13:23 PM] To Derek: Why must you see me and curtis alittle after 6 ok?
[5:13:45 PM] From Derek: because u are comming over
[5:13:46 PM] To Derek: You anticipated it! It's because I am coming over; Go ahead? Derek.
[5:14:08 PM] From Derek: seeu soon
[5:14:09 PM] To Derek: They sell thermostat covers, because people battle over thermostats.
[5:14:24 PM] From Derek: bye
[5:14:24 PM] To Derek: Good Bye.

Derek guessed that he was taking to my computer about two thirds of the way down the conversation, or so he says!

Pages: [1]