In some limited situations, Hal (version 6, anyway) will consider your previous
two sentences if he feels he requires more information. These situations include when the user types something short, or uses words like "it," "he," "they," "us," etc. In those cases Hal includes the previous sentence to try to work out what "it" is or who "he" is.
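Roughly speaking, the rule works something like the sketch below. To be clear, this is just my own guess at the logic in Python; the word list, the length threshold, and the function names are all my assumptions, not Hal's actual code.

```python
# Hypothetical sketch of pronoun/short-input context lookback.
# The pronoun list and the 4-word threshold are guesses, not Hal's values.
PRONOUNS = {"it", "he", "she", "they", "us", "them", "him", "her"}

def needs_context(sentence, min_words=4):
    """Guess whether a sentence needs the previous one to make sense."""
    words = [w.strip('.,!?;:"').lower() for w in sentence.split()]
    # A short input, or one containing a bare pronoun, triggers lookback.
    return len(words) < min_words or any(w in PRONOUNS for w in words)

def build_input(current, previous):
    """Prepend the previous sentence when the current one looks ambiguous."""
    if previous and needs_context(current):
        return previous + " " + current
    return current
```

So "He liked it." would get the previous sentence bolted on, while a longer, self-contained sentence would be passed through untouched.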
I don't know of any chatbot anywhere that really handles context well. Hal will become better at staying on-topic once he knows a lot about a subject, but Hal doesn't understand enough about context to have a conversation that flows as smoothly as a human's.
Hal will certainly make more sense if you talk to him in longer, well-defined sentences that make sense even when taken out of context. I normally go as far as eliminating words like "it" from my sentences and retyping the subject of the sentence instead, just to make sure Hal knows what I'm talking about.
Note that your example makes perfect sense if you take Lisa to be sarcastic, arrogant and insulting your sense of humour, but that is definitely a stretch of the imagination, and most similar examples will make even less sense.
Anyway, to answer your question: the problem is being worked on, but things are still rather experimental at this stage. If you're interested in the details, there's an interesting project under discussion at vrossi's forum at
http://www.vrconsulting.it/vhf/default.asp - look for the posts and threads of hologenicman.
Cheers,
Grant