I spent most of yesterday procrastinating on finishing up some deferred shading examples, so I ended up reinstalling Ubuntu Linux on one of my spare hard-drives to play with the beta version of the distro and mess about with the recently stable ext4 file-system (as you may have already guessed, I'm a complete power-nerd when it comes to computers). Unfortunately I was unable to get Ultra Hal to run under Wine no matter what hacks I tried. Even though the Haptek SAPI configuration works fine for SAPI4 voices, the Haptek player itself failed to run with an obscure error that basically means there's just no chance of running it: whatever obscure COM functions it uses will probably never be implemented in Wine and don't even have stub implementations. Ultra Hal itself failed to run with a whole bunch of stub errors, which suggests that Ultra Hal may eventually run under Wine, but it will probably never run perfectly since any functionality it gets from the Haptek player won't work.
Anyways, to make a long story short, this got me thinking of starting work on a program similar to Hal, just for penguin lovers. Rather than a small program that starts up, though, I was thinking of a full desktop manager, or at least an applet/module for an existing one. This way a person could have Hal's antarctic alter-ego sitting on the desktop, either as a full-screen 3D head or body where a wallpaper would normally be, with a split terminal somewhere on the screen for typing messages and commands to it. I've never liked the idea of start menus and icons all that much, as it usually takes less time to type a run command into BASH than it does to dig through a start menu for what you want, so this sort of thing would really make an ideal desktop environment for me. Linux also lends itself rather well to an AI-driven system, as nearly any action you could imagine doing on the computer has a command-line program dedicated to just that task.

Having an AI-driven system, as opposed to just using a minimalistic window manager and a terminal, also has the benefit of the AI performing extended operations that require other operations to complete first. For example, a command like "Play all songs by Zakk Wylde" could either be answered from a database by looking for songs with the artist Zakk Wylde, or by scanning the file-system, opening every music file to read its ID3 tags, and playing or queuing each one that matches (the first method would certainly be faster, but you get the picture of what I mean by tasks more complicated than a single command usually handles; a rough sketch of the second approach follows below). Obviously this is a fairly crazy idea, but it's certainly doable, and if the AI scripting were done in Lua, Python, Perl, Java, or even C#, the AI could easily be improved to not only control your system with ease, but also learn from conversations just as Hal does. For admins this could also be a perfect tool, as there are always thousands of small tasks that have to be performed regularly which are often too complex for a cron job but which an AI that can handle a sequence of commands could easily perform for you. Obviously, though, if you are going to leave your entire system in the hands of an AI, you probably want to be sure that system isn't controlling anything vital, can be unplugged if it tries to take over the world, and most of all will ask for feedback when one of its tasks hits something it hasn't been set up to handle.
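Just to make that example concrete, here's a rough sketch of the file-system-scan approach in Python (one of the scripting languages I mentioned). It assumes the third-party mutagen library is installed for reading ID3 tags and hands the results to a command-line player; the music directory, the artist, and mpg123 are just placeholders, not real design decisions.

import os
import subprocess

import mutagen  # third-party library for reading audio tags


def find_songs_by(artist, music_dir=os.path.expanduser("~/Music")):
    """Walk music_dir and return paths whose artist tag matches."""
    matches = []
    for root, _dirs, files in os.walk(music_dir):
        for name in files:
            if not name.lower().endswith((".mp3", ".ogg", ".flac")):
                continue
            path = os.path.join(root, name)
            try:
                audio = mutagen.File(path, easy=True)
            except Exception:
                continue  # skip unreadable or corrupt files
            if audio is None or not audio.tags:
                continue
            artists = [a.lower() for a in audio.tags.get("artist", [])]
            if artist.lower() in artists:
                matches.append(path)
    return matches


if __name__ == "__main__":
    songs = find_songs_by("Zakk Wylde")
    if songs:
        # Hand the queue to whatever player is around; mpg123 is only
        # an example, any command-line player would do here.
        subprocess.call(["mpg123"] + songs)

A real implementation would obviously cache the results in a database so the full scan only happens once, which is exactly the sort of decision the AI layer could make for you.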
This is in no way a weekend project and could easily take years to complete, but I'm just putting it out there to see what you all think of it. For TTS functionality, Festival works great and provides a library for integrating it into any application, but unfortunately voice recognition is still in its early days on the penguin (unless some university or business already has a proprietary system that they're keeping secret), so there's not much chance of being able to talk directly to your computer yet.
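For what it's worth, Festival can also be driven straight from the command line, so bolting speech onto an early prototype could be as simple as piping text at it. A minimal sketch, assuming the festival package is installed and on the PATH (Python again, purely for illustration):

import subprocess


def speak(text):
    """Pipe text to Festival's text-to-speech mode."""
    proc = subprocess.Popen(["festival", "--tts"], stdin=subprocess.PIPE)
    proc.communicate(text.encode("utf-8"))


speak("Hello, I am your desktop penguin.")

The proper long-term approach would be linking against the Festival library itself, but for prototyping the pipe is hard to beat.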
Some (or many) people may consider this more work than it's worth and say that most of the functionality could already be done in BASH, but I think integrating something like this into a desktop environment would work well, as it would allow it to receive feedback from other applications through standard X11 events, and this could be expanded further by using D-Bus to communicate directly with applications that support it. I could do all the necessary Xlib programming, database programming, and binding to a scripting language, but I wouldn't really trust any AI I designed and programmed from the ground up to be all that efficient. Anyone think this idea is interesting? Anyone experienced with inter-process communication on a Unix-based system? If anyone really likes this idea and wants to work on it, feel free to e-mail me, but even though I'm definitely going to work on this, it will be a few months before I have the time to really get much done on it, and it could be much longer before any real AI work gets done, since there has to be a language to script it in first. As of now I'm just trying to decide on all the dependencies a system like this would be likely to have (i.e. Xlib, D-Bus, SQLite or MySQL).
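As a taste of what the D-Bus side could look like, here's a small sketch using the dbus-python bindings (an assumption on my part; any D-Bus library would do) that has the AI raise a desktop notification through the standard org.freedesktop.Notifications service when a background task finishes:

import dbus


def notify(summary, body):
    """Pop up a desktop notification over the session bus."""
    bus = dbus.SessionBus()
    obj = bus.get_object("org.freedesktop.Notifications",
                         "/org/freedesktop/Notifications")
    iface = dbus.Interface(obj, "org.freedesktop.Notifications")
    # Notify(app_name, replaces_id, app_icon, summary, body,
    #        actions, hints, expire_timeout)
    iface.Notify("desktop-ai", 0, "", summary, body, [], {}, 5000)


notify("Task finished", "Backup of /home completed without errors.")

The same mechanism works in the other direction too, since media players, file managers, and so on expose their own D-Bus interfaces the AI could call directly instead of shelling out.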
If you've actually managed to read this rant this far and don't understand all my references to penguins, then you obviously don't know Linux, as the penguin is Linux's mascot. Even if you aren't a Linux person, though, you can still help with feedback about AI design issues or whatever else applies.