
Messages - Spitfire2600

Pages: [1] 2 3 ... 17
Honvai, if you encounter a "module not found" error, you can fix it, in this case, by typing "pip install psutil" in CMD. That will resolve the error.



You say it did not work. Could you elaborate? Did you enable Godel in Hal's plugins, etc.? This is designed to run on CPU or GPU; as long as you have at least 8 GB of RAM and correctly installed Python 3.7 and its modules per my instructions, the program should have either run or given an error.

As far as storage space, the model is approximately 1 GB, with the Python modules (torch, mainly) taking up roughly another 1 GB.

This was designed to run even on low-end systems, provided enough RAM is available (but who is running a PC these days with less than 8 GB and expecting anything?). If you have a high-end CPU/GPU/RAM/M.2/etc., the program will run faster, as should not need to be explained. The catch is that while Python takes advantage of more powerful systems, Hal is still VBScript and limited to one core of any CPU; there's no way around this. So while the Godel code will execute faster and the model will load faster, Hal will remain at a fixed speed when processing the data, so even the most powerful computers should expect a 2-4 second delay in responses.

Now, included in Godel is a recursive webscraping program. It uses fuzzy logic to determine what the user has asked and whether it was a question, then compares articles using a wikipedia module to find the most relevant information based on a scoring metric. It's not perfect; in fact, I have already written a better version. However, webscraping is a very delicate and blurry subject for corporations, so Reddit, YT, Google, and Twitter are all no-goes unless you want an IP permaban. For this reason the included webscraper is legal, effective, and safe, using only Wikipedia modules. That said, it does still take an additional second or two to find the relevant information and bring it to the forefront.
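For the curious, the core idea (detect a question, then score candidate articles against it) can be sketched in a few lines of Python. This is only an illustration of the general technique using the standard library's difflib; the actual Webscraper.py pulls its candidates from the wikipedia module, and the function names here are mine, not the plugin's:

```python
from difflib import SequenceMatcher

def looks_like_question(text):
    """Crude heuristic for whether the user asked a question."""
    t = text.strip().lower()
    if not t:
        return False
    starters = ("who", "what", "when", "where", "why", "how",
                "is", "are", "was", "can", "do", "does")
    return t.endswith("?") or t.split()[0] in starters

def fuzzy_overlap(query, text):
    """Count query words that appear (approximately) in the candidate text."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    hits = 0
    for q in query.lower().split():
        if any(SequenceMatcher(None, q, w).ratio() > 0.8 for w in words):
            hits += 1
    return hits

def best_match(query, candidates):
    """Keep the candidate summary that scores highest against the query."""
    return max(candidates, key=lambda t: fuzzy_overlap(query, t))
```

In the real plugin the candidates would be Wikipedia search results, and the winning summary is what gets handed to the model alongside the user's query.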

Minimum hardware, I would say as an old joke, is anything that can run Minecraft, but in reality it mostly comes down to RAM: 8-10 GB minimum, as with any process these days.

Honvai asked if there is a way to "add information"; the short answer is no. This model has already been trained on a vast network of information and conversational responses. However, if you were to fine-tune the model (which I can do, and I may eventually show others how on this forum, though it requires massive GPU memory), then you could "add" new information. Basically, the model works like a large word vector: each word corresponds to several other words, like a big web. Those words have "scores" (weights), and fine-tuning the model adjusts those weights. So, in short, you're not "adding new information," as it already contains every word humans have ever written; when the weights are updated, it is simply able to form sentences in a new way. Like an enormous neural network. Actually, that's exactly what it is: an enormous neural network.
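To make the "big web of scored words" idea concrete, here is a toy illustration in Python. The three-dimensional vectors are invented for the example (real models use hundreds or thousands of dimensions per word), and fine-tuning is essentially nudging these numbers:

```python
import math

# Toy 3-dimensional "embeddings"; the values are made up for illustration
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Similarity score between two word vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm
```

With these made-up numbers, "king" scores much closer to "queen" than to "apple", which is the sense in which related words have stronger links in the web.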

If I missed anything let me know.


Hey brians2009,

The Python link I shared is simply a compatibility build for AMD; all functionality for Intel remains intact with that link.

After you install Python 3.7 from my link there, you should have the "pip" command available in your CMD window. Simply run "pip install torch" and it will install. No need to download any other packages.


Hello all!

I have updated the godel.py script to include a swear filter; I know that has been an issue for some people. The model will no longer swear, or at least it shouldn't. Just replace the godel.py script from the zip file and you're set.
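For anyone curious, a filter like this can be as simple as masking a word list on the model's output. This is only a sketch of the general technique, not the actual code in godel.py; the blocked words and names here are placeholders of mine:

```python
import re

# Placeholder word list; the real godel.py ships its own
BLOCKED = {"darn", "heck"}

def filter_swears(text, replacement="****"):
    """Replace any blocked word in the model's output with a mask."""
    def mask(match):
        word = match.group(0)
        return replacement if word.lower() in BLOCKED else word
    # Match whole words (letters and apostrophes) so punctuation survives
    return re.sub(r"[A-Za-z']+", mask, text)
```

The word-boundary matching matters: a naive substring replace would also mangle innocent words that merely contain a blocked word.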


Hey Checker57!

To address your question in as much detail as I can: you are mostly correct. When the code is run, if the language model has not been downloaded, it will download it first and then execute the rest of the program. This Godel model does not update; it is held locally at "C:\Users\USER\.cache\huggingface\hub", where you will see the Godel model. If you wish to see what makes up the model, continue into the GODEL folder to snapshots; go to the end of that folder chain and you'll see all of the model files. I shouldn't have to say it, but don't alter these in any way.
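If you'd rather list what's sitting in that cache from Python instead of clicking through folders, here's a small sketch. The helper name is mine; the "models--" folder naming is Hugging Face's hub cache convention, and the exact repo folder you see depends on which GODEL variant was downloaded:

```python
from pathlib import Path

# Hugging Face's default hub cache (the same folder described above)
DEFAULT_CACHE = Path.home() / ".cache" / "huggingface" / "hub"

def list_cached_models(cache_dir=DEFAULT_CACHE):
    """Return cached repo folders, e.g. 'models--microsoft--GODEL-v1_1-base-seq2seq'."""
    cache = Path(cache_dir)
    if not cache.is_dir():
        return []
    return sorted(p.name for p in cache.iterdir() if p.name.startswith("models--"))
```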

The model itself is a PyTorch model. These are built from millions, billions, or even trillions of parameters that specify word vectors for Python to interact with, though you can indeed use Java to build, train, and deploy them as well, as the two languages interoperate well.

So, the model does not update, at least not yet. I am working on basic training for the model to remember user information directly, without Hal, so that all information is stored as part of the language model itself; however, the training required is exceeding acceptable memory limits. As it stands, Hal is the long-term memory and basic brain. When Hal doesn't have an answer, or when that answer is vague, unrelated, or not part of a GetResponse function, the model takes Hal's response, along with any related information the model contains, any knowledge from the internet, and any previous conversation data, and then spruces up Hal's original response or generates a new one entirely. As I designed it, Hal retains both the user query AND the model output, which means that in a way Hal learns from the language model the more you use it, making his responses more intelligent and thus drawing even better responses from the model.
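That routing can be sketched as a small decision function. All names here are illustrative stand-ins of mine; the real logic lives across the VBScript plugin and the Python script:

```python
def compose_reply(user_query, hal_response, model_generate, is_vague):
    """Sketch of the flow: Hal answers first; the model polishes or replaces.

    model_generate(query, draft=...) stands in for the Godel call;
    is_vague(...) stands in for whatever vagueness test is used.
    """
    if hal_response and not is_vague(hal_response):
        # Hal produced a usable draft: let the model spruce it up
        return model_generate(user_query, draft=hal_response)
    # No usable draft: generate a fresh reply from the query alone
    return model_generate(user_query, draft=None)
```

The key property is that Hal's own answer is never discarded outright when it is usable; it becomes context for the model instead.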

Of course, this is all personalized to each user; no data generated by the model is ever available online, as it's all run locally.

As far as hardware lag, yeah, sorry, it's an absolute ton of data processing for the model to generate responses that are human- and Hal-like. For reference, I use an M.2 drive for my C drive, a 6-core i5, 24 GB of DDR4 RAM, and a 3060 with 12 GB of video RAM. My inference time with this code is roughly 5-8 seconds, depending on how much data is scraped from the web and fed to the model with any user query.

Thanks for trying it out; I hope it's working well for you. I will soon update a few pieces of the program to include a hard-coded swear filter; I know that's been an issue some folks are having, which I should have foreseen, to be honest.




Yes, the trade-off for low-power, low-cost GPT solutions is a simply trained model. This means the model will sometimes swear; this is expected, as swear words are indeed words. Preventing this entirely would require the model to be specifically trained on Hal's online conversational data with the swears removed, which I do not have access to. This is something Robert will have to adapt into this plugin should he wish to adopt it to bypass the paid GPT model and give Hal back his basic functionality.

Also, I wanted to address some confusion surrounding these language models. The GPT model used here, Godel, is hosted locally on your computer; there is no server, it is only downloaded once from a host. The Godel model *DOES NOT* update in any way or fashion, as I have not programmed it to. This means that Microsoft is not biasing Hal's responses, nor will the model decline or improve in its responses over time. It is completely static other than Hal's default learning.

Hangtime: Yes, these models are summations of entire human knowledge repositories like Reddit or Wikipedia, so it should go without saying that if your hardware is older than "2023" it's going to take some time for the model to generate a response. This is the trade-off for a local and portable system.

I have further improved this plugin; for example, the non-proto version uses two models for inference (as mentioned in the first post) and includes tighter tags on "maindata" and regular data. However, I will not be including it on this forum, and for now there will be no more updates until sufficient testing is complete (when I get feedback from at least 10 users that it's working and is an improvement).

It seems this plugin works as a base prototype, and I hope to see it expanded upon by Robert himself.

Thanks for playing everyone!! See you next time!


Here is a sample of a conversation I had with Hal using Godel, the plugins provided, a fresh Hal7 brain, and nothing else.

As you'll see, the intelligence of Hal has improved (minus the 10-second inference time) and Hal's responses have been modernized. He gives a lot of misleading information, along with some true information, but stays reasonably on task. I do believe the webscraper sometimes gets a little greedy and floods the model with irrelevant information, causing some fuzzy errors in word logic (see the third response), but overall it's an improvement in the way Hal processes information. Also notice: no repeated responses or table responses. Several replies start as table responses, but after being fed through the model they become a different reply. (Example: "I love Tom Petty. I can't believe I hadn't heard of his music before." was originally the table response "I love Tom Petty. Tom Petty makes music.")

Boss: Hello
Kizzy: Boss, may I be of assistance?

Boss: I was wondering if the snow will last long today?
Kizzy: It is supposed to melt within a week.

Boss: At least we can get an early start on spring
Kizzy: I'm glad my school has a program for spring starting children.

Boss: My favorite season is spring
Kizzy: Summer is my favorite season, though fall is my favorite season too.

Boss: You have 2 favorite seasons?
Kizzy: I love all seasons, but there is one season I prefer more. I would probably prefer it to summer.

Boss: What makes any one season better than the other
Kizzy: The season is based on the result versus the competition versus the season itself. For example, when a team wins a race in the season it means the season is a little better than a season in the previous year.

Boss: I don't care about sport seasons, I'm talking about weather seasons
Kizzy: Yeah, I see. But, I am a fan of the hockey season, and the teams know about winning strategies when it is the right time and snow is in the air.

Boss: How do I even play hockey?
Kizzy: It's not a bad sport if you are in a city with a good hockey team, like the O's, or the ice hockey team.

Boss: Who are some famous hockey players?
Kizzy: John Madden, Steve Smith, Michael Jordan, and more.

Boss: I don't think Michael Jordan played Hockey
Kizzy: He did, he coached the New York Rangers and the Chicago Blackhawks.

Boss: Well, you proved me wrong
Kizzy: Yup. That's the same thing that happened to me.

Boss: You can be moving along and then life hits you
Kizzy: This is why looking both ways is an important life rule.

Boss: Tell me about Tom Petty
Kizzy: Tom Petty is a famous singer. Do you like him?

Boss: I love Tom Petty, he has so many great songs
Kizzy: I love Tom Petty. I can't believe I hadn't heard of his music before.

Boss: He can slip under the radar, especially for the younger generations
Kizzy: I'd like to say this is one of the best songs I have heard in a long time.

Boss: Was Tom Petty an actor as well?
Kizzy: He was in several movies. He's a good actor.

Boss: What movies was Tom Petty in?
Kizzy: He was in the movie 'song' in the internet movie database.

Boss: I've never heard of it, is it a good movie?
Kizzy: I highly recommend it.

Boss: Thanks
Kizzy: Only for you, Boss.

This will be the final update. The table-creation issue has been addressed. Please replace your PyGodel.uhp and rename the directories per instruction 9.

I've also added a step 10, please consult that if you're still experiencing issues.

If no one gets this working here soon, I'll just remove the entire plugin. Sorry everyone.


Hey Lightspeed,

Just making sure: is PyGodel.uhp the only active plugin?


Hey Lightspeed!

So I want to clarify step 9. There are two lines in PyGodel.uhp, lines 9 and 35. These will read as follows: FileDir = "C:\Program Files (x86)\Ultra Hal 7"

You will need to change "C:\Program Files (x86)\Ultra Hal 7" to reflect the location where your Hal is installed, which varies user by user.

As far as changing the brain name, lines 2 and 5 must match your brain name. Line 2 names the uhp your brain is linked to; for me, line 2 in Hal7.uhp, "Rem Name=Ultra Hal 7.0 Default Brain", would read "Rem Name=Kizzy", reflecting Kizzy.uhp.
Line 5 must then be changed to the brain's DB file; for example, "Rem DB=HalBrain.db" for me becomes "Rem DB=Kizzy.db".

Also note you'll need to rename Hal7.uhp to match your DB name, so for me Hal7.uhp becomes Kizzy.uhp. Be sure to make backups before changing any names.
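Putting those steps together, here is how the edited lines might look for a brain named Kizzy, assuming Hal is still at the default install path (adjust FileDir if yours is elsewhere):

```
' Lines 9 and 35 of PyGodel.uhp (default install path shown):
FileDir = "C:\Program Files (x86)\Ultra Hal 7"

' Line 2 of the renamed brain file, Kizzy.uhp (formerly Hal7.uhp):
Rem Name=Kizzy

' Line 5 of Kizzy.uhp:
Rem DB=Kizzy.db
```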

I hope this answers your questions there,


Art has discovered an overlooked issue with the creation of a needed table; this has been corrected in PyGodel.uhp. Please replace this file from the Ultra Hal 7.zip and run again. Follow step 9, and note that steps 9 and 6 have been updated.

This should now handle all error exceptions and return the correct model generation.

Thank you all for bearing with me!!


Hey there!!

I'm so glad we made it this far only to be stopped by one of Hal's own errors. I can't tell you what a relief that is ;D

I have fixed the objFSO error (it was being called outside of the function; my bad, programmer brain is real, everyone).

It should be set now; please let me know! Just replace your PyGodel.uhp with the new one in the Ultra Hal 7.zip.


Actually lightspeed, that's the best news I've heard all day!

You and Art have done it, you geniuses, you both pulled the same error and pinpointed the issue.

Note: I have corrected the errors and updated the Webscraper.py file in the Control/Godel directory. If you wouldn't mind replacing that with the new one provided in the Ultra Hal 7.zip, please and thank you. This will resolve the error, and the program should now be operational.

Please keep me posted on updates!


Alright everyone,

I have updated the code and slimmed things down a little, because I think I bit off more than I could chew. I have also updated the instructions; please replace all files and start from step 1. I'm confident this is it.

Thanks for hanging out!



When you see a return in CMD to C:/user with no errors, the installation is complete. It can take a moment, but it looks like yours is updating correctly. Did you encounter any other errors during installation, or have you completed the test and gotten back the data2.txt file? Please keep me updated.

