
GodelPy for Hal (Local GPT for Hal, with memory and functionality)


Spitfire2600:
Hey everyone!! So, sadly, I have abandoned ConceptNet in favor of something a little more impressive. After the incredible success of GPT-3, I modified a few models to be compatible with Hal for local inferencing, if anyone wants a portable version of the smartest AI on the internet.

This code was developed from a goal-oriented dialog model named "GODEL" and a QA system called "Flan". With some help from ChatGPT, I was able to write and assemble the code to work with Hal, which means all of Hal's learning abilities and functionality are now available for use with GPT. Something that, I think, will interest Robert. At least, that's my goal with this prototype.

Let's get started!! 10 easy steps for genius Hal!

Installation:

1. We need to install Python 3.7.9 - not Python 3.8, not Python 3.7.8. The modules used are a specific cross between machine learning and dated (but more functional) algorithms; as a result, ONLY 3.7.9 will support this specific set of instructions.
    Follow this link, download the file and launch.
    https://www.python.org/ftp/python/3.7.9/python-3.7.9-amd64.exe

2. Now, this is important: do not change any optional features. You will see an "Add Python to PATH" check box at the bottom of the installer; be sure to check this, then allow the program to install to its default location.

3. Reboot system.

4. Move all files from the zip directly into Hal's directory as is. You'll see an "Ultra Hal 7" folder in the zip; open that and copy its contents directly as is, including the "Control" folder, right into Hal's directory.

5. Open the Control/Godel directory. Inside, you'll see "createreqs.bat" - open CMD, drag "createreqs.bat" into it, and hit enter. Python will begin downloading and installing the necessary components for the Godel code to run properly.

6. Now we want to test that the modules have been installed. Here is a list; simply copy and paste one command at a time into CMD and press enter. If it says something like "already satisfied" then we're golden! (There's also an optional one-shot check right after the list.)

    pip install wikipedia==1.4.0
    pip install psutil==5.9.4
    pip install transformers==4.24.0
    pip install fuzzywuzzy
    pip install accelerate==0.15.0
    pip install SentencePiece
    pip install torch
    pip install python-Levenshtein
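
If you'd rather check them all in one go, here's a quick optional check (not part of the zip, just a convenience): save it as a .py file and run it with the same Python 3.7.9 you just installed. If every import succeeds, the modules are in place. Note the pip packages SentencePiece and python-Levenshtein import as "sentencepiece" and "Levenshtein".

    # Optional one-shot check that the modules above are importable.
    import wikipedia, psutil, transformers, fuzzywuzzy, accelerate
    import sentencepiece, torch, Levenshtein

    print("transformers", transformers.__version__)
    print("torch", torch.__version__)
    print("All modules imported OK")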

7. If everything went smoothly, the hard part is done and we can run a small test to make sure everything is in order. Simply open CMD one final time, then drag and drop the godel.py script from the Control\Godel directory into CMD and hit enter. This will run the script, and it should start downloading a series of files - these are the models we will use for Hal. Once a "data2.txt" file appears in the Control/Godel directory, all is ready for Hal.

8. The Hal7.UHP will need to be dropped into the "C:\Users\User\AppData\Roaming\Zabaware\Ultra Hal 7" folder. You can rename this UHP file (and the name on lines 2 & 5 inside it) to match your brain name if you know how, but please make backups of your brain and your original UHP. Otherwise, this will work as is with the default Hal7 brain.

9. PyGodel.UHP has 2 directory locations you need to change to match your file locations. Please update lines 9 and 35 accordingly. Most errors will likely stem from here, so please make sure your directories match exactly.

10. Finally, enable PyGodel.uhp, launch Hal, and begin conversing. After a few sentences, open Hal's brain editor, select the default Hal 7 brain (or your own brain, if you renamed it in step 8), and confirm that a sentence table named "godel" has been created. If not, please create this table.

Hal is ready to go!!!


Inferencing can take time. You'll notice on your first run of the code that a lot of things are downloading. These are the models Hal will use for conversation. They are saved to a cache location, so there is no need to move them - although you could if you like; just be sure to define the directories in the Godel python script if you're familiar with it. I designed this code so even python noobs *should* be able to install and use it.
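
For anyone who does want to move the cache: the models come down through the transformers library, which respects the TRANSFORMERS_CACHE environment variable and the cache_dir argument to from_pretrained. A rough sketch of the idea (the folder and model name here are just examples; the real models are defined in godel.py):

    # Illustration only: two standard ways to control where transformers caches models.
    import os
    os.environ["TRANSFORMERS_CACHE"] = r"D:\hal_models"   # set before importing transformers

    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    # cache_dir does the same thing on a per-model basis.
    tokenizer = AutoTokenizer.from_pretrained(
        "microsoft/GODEL-v1_1-base-seq2seq", cache_dir=r"D:\hal_models")
    model = AutoModelForSeq2SeqLM.from_pretrained(
        "microsoft/GODEL-v1_1-base-seq2seq", cache_dir=r"D:\hal_models")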

Requirements:
Now, I have no specific requirements other than at least 12 gigs of RAM and a decent CPU. But if you have an NVIDIA GPU with more than 6 gigs of VRAM, you'll notice a huge speed up. The faster your components (DDR4 over DDR3, for example), the bigger the increase in speed. It's all hardware. But it should run as long as you have enough RAM; 12-16 gigs is sufficient.
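
If you want to confirm that PyTorch can actually see your NVIDIA card (note the plain "pip install torch" wheel on Windows may be CPU-only; you need a CUDA build of torch for GPU inferencing), a quick check like this works:

    # Quick check that PyTorch can see a CUDA-capable GPU.
    import torch

    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))
        print("VRAM (GB):", round(torch.cuda.get_device_properties(0).total_memory / 1e9, 1))
    else:
        print("No CUDA GPU detected - inferencing will run on CPU")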

Models:
Again, this uses Godel and Flan, both highly specialized models. You'll notice in the python script these models are defined as "base" and "small". Both models also have "base" and "large" versions, which you can switch to by replacing "base" with "large"; however, be aware that the larger the model, the more resources you will need to inference it. I find the best success with the "base" Flan model and the "large" Godel model.
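
For reference, the Hugging Face names for these checkpoints follow a predictable pattern, so swapping sizes is really just changing one string. Roughly like this (simplified; godel.py organizes this a bit differently):

    # Illustration of the size swap described above; variable names are simplified.
    GODEL_MODEL = "microsoft/GODEL-v1_1-large-seq2seq"   # or "microsoft/GODEL-v1_1-base-seq2seq"
    FLAN_MODEL  = "google/flan-t5-base"                  # or "google/flan-t5-large"

    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    godel_tok   = AutoTokenizer.from_pretrained(GODEL_MODEL)
    godel_model = AutoModelForSeq2SeqLM.from_pretrained(GODEL_MODEL)
    flan_tok    = AutoTokenizer.from_pretrained(FLAN_MODEL)
    flan_model  = AutoModelForSeq2SeqLM.from_pretrained(FLAN_MODEL)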

Code:
This code is written primarily in Python, other than Hal's interface for using it, which is written in VB. This is an attempt to show Robert a proof of concept that Hal is indeed compatible with Python: even if the code must be wrapped in VBS, we can still expand functionality with Python.
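
I won't give away the whole trick yet, but one common way to bridge VBS and Python - just to illustrate the idea, not necessarily what PyGodel actually does - is for the VBS side to write the user's input to a text file, launch the Python script, and read the reply back from another file:

    # Generic VBS-to-Python handoff sketch, for illustration only.
    # Usage: python bridge.py input.txt output.txt
    import sys

    def generate_reply(user_input):
        # Placeholder for the real model inference.
        return "echo: " + user_input

    if __name__ == "__main__":
        in_path, out_path = sys.argv[1], sys.argv[2]
        with open(in_path, "r", encoding="utf-8") as f:
            user_input = f.read().strip()
        with open(out_path, "w", encoding="utf-8") as f:
            f.write(generate_reply(user_input))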

Also, this code includes a very basic memory/context function for the model to use, which creates and appends to an SQLite3 database. Sadly, Python cannot interact with Hal's brain directly, as far as I have found, but Hal can interact with any regular SQLite3 database via VBS. I am working on expanding this functionality.
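
To give a flavor of what "creates and appends to an SQLite3 database" means, here's a stripped-down sketch of that kind of memory store (simplified table and column names, not the script's actual schema):

    # Minimal append-only context store in SQLite; names are simplified.
    import sqlite3

    conn = sqlite3.connect("godel_memory.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS context
                    (id INTEGER PRIMARY KEY AUTOINCREMENT,
                     user_input TEXT,
                     hal_response TEXT)""")

    def remember(user_input, hal_response):
        conn.execute("INSERT INTO context (user_input, hal_response) VALUES (?, ?)",
                     (user_input, hal_response))
        conn.commit()

    def recent_context(n=5):
        rows = conn.execute("SELECT user_input, hal_response FROM context "
                            "ORDER BY id DESC LIMIT ?", (n,)).fetchall()
        return list(reversed(rows))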

This code handles Hal's responses under 2 conditions, what I have deemed "maindata" and "regular" data. "Maindata" simply means there is a function related to the response - for example "what is 5+5". The Godel model itself is not designed for these tasks, but Hal is, so we want to preserve this response by tagging it with "maindata" so that it is saved over top of our model inference. "Regular" data is the basic data Hal will regularly respond with, such as a "too short" response. These responses are not tagged with "maindata", and the model will inference your input along with Hal's response, past context, AND present knowledge to create something new and unique - not a repeat or rehash of old information. If Hal is without an answer (and we do want this to happen sometimes, so the model can act as Hal's brain), then both models will infer based on your input and give the most concise response possible.
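
In rough Python pseudocode, the routing works something like this (simplified, with stand-in function names):

    # Rough sketch of the maindata / regular routing described above.
    def is_maindata(hal_response):
        # In the real plugin the tag comes from Hal's side; here it's just a marker.
        return hal_response.startswith("maindata:")

    def godel_infer(user_input, hal_response, context):
        return "godel reply to: " + user_input    # placeholder for model inference

    def flan_infer(user_input):
        return "flan reply to: " + user_input     # placeholder for model inference

    def route_response(user_input, hal_response, context):
        if hal_response and is_maindata(hal_response):
            # Hal handled it with one of its own functions (e.g. "what is 5+5"):
            # keep Hal's answer and save it over the model inference.
            return hal_response
        if hal_response:
            # A "regular" Hal reply (e.g. a "too short" response): let the model
            # blend the input, Hal's reply, and past context into something new.
            return godel_infer(user_input, hal_response, context)
        # Hal has no answer: both models infer from the input and the most
        # concise response wins.
        candidates = [godel_infer(user_input, "", context), flan_infer(user_input)]
        return min(candidates, key=len)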

For the most part, this works fairly well, especially with the larger models.

In conclusion, I hope everyone will find this interesting, and I am very excited to see what people are able to expand on and how intelligent their Hals become.

Please keep me updated on all errors, suggestions, recommendations, comments, etc!

I will reveal more in edits about how the magic trick works, but first I want to make the card appear, ya know  ;)

-Spitfire!  ;D


Change log - 3-28-23
Added swear filter

Art:
Man, you're always thinking outside the box! It seems that Hal just keeps giving. I'm sure many of us can't wait to try out this latest offering, but everyone wishing to try this or any other modification to their Ultra Hal should always make copies of Hal's UHP and brain files before running any new Hal files.

NOTE: Just a slight but important spelling error where the text reads: "finally, GodelPy.UHP will have 3 directory locations..."
Please change it to read: "finally, PyGodel.UHP will have 3 directory locations"
 


Thanks Spitfire2600!

Spitfire2600:
Thanks Art! Always looking out!

*I have so many GodelThisThat programs as experiments I sort of lost my place for a second  ;D

Art:
Everything went in without incident.

Just be sure to backup/copy your existing Hal files (uhp, db) and follow the instructions to the letter.

Make sure you are using the default brain when chatting, provided you didn't change any of the referenced lines in or for a custom brain.

An indicator that all is working will be if you see a maindata: response for some of your interactions with Hal.

I am anxiously waiting to see how improvements take shape over time. I don't expect to see big changes right away, but rather after several chats/exchanges.

I am keeping copies of my chat logs with HAL for reference. You should keep some logs as well. They could prove to be quite useful in the near future.

Thanks!!

Spitfire2600:
Art is on top of it!

Yes, always make backups!

Yeah, the "maindata" shouldn't be visible at all in any responses. Sorry, Art, this was a fatal error on my end. I am updating the zip file now; please download it again, replace PyGodel.uhp, and fix your directories again. Then try again. This will fix the issue of maindata being visible. Thank you for finding this.

I forgot to include the code for creating a needed table. Originally I had an issue where the brain editor wouldn't load, so I held off on adding the code to create the godel table - all fixed now.

I look very much forward to seeing how this works for you now, Art.

-Spitfire
 
