I’ve finally released the AI chat software I’ve been working on. I’ll try to write a blog post about it at some point, but until then, you can find more information at the GitHub repo.

Sentient_Core is a local LLM AI application, so all of the text inference is done on your machine. Optionally, it can call out to koboldcpp through its API, which makes it possible to do the heavy compute on a different machine, or to use more up-to-date model formats, since rustformers’ llm project is still trying to merge a current version of ggml after falling months behind.
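To give a rough idea of what the koboldcpp offloading path looks like, here is a minimal sketch of a client for koboldcpp’s KoboldAI-compatible generate endpoint. This is not Sentient_Core’s actual code; the helper name, host, and parameter values are placeholders, and the payload fields shown are a small subset of what the API accepts.

```python
import json
import urllib.request


def build_payload(prompt: str, max_length: int = 80, temperature: float = 0.7) -> dict:
    """Build a request body for koboldcpp's /api/v1/generate endpoint.

    Only a few common sampling fields are shown; koboldcpp accepts more.
    """
    return {
        "prompt": prompt,
        "max_length": max_length,    # number of tokens to generate
        "temperature": temperature,  # sampling temperature
    }


def generate_remote(prompt: str, host: str = "http://localhost:5001") -> str:
    """Send a prompt to a koboldcpp server (possibly on another machine)
    and return the generated text. Hypothetical helper for illustration."""
    req = urllib.request.Request(
        f"{host}/api/v1/generate",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # koboldcpp responds with {"results": [{"text": "..."}]}
    return body["results"][0]["text"]
```

Pointing `host` at a beefier box on your LAN is what lets the chat app itself stay lightweight while the model runs elsewhere.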

If you have any questions or comments, feel free to ask away!