- cross-posted to:
- minimalgpt@infosec.pub
- aicompanions@lemmy.world
I wrote MinimalGPT in about a weekend as a minimal chat client where everything is stored client-side (the chat messages sent to the API aside, obviously).
Entire conversations are stored locally in your browser instead of in a database, etc.
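For anyone curious what client-side storage looks like in practice, here's a rough sketch of the idea using the browser's localStorage; the key name and data shapes are illustrative, not MinimalGPT's actual code.

```typescript
// Minimal sketch of browser-local conversation storage. The storage key and
// the Conversation shape are hypothetical, not MinimalGPT's real implementation.
interface Message {
  role: "user" | "assistant" | "system";
  content: string;
}

interface Conversation {
  id: string;
  title: string;
  messages: Message[];
}

const STORAGE_KEY = "minimalgpt-conversations"; // hypothetical key name

function loadConversations(): Conversation[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Conversation[]) : [];
}

function saveConversation(conversation: Conversation): void {
  const all = loadConversations();
  const index = all.findIndex((c) => c.id === conversation.id);
  if (index >= 0) {
    all[index] = conversation; // update an existing saved conversation
  } else {
    all.push(conversation); // first save of a new conversation
  }
  localStorage.setItem(STORAGE_KEY, JSON.stringify(all));
}
```

Nothing ever leaves the browser except the messages you send to the API, which is the whole point.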
It supports both GPT-3.5 and GPT-4, as well as basic DALL-E image generation. Possibly Bard integration in the future if anyone actually wants it.
The GitHub repo is available here
It’s nothing crazy, but as a simple chat client without any BS, it’s nice.
You have to provide your own API key, but they hand them out like candy, so have a blast!
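For reference, this is roughly what a client-side chat request with a user-supplied key looks like against the OpenAI Chat Completions endpoint; the model name and error handling are simplified, and this isn't lifted from the repo.

```typescript
// Rough sketch of a client-side chat completion call using a user-supplied
// OpenAI API key. Error handling is simplified and the model is just an example.
async function sendChat(
  apiKey: string,
  messages: { role: string; content: string }[]
): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The key stays in the browser and is only sent with the API request.
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model: "gpt-3.5-turbo", messages }),
  });
  if (!response.ok) {
    throw new Error(`OpenAI API error: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}
```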
Edit - Pushed out a small update that adds a toggle for auto-saving new conversations. If disabled, new conversations are only saved (locally) when you press the save icon.
After a conversation has been saved, it is automatically updated/saved every time you send a message from there on out.
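The save logic boils down to something like this sketch (the flag and storage key are hypothetical, following the localStorage sketch above):

```typescript
// Sketch of the auto-save behavior described above; the flag and key names
// are illustrative, not MinimalGPT's real code.
function shouldPersist(conversationId: string, autoSaveEnabled: boolean): boolean {
  const raw = localStorage.getItem("minimalgpt-conversations");
  const saved: { id: string }[] = raw ? JSON.parse(raw) : [];
  const alreadySaved = saved.some((c) => c.id === conversationId);
  // New conversations persist only if auto-save is on (or the user clicks
  // the save icon, handled elsewhere); already-saved ones always update.
  return alreadySaved || autoSaveEnabled;
}
```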
If anyone runs across this in the future and doesn’t want to revive a dead thread, feel free to message me!
So, is there any way to get it working as a front-end for a local LLaMA installation?
Not currently, unfortunately; I just started work on adding PaLM API support in some form.
I’ll have to look into LLaMA after that’s complete; I hadn’t even thought about that one yet, haha.