petrmikheev/yet_another_ai_chat

Repository files navigation

Yet another AI chat

A hybrid of AI assistant and AI roleplay, powered by locally running LLMs.

It probably has little practical use, since more powerful services exist, but I like the result and want to share it.

Features

  1. Role play. Chat with an AI that pretends to be a human. Several personality templates are included.
  2. Web search. To answer questions, the AI can use web search via the DuckDuckGo API, and the results are reasonably integrated with the role play.
  3. Memory system based on embedding similarity.
  4. LLMs run locally. No external services are used (except for DuckDuckGo search).
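The memory system retrieves past conversation snippets by embedding similarity. A minimal sketch of the retrieval idea (the snippets and hand-made 3-d embeddings below are purely illustrative; the project itself uses ChromaDB, which handles embedding storage and nearest-neighbor search for you):

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: dot product over the norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "memory": text snippets paired with made-up 3-d embeddings.
# A real setup would embed each snippet with an embedding model.
memory = {
    "The user likes hiking in the mountains": [0.9, 0.1, 0.0],
    "The user's cat is named Momo": [0.1, 0.8, 0.3],
}

def recall(query_embedding, top_k=1):
    """Return the top_k stored snippets most similar to the query embedding."""
    ranked = sorted(memory.items(),
                    key=lambda item: cosine_similarity(query_embedding, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]
```

At chat time, the current message is embedded, the most similar stored snippets are recalled, and they are prepended to the LLM prompt as context.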

Dependencies

  • Python3
  • ChromaDB (pip3 install chromadb)
  • Flask (pip3 install flask)
  • llama.cpp

Models

Two instances of the llama.cpp server run at the same time.
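The two server instances are reached over HTTP. A hedged sketch of building requests against them (the ports and the split of roles between the instances are assumptions for illustration; the llama.cpp server exposes a `/completion` endpoint taking a JSON body with `prompt` and `n_predict`, and an `/embedding` endpoint taking `content`):

```python
import json

# Assumed ports for the two local instances (llama.cpp's server defaults to 8080).
CHAT_SERVER = "http://127.0.0.1:8080"    # assumed: chat/completion model
EMBED_SERVER = "http://127.0.0.1:8081"   # assumed: embedding model

def build_completion_request(prompt, n_predict=128):
    """URL and JSON body for the llama.cpp /completion endpoint."""
    body = json.dumps({"prompt": prompt, "n_predict": n_predict})
    return CHAT_SERVER + "/completion", body

def build_embedding_request(text):
    """URL and JSON body for the llama.cpp /embedding endpoint."""
    body = json.dumps({"content": text})
    return EMBED_SERVER + "/embedding", body

# POSTing these (e.g. with urllib.request) returns JSON; the completion
# response carries the generated text in its "content" field.
url, body = build_completion_request("Hello!")
```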

Tested with the following models:

Running both models on a GPU requires about 20 GB of video memory. llama.cpp can also run on the CPU, but it is quite slow.
