Guidance Agent Eval

This is a fork of https://github.com/QuangBK/localLLM_guidance/ with added models and example agents.

StandardPrompt is a blank guidance prompt so you can design your own agent. Once you are satisfied, create an agent file in the 'server' folder; a good starting point is to copy the UniversalMarkdown agent and add your prompt.

Generally speaking, the input variable is "query" (the input box) and the output variable is "resolver" (the output box), provided your agent defines a guidance "resolver" variable.
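For illustration, here is a minimal sketch of how "query" and "resolver" fit together in a guidance template. The prompt text and parameter values are made up, and it assumes guidance.llm has already been pointed at a local model; the real agent files in the 'server' folder may wrap this differently.

```python
import guidance

# Assumes a model has already been assigned, e.g.:
# guidance.llm = guidance.llms.Transformers(model, tokenizer)

# "query" is filled from the input box; {{gen "resolver"}} produces the output box text.
my_agent = guidance("""Question: {{query}}
Answer: {{gen "resolver" max_tokens=256}}""")

out = my_agent(query="What does a guidance template do?")
print(out["resolver"])
```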

How to run

  • Download the models. For now, their paths are hard coded in ./app.py, so you will need to change them to your local paths.
  • Set your models' home directory in app.py, e.g. MODEL_DIRECTORY = "/home/shazam" (see the sketch after this list).
  • Run the server.
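For reference, the kind of edit app.py expects looks roughly like this. This is only a sketch: the per-model folder name below is a hypothetical placeholder, and the actual entries hard coded in ./app.py will differ.

```python
import os

# Your models' home directory (at minimum, change this to your own path).
MODEL_DIRECTORY = "/home/shazam"

# Hypothetical example of a per-model path derived from MODEL_DIRECTORY;
# the real model names hard coded in ./app.py will be different.
EXAMPLE_MODEL_PATH = os.path.join(MODEL_DIRECTORY, "wizard-mega-13B-GPTQ")
```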

Optionally: install GPTQ-for-LLaMa by following the Oobabooga instructions at https://github.com/oobabooga/text-generation-webui/blob/main/docs/GPTQ-models-(4-bit-mode).md. At the time of writing they use a forked version of GPTQ-for-LLaMa, so pay special attention to those instructions.

Run

python3 app.py

Go to http://localhost:7860/

Example agents

"StandardPrompt", "COTpromptBuilder", "COTpromptBuilder2PromptResponse", "AIDecisionMakerSimulator", "SearchToolAgentPOC", "AgentGuidanceSmartGPT", "ChatGPTAgentGuidance", "AgentGuidanceFlowGPT", "UniversalAnythingToJSON", "UniversalAnythingToMarkdown"]

  • StandardPrompt is a blank guidance prompt so you can design your own agent.
  • COTpromptBuilder is based on the video "Connect multiple ChatGPT sessions w/ dynamic ChatGPT prompts": https://www.youtube.com/watch?v=8PbpFxPibJM
  • COTpromptBuilder2PromptResponse is the same as the above, but the resolver variable holds the result.
  • AIDecisionMakerSimulator is an experimental simple agent that uses a decision tree to make a decision. Based on Henky!! from KoboldAI and crew.
  • SearchToolAgentPOC is an experimental agent that uses a search tool to find the answer. NOTE: GoogleSerper is disabled; SearX is used instead and must be installed (I use the docker version). See https://python.langchain.com/en/latest/reference/modules/searx_search.html?highlight=searx and the sketch after this list.
  • AgentGuidanceSmartGPT is based on another YouTube video by code4AI: https://www.youtube.com/@code4AI
  • ChatGPTAgentGuidance is simply an example of using ChatML with guidance.
  • AgentGuidanceFlowGPT is an attempt to use FlowGPT Proteus.
  • UniversalAnythingToJSON converts anything (!) to JSON
  • UniversalAnythingToMarkdown converts anything (including JSON) to Markdown
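For SearchToolAgentPOC, the SearX dependency amounts to a small langchain wrapper along these lines. This is a sketch: the host URL is an assumption (point it at your own docker SearX instance), and the tool code in the repository may differ.

```python
from langchain.utilities import SearxSearchWrapper

# Placeholder host; use whatever address your docker SearX instance listens on.
search = SearxSearchWrapper(searx_host="http://127.0.0.1:8888")

def search_tool(query: str) -> str:
    """Return a short text answer the agent can splice back into its prompt."""
    return search.run(query)
```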

Original readme:

Make a simple agent with Guidance and local LLMs

Guidance is a tool for controlling LLMs. It provides a good framework for building prompt templates. This repository shows you how to make an agent with Guidance, and you can combine it with various LLMs from Hugging Face. See my Medium article for more explanation.

UPDATE: Added gradio UI.

Install

Python packages:

Note: we only use langchain to build the GoogleSerper tool; the agent itself is built only with Guidance. Feel free to change/add/modify the tools to suit your goal. The GPTQ-for-LLaMa I used is oobabooga's fork. You can install it with this command.
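As a sketch of that single langchain dependency, the GoogleSerper tool can be as small as the snippet below. The API key value is a placeholder, and the repository's own tool code may differ.

```python
from langchain.utilities import GoogleSerperAPIWrapper

# Placeholder key; a free key is available from serper.dev.
serper = GoogleSerperAPIWrapper(serper_api_key="YOUR_SERPER_API_KEY")

def google_search(query: str) -> str:
    """Return the top search result text for the agent prompt."""
    return serper.run(query)
```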

Run

There are two options: run a Gradio server with a UI, or run the notebook file.

Gradio server

Please modify SERPER_API_KEY, MODEL_PATH, and CHECKPOINT_PATH in app.py and run:

gradio app.py
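The three values look roughly like this; the paths and key below are placeholders, not the repository defaults.

```python
SERPER_API_KEY = "YOUR_SERPER_API_KEY"  # free key from serper.dev
MODEL_PATH = "/path/to/wizard-mega-13B-GPTQ"  # model folder (tokenizer, config)
CHECKPOINT_PATH = "/path/to/wizard-mega-13B-GPTQ/4bit-128g.safetensors"  # GPTQ checkpoint file
```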

Notebook

Please check the notebook file. You need a free Serper API key and an LLM model to run it. I use the wizard-mega-13B-GPTQ model, but feel free to try others.
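As a minimal sketch of pointing guidance at a local Hugging Face model for the notebook: the model id below is a placeholder, and the notebook itself loads the GPTQ checkpoint through GPTQ-for-LLaMa rather than plain transformers.

```python
import guidance
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/or/hub-id-of-your-model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Hand the loaded model to guidance so {{gen ...}} calls use it.
guidance.llm = guidance.llms.Transformers(model=model, tokenizer=tokenizer)
```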

Example

(example screenshot)

TODO https://github.com/hwchase17/langchain/tree/9231143f91863ffbe0542bc69a90b723a40e165d/langchain/experimental/plan_and_execute
