
Streamlit Ollama Chat

This project uses Ollama, an open-source tool for running large language models (LLMs) locally, to query open-source LLMs through a Streamlit chat interface.

Installation

  1. Clone the repository:

    git clone https://github.com/nebulaa/Streamlit_Ollama_Chat.git
  2. Navigate to the project directory:

    cd Streamlit_Ollama_Chat
  3. Install the required dependencies:

    pip install -r requirements.txt

Usage

  1. Download and run the Ollama application, then pull the model. Ensure that Ollama is running at http://localhost:11434/.

    ollama pull llama2-uncensored
  2. Run the Streamlit app:

    streamlit run chat.py
  3. Open your web browser and visit http://localhost:8501 to access the Ollama chat interface.

  4. Start interacting with the LLM by entering your queries and receiving responses in real time.

    (Screenshot: the Ollama chat interface in Streamlit)
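The steps above assume chat.py sends each query to Ollama's local HTTP API and renders the reply in Streamlit. The app's source is not reproduced here, so the following is only a minimal sketch of that pattern using the standard library; the function names are illustrative, and the non-streaming call to Ollama's /api/generate endpoint is an assumption about how the app is wired up:

```python
import json
import urllib.request

# Default Ollama endpoint, matching the URL in the usage steps above.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ollama_is_running(base_url: str = "http://localhost:11434") -> bool:
    # Quick reachability check before the app starts sending prompts.
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False


def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def query_ollama(model: str, prompt: str) -> str:
    # Sends the prompt and returns the model's full response text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

In the Streamlit app itself, a call like `query_ollama("llama2-uncensored", user_input)` would sit behind a chat input widget, with the returned text written back into the conversation.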
