
Local Mode Tutorial: Local deployment with the Visual Studio Code / JetBrains extensions

The steps are the same for both platforms.

  1. Click VS Code / JetBrains to download the extension.

  2. Enable local mode in the extension settings (no login required).

  3. Start the Ollama server (other OpenAI-compatible APIs are also supported) with the following commands, and keep the server running in the background:

    # Allow the extension to call the local server from any origin
    export OLLAMA_ORIGINS="*"
    # Start the Ollama server (keep it running in the background)
    ollama serve
    # Pull and run the CodeGeeX4 model (in a separate terminal)
    ollama run codegeex4
  4. Enter the API address and model name in the local mode settings (a verification sketch follows this list), then enjoy coding with CodeGeeX4!

    (Screenshot: local mode settings)
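
Before filling in step 4, it can help to confirm that the local server is reachable and to check the exact model name. The commands below are a minimal sketch assuming Ollama's default address (http://localhost:11434); they use Ollama's model-list route and its OpenAI-compatible chat route, with the codegeex4 model name from step 3. Adjust the address and model name if your setup differs.

    # List the locally available models; the server is reachable if this returns JSON
    curl http://localhost:11434/api/tags

    # Test the OpenAI-compatible chat endpoint; the base address and model name
    # used here are the values to enter in the local mode settings
    curl http://localhost:11434/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "codegeex4", "messages": [{"role": "user", "content": "hello"}]}'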