Last updated 2 years ago

AI Provider: llama.cpp

Quick Start Guide

Note: AI_MODEL should remain at its default value unless the model-prompts folder contains a subfolder specific to the model you're using. You can also create such a folder and add your own prompts.

Update your agent settings

  1. Make sure your model is placed in the models/ folder

  2. Create a new agent

  3. Set AI_PROVIDER to llamacpp.

  4. Set MODEL_PATH to the path of your llama.cpp model (for Docker containers, models/ is mapped to /model)
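Taken together, the steps above amount to a handful of agent settings. A minimal sketch, assuming the settings are stored as key-value pairs (the model filename below is hypothetical, and the exact settings location depends on your Agent-LLM installation):

```shell
# Agent settings for the llama.cpp provider (model filename is hypothetical)
AI_PROVIDER=llamacpp
AI_MODEL=default                       # keep default unless model-prompts has a matching subfolder
MODEL_PATH=models/ggml-model-q4_0.bin  # for Docker containers, models/ is mapped to /model
```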
