Overview

Presets in LibreChat allow you to save complete conversation configurations including model selection, parameters, system prompts, and tool settings. This enables quick access to frequently used configurations without manual setup each time.

What Are Presets?

A preset captures:
  • Model selection: Specific model and provider
  • Model parameters: Temperature, top_p, max tokens, etc.
  • System instructions: Custom prompts and behavior guidelines
  • Tools and plugins: Enabled tools like web search or code execution
  • Label and metadata: Custom display name and description
Think of presets as templates for different conversation styles or use cases.

Creating a Preset

1. Configure Your Conversation

Set up a conversation with:
  • Desired endpoint and model
  • Custom system prompt (if applicable)
  • Model parameters (temperature, max tokens, etc.)
  • Any tools or plugins

2. Open Preset Menu

Click the preset icon or menu in the conversation header.

3. Save as Preset

Choose Save as Preset:
  • Enter a descriptive name
  • Optionally add a description
  • Click Save
Give presets clear, descriptive names like “Creative Writing (GPT-4)” or “Code Review (Claude)” to easily identify them later.

Using Presets

Load a Preset

1. Start New Conversation

Click New Chat or the new conversation button.

2. Select Preset

Open the preset selector and choose your saved preset.

3. Start Chatting

The conversation inherits all preset settings automatically.

Modify Active Preset

You can adjust settings after loading a preset:
  • Changes apply only to the current conversation
  • Original preset remains unchanged
  • Save modified settings as a new preset if desired

Preset Configuration

Control preset availability in librechat.yaml:
# librechat.yaml
interface:
  presets: true  # Enable/disable presets

Example Preset Configurations

{
  "title": "Creative Writer",
  "endpoint": "openAI",
  "model": "gpt-4o",
  "modelLabel": "GPT-4o Creative",
  "promptPrefix": "You are a creative writing assistant. Help users craft compelling stories, characters, and dialogue.",
  "temperature": 0.9,
  "top_p": 0.95,
  "max_tokens": 4096,
  "presence_penalty": 0.6,
  "frequency_penalty": 0.3
}
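For contrast with the creative example above, a focused preset dials the sampling parameters down. The field names mirror the example above; the endpoint, model, and values here are purely illustrative:

```json
{
  "title": "Code Reviewer",
  "endpoint": "anthropic",
  "model": "claude-sonnet-4",
  "modelLabel": "Claude Code Review",
  "promptPrefix": "You are a meticulous code reviewer. Point out bugs, style issues, and security concerns, and suggest concrete fixes.",
  "temperature": 0.2,
  "top_p": 0.9,
  "max_tokens": 4096,
  "presence_penalty": 0.0,
  "frequency_penalty": 0.0
}
```

The low temperature and zero penalties keep the model deterministic and on-topic, which suits review and debugging tasks better than exploratory ones.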

Model Parameters Explained

temperature
Controls randomness in responses:
  • 0.0-0.3: Focused and deterministic (good for factual tasks)
  • 0.4-0.7: Balanced (general purpose)
  • 0.8-1.0: Creative and varied (good for creative writing)
"temperature": 0.7

top_p
Controls diversity via nucleus sampling (probability mass):
  • 0.9-1.0: More diverse responses
  • 0.5-0.8: More focused responses
  • Usually kept high (around 0.9) when adjusting temperature instead
"top_p": 0.9

max_tokens
Maximum response length:
  • Limited by the model's context window
  • Higher = longer responses (but higher cost)
  • Remember that conversation history also counts toward the token budget
"max_tokens": 4096

presence_penalty
Reduces repetition of topics:
  • 0.0: No penalty
  • 0.1-0.5: Mild reduction in repetition
  • 0.6-1.0: Strong push toward new topics
"presence_penalty": 0.3

frequency_penalty
Reduces repetition of exact phrases:
  • 0.0: No penalty
  • 0.1-0.5: Mild reduction
  • 0.6-1.0: Strong reduction
"frequency_penalty": 0.3
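These parameters tend to move in tandem. As an illustrative sketch (not recommended defaults), the first line below is a focused bundle for factual tasks and the second a creative bundle for open-ended writing:

```json
{ "temperature": 0.2, "top_p": 0.9,  "presence_penalty": 0.0, "frequency_penalty": 0.0 }
{ "temperature": 0.9, "top_p": 0.95, "presence_penalty": 0.6, "frequency_penalty": 0.3 }
```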

Preset Titles and Labels

Presets generate display titles based on configuration:
// Title generation
const title = `${preset.title}: ${model}${label ? ` (${label})` : ''}`

// Examples:
// "Creative Writer: gpt-4o"
// "Code Helper: claude-sonnet-4 (Fast)"
// "Research Pro: gemini-2.5-pro"

Sharing Presets

Currently, presets are user-specific and stored locally. Sharing presets between users requires manual export/import or admin configuration.

Preset vs. Agent

Presets are best for:
  • Quick model + parameter configurations
  • Different “modes” for the same model
  • Simple prompt templates
  • Fast switching between configurations
Preset limitations:
  • No persistent tools or capabilities
  • Cannot chain multiple steps
  • No file search or custom actions
Use Presets for quick parameter/prompt changes. Use Agents for tool-enabled workflows.

Configuration Reference

# librechat.yaml
interface:
  presets: true  # Enable presets
  
  # Control prompt management (related feature)
  prompts:
    use: true     # Allow using prompts
    create: true  # Allow creating prompts
    share: false  # Allow sharing prompts
    public: false # Allow public prompts

Best Practices

  • Name clearly: Use descriptive names that indicate purpose
  • Document parameters: Note why you chose specific values
  • Test iterations: Fine-tune parameters based on results
  • Organize by use case: Group related presets
  • Version control: Create new presets when experimenting rather than overwriting

Common Preset Patterns

Task-Specific Presets

- "Debug Code" (low temp, focused)
- "Brainstorm Ideas" (high temp, creative)
- "Summarize Docs" (medium temp, concise max_tokens)
- "Translate Text" (low temp, deterministic)

Model Comparison Presets

- "GPT-4o Standard"
- "Claude Sonnet Standard"
- "Gemini Pro Standard"

(Same parameters, different models for A/B testing)

Tone Presets

- "Professional" (formal promptPrefix)
- "Casual" (friendly promptPrefix)
- "Technical" (expert promptPrefix)
- "Simplified" (ELI5 promptPrefix)
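Tone presets typically differ only in promptPrefix while leaving the sampling parameters alone. A hypothetical pair (the exact wording is up to you):

```json
{ "title": "Professional", "promptPrefix": "You are a formal business assistant. Use precise, professional language and avoid slang." }
{ "title": "Simplified", "promptPrefix": "Explain everything in plain, simple terms, as if to someone completely new to the topic." }
```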

Troubleshooting

Preset won't load:
  • Verify the model is still available
  • Check the endpoint configuration in librechat.yaml
  • Ensure you have access to the selected model

Parameters aren't applied:
  • Some models don't support all parameters
  • Check the model's documentation for supported options
  • Verify parameter values are within valid ranges

Presets aren't saving:
  • Ensure interface.presets: true is set in your config
  • Check that browser local storage isn't full
  • Try a different browser