lf chat

Chat with your project’s chat endpoint, either interactively or with a single one-off prompt. By default, responses include the RAG context defined in llamafarm.yaml.

Synopsis

lf chat                                         # Interactive TUI using project from llamafarm.yaml
lf chat [namespace/project]                     # Interactive TUI for explicit project
lf chat [namespace/project] "message" [flags]   # One-off request

If you omit namespace/project, the CLI resolves them from llamafarm.yaml.

Useful Flags

Flag                     Description
--model                  Select a specific model from your multi-model configuration.
--file, -f               Read prompt content from a file.
--no-rag                 Skip retrieval; make a direct LLM call.
--database               Target a specific RAG database.
--retrieval-strategy     Override the retrieval strategy.
--rag-top-k              Adjust the number of retrieved results (default 5).
--rag-score-threshold    Minimum similarity score for retrieved results.
--curl                   Print the sanitized curl request instead of executing it.
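The retrieval flags can be combined to tune how much context is pulled in per request. A quick sketch using the flags above; the values and query are illustrative:

# Return up to 10 results, keeping only strong matches
lf chat --rag-top-k 10 --rag-score-threshold 0.7 "Which letters mention biologics?"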

Behaviour

  • Automatically starts the server if needed.
  • Filters client/error messages from the transcript before sending.
  • Streams responses; exit code is non-zero if the API returns an error (see the scripting example after this list).
  • Redacts authorization headers when using --curl.
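Because the exit code reflects API errors, a one-off request can gate a script step. A minimal sketch; the prompt and fallback message are placeholders:

# Fail fast in a script when the chat request errors
lf chat --no-rag "healthcheck" || echo "chat request failed" >&2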

Examples

# Interactive project chat (auto-detect project)
lf chat

# Basic one-off chat (RAG enabled)
lf chat "Summarize the latest FDA letters."

# Use a specific model
lf chat --model powerful "Complex reasoning question"

# Explicit project with file input
lf chat company/legal -f prompt.txt

# Pure LLM request with curl preview
lf chat --no-rag --curl "Explain RAG in 2 sentences"

# Override strategy for targeted retrieval
lf chat --database main_db --retrieval-strategy hybrid_search "Find biologics references"

# Combine model selection with RAG
lf chat --model lemon --database main_db "Query with specific model and database"

Sessions

  • Set LLAMAFARM_SESSION_ID=abc123 to keep context between calls (see the example after this list).
  • lf start manages its own session history in .llamafarm/projects/.../dev/context.
  • Delete session files to reset state, or switch to a new namespace/project for isolation.
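
A minimal sketch of reusing one session across one-off calls, assuming a shell environment; the session ID is arbitrary:

# Pin a session ID so follow-up prompts keep the earlier context
export LLAMAFARM_SESSION_ID=abc123
lf chat "Summarize the latest FDA letters."
lf chat "Which of those mention biologics?"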

See Also