File: llama-server.default

Package: llama.cpp 8461+dfsg-1
# Run llama-server --help for the full list of environment variables
# that can be set in this file

# For example, to override some of the defaults:

# Listen on all network interfaces
#LLAMA_ARG_HOST=0.0.0.0

# Download and use a single LLM (from Hugging Face)
#LLAMA_ARG_HF_REPO=allenai/OLMo-2-0425-1B-Instruct-GGUF

# Limit the context size (in tokens) of each session
#LLAMA_ARG_CTX_SIZE=10240

# Set an API key to restrict access
#LLAMA_API_KEY=
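
# With a key set, clients must present it as a bearer token on each
# request. Illustrative example, assuming the default port 8080:
#   curl -H "Authorization: Bearer $LLAMA_API_KEY" http://localhost:8080/v1/models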

# Deployment defaults can also be overridden here
#SERVER_HOME=/var/lib/llama-server
#SERVER_CACHE=/var/cache/llama-server
#SERVER_NAME="llama.cpp server"
#SERVER_USER=_llama-server
#SERVER_GROUP=_llama-server