File: llama-server.default

Package: llama.cpp 8064+dfsg-2
# Run llama-server --help for the list of environment variables
# that can be set in this file

# For example, to override some of the defaults:

# Listen on all network interfaces
#LLAMA_ARG_HOST=0.0.0.0

# Download and use a single LLM (from Hugging Face)
#LLAMA_ARG_HF_REPO=allenai/OLMo-2-0425-1B-Instruct-GGUF

# Limit the context size of each session
#LLAMA_ARG_CTX_SIZE=10240

# Set an API key to restrict access
#LLAMA_API_KEY=

# Deployment defaults can also be overridden here
#SERVER_HOME=/var/lib/llama-server
#SERVER_CACHE=/var/cache/llama-server
#SERVER_NAME="llama.cpp server"
#SERVER_USER=_llama-server
#SERVER_GROUP=_llama-server