# nixllm

Nix wrapper for running LLMs behind an OpenAI-compatible API proxy.

> [!IMPORTANT]
> This project is not designed for use on NixOS: we do not use patchelf, since it caused issues with the CUDA runtime. Instead, it is intended to be used with the Nix package manager on other Linux distributions such as Ubuntu. On NixOS, you may use it together with nix-ld.
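For the nix-ld route, a minimal sketch of the relevant NixOS configuration (assuming a recent NixOS with the `programs.nix-ld` module; untested with this wrapper):

```nix
# Hypothetical NixOS module snippet: enables nix-ld so dynamically
# linked binaries produced outside NixOS conventions can find their
# loader and shared libraries.
{ config, pkgs, ... }:
{
  programs.nix-ld.enable = true;
}
```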

## Ollama Usage

```shell
CUDA_VISIBLE_DEVICES=0 OLLAMA_HOST=0.0.0.0:6060 nix run github:recap-utr/nixllm#ollama -- serve
# then, in another terminal, pull the models before performing API requests
OLLAMA_HOST=0.0.0.0:6060 nix run github:recap-utr/nixllm#ollama -- pull MODEL_NAME
```
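Once a model is pulled, the server can be queried like any OpenAI-compatible endpoint. A sketch using curl, assuming the server above is running on port 6060 and a hypothetical model named `llama3` has been pulled (substitute your MODEL_NAME):

```shell
# OpenAI-style chat completion request body; "llama3" is a placeholder.
BODY='{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'

# Ollama exposes its OpenAI-compatible API under /v1.
curl --silent http://localhost:6060/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$BODY" || echo "request failed (is the server running?)"
```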
