Local LLM (lllm)

A series of experiments with local LLMs

TODOs

Exp (Using Ollama)

  • Run Mistral 7B locally
  • Run Llama 2 locally
  • Run custom prompts with Llama 2
  • Run API calls against Llama/Mistral
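After pulling a model with `ollama pull mistral` (or `ollama pull llama2`) and starting the server, API calls can be made against Ollama's local HTTP endpoint. A minimal sketch using only the standard library, assuming Ollama is running on its default port 11434:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    """Assemble the JSON body for Ollama's /api/generate endpoint."""
    # stream=False returns one complete JSON object instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the response text."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model already pulled
    print(generate("mistral", "Why is the sky blue?"))
```

Swapping `"mistral"` for `"llama2"` targets the other model without any other change.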

Exp (Get Twinny to work in VSCode)

  • Done with caveats

Exp (Using Ollama for a Chat UI)

  • Create a chat UI similar to this
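The core of a chat UI over Ollama is a loop that keeps the message history and sends it to the `/api/chat` endpoint on each turn. A terminal-only sketch, assuming a local Ollama server with `llama2` pulled (the model name is just a placeholder):

```python
import json
import urllib.request

CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint

def build_chat_body(history: list, user_text: str, model: str = "llama2") -> dict:
    """Assemble the JSON body for Ollama's /api/chat endpoint."""
    messages = history + [{"role": "user", "content": user_text}]
    return {"model": model, "messages": messages, "stream": False}

def chat_turn(history: list, user_text: str, model: str = "llama2") -> list:
    """Send one user turn to a local Ollama server; return history plus the reply."""
    request_body = build_chat_body(history, user_text, model)
    data = json.dumps(request_body).encode("utf-8")
    req = urllib.request.Request(
        CHAT_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Reply has the shape {"role": "assistant", "content": "..."}
        reply = json.loads(resp.read())["message"]
    return request_body["messages"] + [reply]

if __name__ == "__main__":
    history: list = []
    while True:
        text = input("you> ")
        if not text:
            break
        history = chat_turn(history, text)
        print("bot>", history[-1]["content"])
```

A web UI would wrap the same `chat_turn` logic behind a form or websocket instead of `input()`.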

Exp (Create your own version of an open source LLM)

  • Custom system prompts + temperature settings
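Ollama builds custom model variants from a Modelfile, which can bake in a system prompt and parameters such as temperature. A sketch (the base model, temperature value, and system prompt are illustrative choices):

```
# Modelfile: derive a variant from a base model pulled via `ollama pull mistral`
FROM mistral

# Higher temperature -> more varied output (Ollama's default is 0.8)
PARAMETER temperature 1.0

# System prompt baked into the custom model
SYSTEM You are a concise teaching assistant for a graduate systems course.
```

Build and run it with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`.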

Exp (Train Mistral on your own data: fine-tuning or RAG)
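Of the two options, RAG avoids training entirely: retrieve the passages from your own data that best match the question, then prepend them to the prompt sent to the model. A toy sketch of the retrieval step, using word overlap as a stand-in for the embedding similarity a real pipeline would use:

```python
from collections import Counter

def score(query: str, doc: str) -> int:
    """Crude relevance score: number of shared lowercase word tokens."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Return the k documents most relevant to the query."""
    return sorted(docs, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, docs: list, k: int = 2) -> str:
    """Prepend retrieved context so the model answers from your own data."""
    context = "\n".join(retrieve(query, docs, k))
    return (
        "Use the following context to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )
```

The resulting prompt string would then be sent to Mistral through Ollama exactly as in the API-call experiment above; swapping `score` for cosine similarity over embeddings is the usual production upgrade.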

About

vu-cs5891-ollama-Ollama-tutorial created by GitHub Classroom