Chat Ollama

This is a Nuxt 3 + Ollama web application, built as an example of using the Ollama JavaScript library.

Feature list:

  • Model management (list, download, delete)
  • Chat with models
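
Both features map onto a small set of calls in the Ollama JavaScript library. The snippet below is a minimal sketch of those calls, not code taken from this repository; the model name llama2 is only an example.

```ts
import ollama from 'ollama'

// Model management: list installed models, download (pull) one, delete one
const { models } = await ollama.list()
console.log(models.map((m) => m.name))

await ollama.pull({ model: 'llama2' })
// await ollama.delete({ model: 'llama2' })

// Chat with a model, streaming the reply as it is generated
const stream = await ollama.chat({
  model: 'llama2',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
  stream: true,
})
for await (const part of stream) {
  process.stdout.write(part.message.content)
}
```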

Setup

Make sure to install the dependencies:

```bash
# npm
npm install

# pnpm
pnpm install

# yarn
yarn install

# bun
bun install
```

Ollama Server

You will need an Ollama server running. You can run it locally by following Ollama's installation guide.

By default, the Ollama server runs on http://localhost:11434.
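
Before starting the app, you can check that the server is reachable. The following is a hedged sketch using the Ollama JavaScript client; the host shown is simply the default, so adjust it if your server runs elsewhere.

```ts
import { Ollama } from 'ollama'

// Point the client at your Ollama server (default host shown)
const client = new Ollama({ host: 'http://localhost:11434' })

try {
  // list() calls GET /api/tags and returns the locally installed models
  const { models } = await client.list()
  console.log(`Ollama is up; ${models.length} model(s) installed`)
} catch (err) {
  console.error('Could not reach the Ollama server:', err)
}
```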

Development Server

Start the development server on http://localhost:3000:

```bash
# npm
npm run dev

# pnpm
pnpm run dev

# yarn
yarn dev

# bun
bun run dev
```
