NOTE: This GitHub Action is deprecated. It was born out of a lack of alternatives, but since Pydantic now maintains an Ollama GitHub Action, this one is no longer needed. Check out the Pydantic alternative instead.

Ollama GitHub Action


A GitHub Action to easily install and run Ollama models in your workflow. Supports Linux, macOS, and Windows runners.

Features

  • 🚀 Cross-platform support (Linux, macOS, Windows)
  • 🔄 Automatic installation and setup
  • 🎯 Run specific models
  • ⚡ Fast model pulling and execution

Compatibility Matrix

| Runner OS | Architecture | Status |
| --- | --- | --- |
| Ubuntu 20.04+ | x86_64 | ✅ Fully Supported |
| macOS 11+ | x86_64, ARM64 | ✅ Fully Supported |
| Windows Server 2019+ | x86_64 | ✅ Fully Supported |

Usage

Example

- name: Serve Ollama Model
  uses: phil65/ollama-github-action@v1
  with:
    model: "smollm2:135m"

Inputs

| Input | Description | Required | Default |
| --- | --- | --- | --- |
| model | Ollama model to use (e.g., llama2, codellama, mistral) | Yes | smollm2:135m |

Outputs

| Output | Description |
| --- | --- |
| server-url | URL of the running Ollama server (http://localhost:11434) |
| status | Status of the Ollama server (running/failed) |

Platform-Specific Notes

Linux

  • Runs natively using the official installer

macOS

  • Installs via Homebrew
  • Supports both Intel and Apple Silicon

Windows

  • Uses the latest release from GitHub
  • Custom installation path at C:\ollama
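
For reference, the installation the action performs is roughly equivalent to the following commands (a sketch; the action's actual scripts add error handling and path setup):

# Linux: run the official installer script
curl -fsSL https://ollama.com/install.sh | sh

# macOS: install via Homebrew (covers Intel and Apple Silicon)
brew install ollama

# Windows: the action downloads the latest release from GitHub
# and unpacks it to C:\ollama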

Full workflow example

jobs:
  serve-model:
    runs-on: ubuntu-latest
    steps:
      - name: Start Ollama Server
        id: ollama  # Required to reference outputs
        uses: phil65/ollama-github-action@v1
        with:
          model: "smollm2:135m"

      # Example: Use the Ollama server in subsequent steps
      - name: Use Ollama
        run: |
          echo "Server URL: ${{ steps.ollama.outputs.server-url }}"
          echo "Server Status: ${{ steps.ollama.outputs.status }}"

          # Example API call
          curl "${{ steps.ollama.outputs.server-url }}/api/generate" \
            -d '{
              "model": "smollm2:135m",
              "prompt": "What is GitHub Actions?"
            }'
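
If a later step calls the API immediately after startup, it can help to wait until the server responds. A minimal sketch using curl (preinstalled on GitHub-hosted runners):

- name: Wait for Ollama
  run: |
    for i in $(seq 1 30); do
      curl -fsS "${{ steps.ollama.outputs.server-url }}/api/tags" >/dev/null && break
      sleep 1
    done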

Server Lifecycle

The Ollama server will:

  1. Start automatically when the action runs
  2. Remain running for subsequent workflow steps
  3. Be automatically cleaned up when the job completes

Note: If you need to stop the server manually in your workflow, you can use:

- name: Stop Ollama Server
  if: always()  # Ensures cleanup even if previous steps fail
  run: |
    pkill ollama || true
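
Note that pkill is only available on Linux and macOS runners. On Windows, an equivalent cleanup step might look like this (a sketch using PowerShell's Stop-Process):

- name: Stop Ollama Server (Windows)
  if: always() && runner.os == 'Windows'
  shell: pwsh
  run: Stop-Process -Name ollama -ErrorAction SilentlyContinue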

Environment Variables

The following environment variables are available during the workflow:

  • OLLAMA_HOST: localhost
  • OLLAMA_PORT: 11434
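
These can be used to build the server URL without referencing the step outputs, for example:

- name: Ping Ollama via environment variables
  run: curl -fsS "http://${OLLAMA_HOST}:${OLLAMA_PORT}/api/tags"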

API Examples

Generate Text

- name: Generate Text
  run: |
    curl "${{ steps.ollama.outputs.server-url }}/api/generate" \
      -d '{
        "model": "smollm2:135m",
        "prompt": "Write a hello world program",
        "stream": false
      }'
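
With "stream": false the server returns a single JSON object whose response field contains the generated text, so it can be extracted with jq (preinstalled on GitHub-hosted runners):

- name: Generate and Extract Text
  run: |
    curl -s "${{ steps.ollama.outputs.server-url }}/api/generate" \
      -d '{
        "model": "smollm2:135m",
        "prompt": "Write a hello world program",
        "stream": false
      }' | jq -r '.response'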

List Models

- name: List Models
  run: |
    curl "${{ steps.ollama.outputs.server-url }}/api/tags"

Troubleshooting

Common Issues

Memory Issues

  • Use a runner with more RAM
  • Try a smaller model
  • Close unnecessary processes

Debug Logs

Enable step debug logging by setting the ACTIONS_STEP_DEBUG repository secret or variable:

ACTIONS_STEP_DEBUG: true

Security Considerations

  • The server is accessible only on localhost
  • Model files are stored in the runner's temporary space
  • Cleanup is automatic after workflow completion
  • No sensitive data is persisted between runs

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

