Zep: A long-term memory store for LLM applications

Zep stores, summarizes, embeds, indexes, and enriches LLM app / chatbot histories, and exposes them via simple, low-latency APIs. Zep allows developers to focus on developing their AI apps, rather than on building memory persistence, search, and enrichment infrastructure.

Zep's Extractor model is easily extensible, with a simple, clean interface available to build new enrichment functionality, such as summarizers, entity extractors, embedders, and more.
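Zep's actual extractors are implemented in Go inside the server; purely as an illustrative sketch of the pattern described above (every name below is hypothetical, not Zep's real API), an extractor interface of this shape might look like:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


@dataclass
class MessageEvent:
    """A new chat message plus a metadata dict that extractors can enrich."""
    content: str
    metadata: dict = field(default_factory=dict)


class Extractor(ABC):
    """Hypothetical extractor interface: each extractor enriches a message event."""

    @abstractmethod
    def extract(self, event: MessageEvent) -> None: ...


class TokenCountExtractor(Extractor):
    """Toy enrichment: store a naive whitespace token count in the metadata."""

    def extract(self, event: MessageEvent) -> None:
        event.metadata["token_count"] = len(event.content.split())


event = MessageEvent(content="who was the first man to go to space?")
TokenCountExtractor().extract(event)
print(event.metadata["token_count"])
```

New enrichment functionality (a summarizer, an entity extractor, an embedder) would then be another implementation of the same interface, which is the extensibility point the paragraph above describes.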

Key Features:

  • Fast! Zep's async extractors operate independently of your chat loop, ensuring a snappy user experience.
  • Memory operations like addMemory(), getMemory(), searchMemory(), deleteMemory() with soft & hard deletes
  • Long-term memory persistence, with access to historical messages irrespective of your summarization strategy.
  • Auto-summarization of memory messages based on a configurable message window. A series of summaries are stored, providing flexibility for future summarization strategies.
  • Hybrid vector search over memories and metadata, with messages automatically embedded on creation.
  • An Entity Extractor that identifies named entities and intents in messages and stores them in the message metadata.
  • Auto-token counting of memories and summaries, allowing finer-grained control over prompt assembly.
  • Support for OpenAI & Azure OpenAI Service models.
  • Python and JavaScript SDKs.
  • Langchain memory and retriever support.
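The auto-token counting above is what enables budget-based prompt assembly. As a minimal sketch (plain Python, not Zep's API; the token counts are illustrative), keeping the most recent messages that fit within a token budget might look like:

```python
def fit_to_budget(messages: list[tuple[str, int]], budget: int) -> list[str]:
    """Keep the most recent messages whose token counts fit within budget.

    messages: (text, token_count) pairs, oldest first.
    Walks backwards from the newest message, then restores chronological order.
    """
    kept, used = [], 0
    for text, tokens in reversed(messages):
        if used + tokens > budget:
            break
        kept.append(text)
        used += tokens
    return list(reversed(kept))


history = [
    ("hello", 1),
    ("tell me about Iceland", 5),
    ("sure, here is a summary...", 7),
]
print(fit_to_budget(history, budget=12))
```

Because Zep stores a token count per message and summary, a caller can do this kind of trimming without re-tokenizing on every request.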

Quick Start

Deploy to Render

Read the docs: https://getzep.github.io

  1. Clone this repo:
git clone https://github.com/getzep/zep.git
  2. Add your OpenAI API key to a .env file in the root of the repo:
ZEP_OPENAI_API_KEY=<your key here>
  3. Start the Zep server:
docker-compose up

This will start a Zep server on port 8000 and a Postgres database on port 5432.

  4. Access Zep via the Python or JavaScript SDKs:

Python

from zep_python import Memory, Message, ZepClient

base_url = "http://localhost:8000"  # the Zep server started above
session_id = "your-session-id"      # an identifier for your chat session

async with ZepClient(base_url) as client:
    role = "user"
    content = "who was the first man to go to space?"
    message = Message(role=role, content=content)
    memory = Memory()
    memory.messages = [message]
    # Add a memory
    result = await client.aadd_memory(session_id, memory)

See zep-python for installation and usage docs.

Javascript

 // Assumes a configured ZepClient instance (see the zep-js docs for setup)
 const role = "user";
 const content = "I'm looking to plan a trip to Iceland. Can you help me?";
 const message = new Message({ role, content });
 const memory = new Memory();
 memory.messages = [message];

 // Add memory
 const result = await client.addMemoryAsync(session_id, memory);
...

Zep Documentation

Server installation and SDK usage documentation is available here: https://getzep.github.io

Acknowledgements

h/t to the Motorhead and Langchain projects for inspiration.
