Change dependency to mem0ai (#1476)
Dev-Khant authored Jul 18, 2024
1 parent c9240e7 commit 7441f14
Showing 7 changed files with 274 additions and 309 deletions.
1 change: 0 additions & 1 deletion embedchain/docs/api-reference/advanced/configuration.mdx
@@ -250,7 +250,6 @@ Alright, let's dive into what each key means in the yaml config above:
- `similarity_threshold` (Float): The threshold for similarity evaluation. Defaults to `0.8`.
- `auto_flush` (Integer): The number of queries after which the cache is flushed. Defaults to `20`.
7. `memory` Section: (Optional)
- - `api_key` (String): The API key of mem0.
- `top_k` (Integer): The number of top-k results to return. Defaults to `10`.
<Note>
If you provide a cache section, the app will automatically configure and use a cache to store the results of the language model. This is useful if you want to speed up the response time and save inference cost of your app.
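After this change the `memory` section carries only `top_k`; no Mem0 API key is configured. A minimal usage sketch in dict form, assuming `App.from_config` and the `session_id` argument to `chat` as used elsewhere in the embedchain docs (the data source and ids are illustrative):

```python
from embedchain import App

# `memory` section after this commit: only `top_k`, no `api_key`.
config = {
    "memory": {
        "top_k": 5  # how many stored memories to retrieve per chat turn
    }
}

app = App.from_config(config=config)
app.add("https://www.forbes.com/profile/elon-musk")  # illustrative source
print(app.chat("What is the net worth of Elon Musk?", session_id="user-1"))
```

With a `memory` section present, the app initializes a local `Memory()` from mem0ai at construction time, as the app.py hunk below shows.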
8 changes: 3 additions & 5 deletions embedchain/docs/api-reference/app/chat.mdx
@@ -149,17 +149,15 @@ app.chat("What is the net worth of Elon Musk?", config=query_config)

Mem0 is a cutting-edge long-term memory for LLMs to enable personalization for the GenAI stack. It enables LLMs to remember past interactions and provide more personalized responses.

- Follow these steps to use Mem0 to enable memory for personalization in your apps:
- - Install the [`mem0`](https://docs.mem0.ai/) package using `pip install memzero`.
- - Get the api_key from [Mem0 Platform](https://app.mem0.ai/).
- - Provide api_key in config under `memory`, refer [Configurations](docs/api-reference/advanced/configuration.mdx).
+ In order to use Mem0 to enable memory for personalization in your apps:
+ - Install the [`mem0`](https://docs.mem0.ai/) package using `pip install mem0ai`.
+ - Prepare config for `memory`, refer [Configurations](docs/api-reference/advanced/configuration.mdx).

```python with mem0
from embedchain import App

config = {
    "memory": {
-       "api_key": "m0-xxx",
        "top_k": 5
    }
}
6 changes: 3 additions & 3 deletions embedchain/embedchain/app.py
@@ -9,7 +9,7 @@
import yaml
from tqdm import tqdm

- from mem0 import Mem0
+ from mem0 import Memory
from embedchain.cache import (
Config,
ExactMatchEvaluation,
@@ -131,9 +131,9 @@ def __init__(
self._init_cache()

# If memory_config is provided, initializing the memory ...
- self.mem0_client = None
+ self.mem0_memory = None
if self.memory_config is not None:
- self.mem0_client = Mem0(api_key=self.memory_config.api_key)
+ self.mem0_memory = Memory()

# Send anonymous telemetry
self._telemetry_props = {"class": self.__class__.__name__}
10 changes: 5 additions & 5 deletions embedchain/embedchain/embedchain.py
@@ -53,7 +53,7 @@ def __init__(
self.config = config
self.cache_config = None
self.memory_config = None
- self.mem0_client = None
+ self.mem0_memory = None
# Llm
self.llm = llm
# Database has support for config assignment for backwards compatibility
@@ -598,8 +598,8 @@ def chat(
contexts_data_for_llm_query = contexts

memories = None
- if self.mem0_client:
- memories = self.mem0_client.search(
+ if self.mem0_memory:
+ memories = self.mem0_memory.search(
query=input_query, agent_id=self.config.id, session_id=session_id, limit=self.memory_config.top_k
)

@@ -641,8 +641,8 @@ def chat(
# Add to Mem0 memory if enabled
# TODO: Might need to prepend with some text like:
# "Remember user preferences from following user query: {input_query}"
- if self.mem0_client:
- self.mem0_client.add(data=input_query, agent_id=self.config.id, session_id=session_id)
+ if self.mem0_memory:
+ self.mem0_memory.add(data=input_query, agent_id=self.config.id, session_id=session_id)

# add conversation in memory
self.llm.add_history(self.config.id, input_query, answer, session_id=session_id)
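For reference, the mem0ai package is now used as a local memory store rather than a hosted, api_key-based client. A minimal sketch of the calls embedchain makes, with keyword arguments copied from the hunks above (default `Memory()` settings assumed; the query strings and ids are illustrative):

```python
from mem0 import Memory

# Local memory store from the mem0ai package (replaces the api_key-based Mem0 client).
memory = Memory()

# Persist a user query, scoped by agent and session as in chat() above.
memory.add(data="I prefer short, bulleted answers.", agent_id="app-1", session_id="session-1")

# Fetch the top-k memories relevant to a new query.
memories = memory.search(
    query="How should the next answer be formatted?",
    agent_id="app-1",
    session_id="session-1",
    limit=5,
)
print(memories)
```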