This repo serves as a template for how to deploy a LangGraph agent on Streamlit.
This repo contains a `main.py` file with a template chatbot implementation. To add your own chain, change the `load_chain` function in `main.py`.
Depending on the type of your chain, you may also need to adjust how inputs and outputs are handled later in the file.
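For reference, here is a minimal sketch of what a replacement `load_chain` could look like. It assumes the `langgraph` and `langchain-openai` packages are installed and uses a single-node graph over `MessagesState`; your own graph will likely have more nodes and a different state.

```python
# Illustrative sketch only: a single-node LangGraph graph wrapping ChatOpenAI.
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, MessagesState, StateGraph


def load_chain():
    """Build and compile a minimal chat graph."""
    llm = ChatOpenAI(temperature=0)

    def call_model(state: MessagesState):
        # Run the model over the accumulated message history and append its reply.
        return {"messages": [llm.invoke(state["messages"])]}

    graph = StateGraph(MessagesState)
    graph.add_node("model", call_model)
    graph.add_edge(START, "model")
    graph.add_edge("model", END)
    return graph.compile()
```

A graph like this is invoked with a dict of the form `{"messages": [...]}`, which is the shape the inputs and outputs in `main.py` would need to match.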
After installing dependencies, e.g. with:

```
$ pip install -r requirements.txt
```

you can run this project locally with the following command:

```
$ streamlit run main.py
```
This is easily deployable on the Streamlit platform.
Note that when setting up your Streamlit app, you should make sure to add `OPENAI_API_KEY` as a secret environment variable.
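When deployed on Streamlit, secrets are exposed through `st.secrets`; locally they can live in `.streamlit/secrets.toml`. Below is a sketch of how the app might pick the key up from secrets. The exact wiring is an assumption; the template may already handle this for you.

```python
# Illustrative sketch: copy the key from Streamlit secrets into the environment
# so the OpenAI-backed model can find it. Locally, the same value can be placed
# in .streamlit/secrets.toml or exported as an ordinary environment variable.
import os

import streamlit as st

if "OPENAI_API_KEY" in st.secrets:
    os.environ["OPENAI_API_KEY"] = st.secrets["OPENAI_API_KEY"]
```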
To quickly spot issues and improve the performance of your LangGraph projects, sign up for LangSmith. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph — read more about how to get started here.
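If you enable tracing, LangSmith is typically configured through environment variables, which can be added as Streamlit secrets alongside `OPENAI_API_KEY`. A sketch, assuming the standard `LANGCHAIN_*` variable names:

```python
# Illustrative only: enable LangSmith tracing via environment variables.
# In a deployed app these would normally be set as secrets, not hard-coded.
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "streamlit-langgraph-demo"  # optional; any project name
```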