Checklist
1. I have searched related issues but cannot get the expected help.
2. The bug has not been fixed in the latest version.
3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
I am using LangGraph for tool calling with the llama-3.1 8B Instruct model.
The model's responses work well until the tool-calling step: the server returns an internal server error when the tool response is sent back to the LLM to produce the final output.
Reproduction
Steps to reproduce:
Run the code snippet below as an example.
```python
from langchain_openai import ChatOpenAI

# model_name and model_url are placeholders for the served model's name
# and the OpenAI-compatible endpoint.
model = ChatOpenAI(
    model=model_name,
    temperature=0,
    base_url=model_url,
    api_key="EMPTY",
    disable_streaming=True,
)

# Define tools
from langchain_core.tools import tool
from typing import Literal

@tool
def get_weather(city: Literal["nyc", "sf"]):
    """Use this to get weather information."""
    if city == "nyc":
        return "It might be cloudy in nyc"
    elif city == "sf":
        return "It's always sunny in sf"
    else:
        raise AssertionError("Unknown city")

tools = [get_weather]

# Define the graph
from langgraph.prebuilt import create_react_agent

graph = create_react_agent(model, tools=tools)

# Printing
def print_stream(stream):
    for s in stream:
        message = s["messages"][-1]
        if isinstance(message, tuple):
            print(message)
        else:
            message.pretty_print()

# Run:
inputs = {"messages": [("user", "what is the weather in sf")]}
print_stream(graph.stream(inputs, stream_mode="values"))
```
Output:

```
================================ Human Message =================================

what is the weather in sf
================================== Ai Message ==================================

<function=get_weather>{"city": "sf"}</function>
Tool Calls:
  get_weather (0)
 Call ID: 0
  Args:
    city: sf
================================= Tool Message =================================
Name: get_weather

It's always sunny in sf
```
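For reference, the failing round trip can likely be reproduced without LangGraph. Below is a minimal sketch, assuming the server exposes the standard OpenAI chat-completions API; `model_name` and `model_url` are the same placeholders as above, and the tool-call id `"0"` is taken from the output above. This replays the conversation up to and including the tool result, which is the request that triggers the server error in the run:

```python
from openai import OpenAI

client = OpenAI(base_url=model_url, api_key="EMPTY")

response = client.chat.completions.create(
    model=model_name,
    messages=[
        {"role": "user", "content": "what is the weather in sf"},
        {
            # The assistant turn carrying the tool call, as parsed by the server.
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "0",
                "type": "function",
                "function": {"name": "get_weather", "arguments": '{"city": "sf"}'},
            }],
        },
        # The tool result being sent back -- the step where the 500 occurs.
        {"role": "tool", "tool_call_id": "0", "content": "It's always sunny in sf"},
    ],
)
print(response.choices[0].message.content)
```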
Error traceback

Server log from September 30 19:41:33.390; the traceback is truncated in the captured log:

```
Traceback (most recent call last):
  File "/opt/py3/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/py3/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/opt/py3/lib/python3.10/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/opt/py3/lib/python3.10/site-packages/starlette/routing.py", line 76, in app
```