
Commit

Fix a ChatGPT edge case where the token count stays over the limit: if the context is still too long on the second attempt, clear it entirely
Ikaros-521 committed Nov 7, 2023
1 parent f1019bc commit 9561b21
Showing 1 changed file with 2 additions and 3 deletions.
5 changes: 2 additions & 3 deletions utils/gpt_model/chatgpt.py
@@ -55,9 +55,8 @@ def chat(self, msg, sessionid):
     message = self.chat(msg, sessionid)
     # Check again and, if the context is still over the limit, clean up again
     if message.__contains__("This model's maximum context length is 4096 token"):
-        # Delete
-        del session['msg'][2:3]
-        del session['msg'][len(session['msg']) - 1:len(session['msg'])]
+        # Clear the whole context
+        session['msg'] = []
 
     # Append the reply message returned by ChatGPT to the session
     session['msg'].append({"role": "assistant", "content": message})
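
For context, below is a minimal sketch of the retry-and-clear flow this commit implies. The helper name chat_with_overflow_fallback, the chat_fn parameter, and the first-pass trimming step are illustrative assumptions; only the "clear the whole context on a second overflow" behavior comes from the diff above.

# Hypothetical sketch; not the actual chatgpt.py implementation.
OVERFLOW_MARKER = "This model's maximum context length is 4096 token"

def chat_with_overflow_fallback(chat_fn, session, msg):
    """Call chat_fn; on overflow trim old turns, on a second overflow clear everything."""
    session['msg'].append({"role": "user", "content": msg})
    message = chat_fn(session['msg'])

    if OVERFLOW_MARKER in message:
        # First pass (pre-existing behavior, assumed): drop a couple of the oldest turns.
        del session['msg'][1:3]
        message = chat_fn(session['msg'])

        if OVERFLOW_MARKER in message:
            # Second pass (this commit): still too long, so clear the whole context.
            session['msg'] = []
            message = chat_fn(session['msg'])

    session['msg'].append({"role": "assistant", "content": message})
    return message

Clearing the context sacrifices conversation history, but it guarantees the next request fits within the 4096-token window, which is the situation the commit message describes.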
