
How to solve ImportError: cannot import name 'BaseCache' from 'langchain'? Details of the error are below. #60

Open
PoisonousBromineChan opened this issue Nov 24, 2023 · 3 comments

Comments

@PoisonousBromineChan

PS C:\Users\test\Desktop\baize-chatbot-main\demo> python app.py decapoda-research/llama-7b-hf project-baize/baize-lora-7B
c:\users\test\anaconda3\lib\site-packages\bitsandbytes\cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
'NoneType' object has no attribute 'cadam32bit_grad_fp32'
c:\users\test\anaconda3\lib\site-packages\langchain\__init__.py:34: UserWarning: Importing Cohere from langchain root module is no longer supported. Please use langchain.llms.Cohere instead.
  warnings.warn(
c:\users\test\anaconda3\lib\site-packages\langchain\__init__.py:34: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
c:\users\test\anaconda3\lib\site-packages\langchain\__init__.py:34: UserWarning: Importing OpenAI from langchain root module is no longer supported. Please use langchain.llms.OpenAI instead.
  warnings.warn(
Traceback (most recent call last):
  File "C:\Users\test\Desktop\baize-chatbot-main\demo\app.py", line 9, in <module>
    from app_modules.overwrites import *
  File "C:\Users\test\Desktop\baize-chatbot-main\demo\app_modules\overwrites.py", line 4, in <module>
    from llama_index import Prompt
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\__init__.py", line 18, in <module>
    from llama_index.indices.common.struct_store.base import SQLDocumentContextBuilder
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\__init__.py", line 4, in <module>
    from llama_index.indices.keyword_table.base import GPTKeywordTableIndex
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\keyword_table\__init__.py", line 4, in <module>
    from llama_index.indices.keyword_table.base import GPTKeywordTableIndex
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\keyword_table\base.py", line 18, in <module>
    from llama_index.indices.base import BaseGPTIndex
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\base.py", line 8, in <module>
    from llama_index.indices.base_retriever import BaseRetriever
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\base_retriever.py", line 5, in <module>
    from llama_index.indices.query.schema import QueryBundle, QueryType
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\query\__init__.py", line 3, in <module>
    from llama_index.indices.query.response_synthesis import ResponseSynthesizer
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\query\response_synthesis.py", line 5, in <module>
    from llama_index.indices.postprocessor.types import BaseNodePostprocessor
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\postprocessor\__init__.py", line 4, in <module>
    from llama_index.indices.postprocessor.node import (
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\postprocessor\node.py", line 13, in <module>
    from llama_index.indices.response import get_response_builder
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\response\__init__.py", line 3, in <module>
    from llama_index.indices.response.accumulate import Accumulate
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\response\accumulate.py", line 5, in <module>
    from llama_index.indices.response.base_builder import BaseResponseBuilder
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\response\base_builder.py", line 14, in <module>
    from llama_index.indices.service_context import ServiceContext
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\service_context.py", line 7, in <module>
    from llama_index.indices.prompt_helper import PromptHelper
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\indices\prompt_helper.py", line 12, in <module>
    from llama_index.langchain_helpers.chain_wrapper import LLMPredictor
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\langchain_helpers\chain_wrapper.py", line 6, in <module>
    from llama_index.llm_predictor.base import (  # noqa: F401
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\llm_predictor\__init__.py", line 4, in <module>
    from llama_index.llm_predictor.base import LLMPredictor
  File "c:\users\test\anaconda3\lib\site-packages\llama_index\llm_predictor\base.py", line 11, in <module>
    from langchain import BaseCache, Cohere, LLMChain, OpenAI
ImportError: cannot import name 'BaseCache' from 'langchain' (c:\users\test\anaconda3\lib\site-packages\langchain\__init__.py)
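For context: the three UserWarnings above show that this langchain release only supports Cohere, LLMChain, and OpenAI through their submodules, and BaseCache is no longer importable from the langchain root at all, which is exactly what the installed llama_index still tries to do in llm_predictor\base.py. A minimal sketch of the per-module imports, assuming a late-2023 langchain layout; the langchain.schema location of BaseCache is an assumption and is not shown in the log:

# Sketch: import the same names via the submodule paths the warnings point to,
# instead of the removed root-level `from langchain import ...` form.
from langchain.llms import Cohere, OpenAI    # was: from langchain import Cohere, OpenAI
from langchain.chains import LLMChain        # was: from langchain import LLMChain
from langchain.schema import BaseCache       # assumed location; the root import is what fails above

print("all four names resolved")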

@PoisonousBromineChan
Author

I tried updating langchain, but it didn't help.

@PoisonousBromineChan
Author

Here is the link to the guide I was following; the error occurs at the python app.py … step.
https://www.bilibili.com/read/cv22929305/

@PoisonousBromineChan
Author

Hi, the bug has now been resolved; the cause was that the llama library (llama_index) version had not been updated.
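For anyone hitting the same traceback: upgrading llama_index (for example with pip install -U llama-index) replaces the old root-level langchain imports. A quick sanity check of the installed versions before and after upgrading, as a sketch; the PyPI package names llama-index and langchain are assumed:

# Print the installed versions of the two packages involved in the traceback.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("llama-index", "langchain"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")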
