
trying to run forecaster and I get this error: 'base_model.model.model.model.embed_tokens' #186

Open
rumcode opened this issue Jul 17, 2024 · 1 comment

rumcode commented Jul 17, 2024

I slightly modified the code I copied from the forecaster page, and I run into an error. Any suggestions? Thanks in advance.
The code is:

"""
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch


base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    trust_remote_code=True,
    device_map="auto",
    torch_dtype=torch.float16,
    token='mytoken'  # optional if you have enough VRAM
)

tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-chat-hf', token='mytoken')
print("hi")
model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora', token='mytoken')
print("hi2")
model = model.eval()


The error messages are:

C:\Users\xx\AppData\Roaming\Python\Python311\site-packages\torch\nn\modules\module.py:2047: UserWarning: for base_model.model.model.layers.31.mlp.down_proj.lora_B.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Traceback (most recent call last):

  File c:\ProgramData\Anaconda3\Lib\site-packages\spyder_kernels\py3compat.py:356 in compat_exec
    exec(code, globals, locals)

  File c:\users\rruffley 3677\downloads\fingpt20240715try2.py:23
    model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',token='mytoken')

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:430 in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:1022 in load_adapter
    self._update_offload(offload_index, adapters_weights)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:908 in _update_offload
    safe_module = dict(self.named_modules())[extended_prefix]

KeyError: 'base_model.model.model.model.embed_tokens'
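The KeyError hints that peft's disk-offload bookkeeping is building a module path with one `model` segment too many: for a PeftModel wrapping LlamaForCausalLM, the embedding is presumably registered as `base_model.model.model.embed_tokens` (PeftModel → LlamaForCausalLM → LlamaModel → embed_tokens), while `_update_offload` queries an extended prefix with an extra `.model`. A minimal sketch of the mismatch, assuming that standard module layout:

```python
# The key peft fails to find, copied from the traceback above:
failing_key = 'base_model.model.model.model.embed_tokens'

# Assumed actual registration for PeftModel -> LlamaForCausalLM -> LlamaModel:
expected_key = 'base_model.model.model.embed_tokens'

# The failing key looks like the expected key with a duplicated 'model.' segment:
assert failing_key == expected_key.replace('model.embed', 'model.model.embed', 1)
print(expected_key)
```

Since `_update_offload` is only reached when `device_map="auto"` has offloaded some layers, one workaround may be to keep the whole base model on a single device (e.g. `device_map={'': 0}` if you have enough VRAM, or `device_map='cpu'`), or to upgrade peft, so that the offload path is never taken.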
pratikm778 commented:
Same error here. Did you find any workaround?
