Gemini LM problems #1977

Closed
xaviermehaut opened this issue Dec 23, 2024 · 1 comment

Comments

@xaviermehaut

Hello,
I used to use Gemini with dspy as follows:

```python
import dspy
from dspy import LM
from dsp.modules.google import Google

api_key = "XXXX"

# Google Gemini API configuration
config = {
    "model": "gemini-1.5-pro",
    "api_key": api_key,
    "temperature": 0,
    "top_p": 0.95,
    "top_k": 40,
}

lm = Google(**config)
dspy.settings.configure(lm=lm)
```

But it is written that this way of doing things will be deprecated soon...
I then tried with dspy.LM(...) but didn't succeed.
Here is the trace:

```
Traceback (most recent call last):
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\main.py", line 1413, in completion
    response = openai_text_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\llms\OpenAI\openai.py", line 1704, in completion
    raise OpenAIError(
litellm.llms.OpenAI.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: AIzaSyCx***************************-uoo. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 903, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\main.py", line 3009, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2116, in exception_type
    raise e
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 343, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: Text-completion-openaiException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: AIzaSyCx************************-uoo. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\xmehaut\Documents\OneDrive\BI-Missions\SpiceDoc\workspace\langflow-main\debug.py", line 36, in <module>
    prediction = predict_emotion(text=example_text)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\utils\callback.py", line 202, in wrapper
    return fn(instance, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\predict\predict.py", line 154, in __call__
    return self.forward(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\predict\predict.py", line 188, in forward
    completions = v2_5_generate(lm, config, signature, demos, kwargs, _parse_values=self._parse_values)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\predict\predict.py", line 295, in v2_5_generate
    return adapter(
           ^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\adapters\base.py", line 20, in __call__
    outputs = lm(**inputs, **lm_kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\utils\callback.py", line 202, in wrapper
    return fn(instance, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\clients\lm.py", line 94, in __call__
    response = completion(
               ^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\clients\lm.py", line 235, in cached_litellm_text_completion
    return litellm_text_completion(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\dspy\clients\lm.py", line 257, in litellm_text_completion
    return litellm.text_completion(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 1013, in wrapper
    raise e
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 903, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\main.py", line 4196, in text_completion
    response = completion(
               ^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 993, in wrapper
    return litellm.completion_with_retries(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\main.py", line 3042, in completion_with_retries
    return retryer(original_function, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\tenacity\__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\tenacity\__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\tenacity\__init__.py", line 418, in exc_check
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\tenacity\__init__.py", line 185, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\tenacity\__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 1013, in wrapper
    raise e
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\utils.py", line 903, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\main.py", line 3009, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2116, in exception_type
    raise e
  File "C:\Users\xmehaut\AppData\Local\Programs\Python\Python312\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 343, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: Text-completion-openaiException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: AIzaSyCx************************-uoo. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
```
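(For context: the `Text-completion-openaiException` above shows the request reaching OpenAI's text-completion endpoint with the Google `AIzaSy...` key, which is why OpenAI returns the 401. The failing `dspy.LM(...)` call itself is not shown in the issue, but it was presumably something along these lines; this is a hypothetical reconstruction, not the actual code:)

```python
import dspy

# Hypothetical sketch of the failing attempt (the real call is not shown in
# the issue). The trace above shows the request ending up on OpenAI's
# text-completion endpoint carrying the Google "AIzaSy..." key, which OpenAI
# rejects with the 401 invalid_api_key error.
lm = dspy.LM("gemini-1.5-pro", api_key="XXXX", temperature=0)
dspy.settings.configure(lm=lm)
```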

@okhat
Collaborator

okhat commented Dec 23, 2024

Look up how to use Gemini via LiteLLM:

https://docs.litellm.ai/docs/providers/vertex

https://docs.litellm.ai/docs/providers/gemini

Then use that inside dspy.LM(...)
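
For reference, with a Google AI Studio key the LiteLLM route is the `gemini/` provider prefix, so the new-style equivalent of the original snippet would look roughly like this (a sketch, not the only option; the Vertex AI route uses the `vertex_ai/` prefix and GCP credentials instead):

```python
import dspy

# Sketch: Gemini through LiteLLM's `gemini/` provider prefix, replacing the
# deprecated dsp.modules.google.Google wrapper. Model name and sampling
# parameters are carried over from the snippet at the top of the issue.
lm = dspy.LM(
    "gemini/gemini-1.5-pro",
    api_key="XXXX",   # Gemini API key from Google AI Studio
    temperature=0,
    top_p=0.95,
    top_k=40,
)
dspy.settings.configure(lm=lm)
```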

@okhat closed this as completed Dec 23, 2024