
[FR]: I don't have a Bedrock client, rather a custom wrapper, and I call the model using a custom API wrapper #1449

Open
SridhanyaG opened this issue Mar 4, 2025 · 12 comments
Labels
enhancement New feature or request

Comments

@SridhanyaG

SridhanyaG commented Mar 4, 2025

Proposal summary

I don't access Bedrock models using the boto client; instead we have wrappers.
We pass the model name in the request payload and get a response.
Any workaround to initialize the client would help.
My custom API wrapper is `def generate_completion(payload, bearer_token)`, and the payload carries the model name and parameters. How do I trace it, and how can I initialize Opik? I have configured Opik with local=True. Kindly guide me, I am evaluating your tool.

    url = base_url + "/completion"
    payload = json.dumps(payload)  # the API expects a JSON string
    headers = {"Authorization": bearer_token, "Content-Type": "application/json"}

    response = requests.request(
        "POST",
        url,
        headers=headers,
        data=payload,
        cert=(cert_file_path, key_file_path),
    )

This is the only way I can access the LLM. Kindly suggest how I can get logging and traceability.

Motivation

No response

@SridhanyaG added the enhancement (New feature or request) label Mar 4, 2025
@alexkuzmik
Collaborator

alexkuzmik commented Mar 4, 2025

Hi @SridhanyaG!
Since it's not the official boto API, our direct integration won't help here. However, you can simply decorate your generate_completion function like this:

from opik import track

@track
def generate_completion(...):
    ...

If the SDK is configured, that will be enough to start logging :)

Here is the doc page about it: https://www.comet.com/docs/opik/tracing/log_traces#using-function-decorators
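[Editor's note] For readers unfamiliar with decorator-based tracing: conceptually, `@track` wraps the function and records its inputs and outputs as a span (Opik's real implementation does much more, e.g. nesting, timing, and flushing to a backend). A minimal stdlib stand-in, where `traces` is a hypothetical in-memory store and `generate_completion` is stubbed instead of doing the real HTTP call:

```python
import functools

traces = []  # hypothetical in-memory trace store, stand-in for Opik's backend


def track(func):
    """Record each call's inputs and output, like a much-simplified opik.track."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        traces.append({
            "name": func.__name__,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
        })
        return result
    return wrapper


@track
def generate_completion(payload, bearer_token):
    # stubbed response instead of the real wrapper's HTTP POST
    return {"completion": "ok", "model": payload.get("model")}


generate_completion({"model": "anthropic.claude-3-haiku-v1:0"}, "token")
```

The point is that the decorated function's signature and behavior are unchanged; tracing happens transparently around each call.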

@SridhanyaG
Author

Hi @alexkuzmik

Thanks for the quick response.
I was able to get the trace records.
I am using Claude Sonnet 3.5, which gives me tokens and other info.
Is there any API or option to record these tokens etc. in the LLM tab?

@alexkuzmik
Collaborator

@SridhanyaG it's technically possible to do today via our low-level API, but that API is not very convenient right now and supports usage information only in OpenAI format.
The good news is that I'm currently working on a major token-usage handling PR (#1483).

I recommend waiting a couple of days. Next week we are going to release this PR, and we will offer a nice API that lets you log usage for Anthropic easily and even see the actual LLM call costs in the UI :)

@SridhanyaG
Author

Hi @alexkuzmik
Thanks again, will wait for it.

As of now we have a hosted Postgres RDS DB in AWS. Does this tool support a Postgres DB? I am currently evaluating with MySQL, but it would be cheaper for us to reuse the existing Postgres DB rather than spin up a new RDS MySQL DB.

@SridhanyaG
Author

Hi @alexkuzmik

Regarding Postgres DB support, can you let me know if it is possible?

@Nimrod007
Collaborator

Hi @SridhanyaG, at the moment we only support RDS MySQL.
In addition, you can deploy MySQL as part of the application (open source) via Helm (https://github.com/comet-ml/opik/blob/main/deployment/helm_chart/opik/values.yaml#L186) or use AWS-managed MySQL.

@SridhanyaG
Author

SridhanyaG commented Mar 10, 2025

Thanks @Nimrod007 for confirming.
I added a prompt in the prompt library.
Now please guide me on how I can evaluate it. My app does not have an LLM model; in my case it is a wrapper.

### Guide: how can I evaluate prompts?

Example:
import requests
import json

url = "url of the wrapper/completion"

payload = json.dumps(
    {
        "max_tokens": 800,
        "model": "anthropic.claude-3-haiku-v1:0",
        "messages": [
            {
                "role": "system",
                "content": "You are an Intelligent bot - which get only precise answer to the point no explanation just the answer."
            },
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": "whats the capital of India"
                    }
                ]
            }
        ]
    }
)
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer 0001YhNpN4YyzbKf"
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)

@alexkuzmik
Collaborator

@SridhanyaG please see the documentation https://www.comet.com/docs/opik/evaluation/evaluate_prompt
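[Editor's note] The linked page documents Opik's own evaluation API; as a plain illustration of what prompt evaluation amounts to (run the completion wrapper over a small dataset and score each output with a metric), here is a hedged stdlib sketch. All names here (`call_wrapper`, `exact_match`, the dataset) are hypothetical, and `call_wrapper` is stubbed where the real code would do the HTTP POST shown above:

```python
# Hypothetical prompt-evaluation loop; not Opik's API.
dataset = [
    {"input": "whats the capital of India", "expected": "New Delhi"},
    {"input": "whats the capital of France", "expected": "Paris"},
]


def call_wrapper(question):
    # Stubbed in place of the real completion wrapper's HTTP call.
    canned = {
        "whats the capital of India": "New Delhi",
        "whats the capital of France": "Paris",
    }
    return canned.get(question, "")


def exact_match(output, expected):
    """Score 1.0 when the output matches the expected answer, else 0.0."""
    return 1.0 if output.strip().lower() == expected.strip().lower() else 0.0


scores = [exact_match(call_wrapper(item["input"]), item["expected"]) for item in dataset]
accuracy = sum(scores) / len(scores)
print(accuracy)
```

Opik's `evaluate_prompt` wraps exactly this kind of loop (dataset, task, metrics) with dataset management and UI reporting on top; see the linked documentation for the real API.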

@alexkuzmik
Collaborator

@SridhanyaG could you please share the example of the token usage dictionary that is being provided by bedrock for your model?

@SridhanyaG
Author

Hi @alexkuzmik

I have added how the Bedrock models are invoked using our API in a previous comment: #1449 (comment)

@SridhanyaG
Author

SridhanyaG commented Mar 11, 2025

Hi @alexkuzmik
No luck with the code below.
The span gets added when using @track(type="llm") and the tokens appear in the span, but the tool is not updating the token usage in the metrics table, i.e. the metrics are not updated. Please guide.
from opik import track, opik_context, LLMProvider

@track(type="llm")  # important to specify "llm"!
def f():
    opik_context.update_current_span(
        usage={
            "input_tokens": 200,
            "output_tokens": 100,
            "cache_creation_input_tokens": 50,
            "cache_read_input_tokens": 30,
        },
        provider=LLMProvider.ANTHROPIC,  # just "anthropic" also works, but enums avoid mistakes
    )
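[Editor's note] An earlier comment mentions that the low-level API historically accepted usage only in OpenAI format (prompt_tokens / completion_tokens / total_tokens). As a hedged illustration of what translating an Anthropic-style usage dict into that shape could look like, here is a small helper; folding the cache counters into prompt_tokens is a simplifying assumption, not Opik's documented behavior:

```python
def anthropic_usage_to_openai(usage):
    """Map Anthropic-style token counts to OpenAI-style field names.

    Cache-related counters have no direct OpenAI equivalent; this sketch
    folds them into prompt_tokens (a simplifying assumption).
    """
    prompt = (
        usage.get("input_tokens", 0)
        + usage.get("cache_creation_input_tokens", 0)
        + usage.get("cache_read_input_tokens", 0)
    )
    completion = usage.get("output_tokens", 0)
    return {
        "prompt_tokens": prompt,
        "completion_tokens": completion,
        "total_tokens": prompt + completion,
    }


converted = anthropic_usage_to_openai({
    "input_tokens": 200,
    "output_tokens": 100,
    "cache_creation_input_tokens": 50,
    "cache_read_input_tokens": 30,
})
print(converted)
```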

@alexkuzmik
Collaborator

@SridhanyaG please try logging the usage again, we made a release. If you are using Bedrock, cost tracking in USD will not be available (Bedrock is a separate provider with its own prices, even if it hosts Anthropic models), but the token usage should be logged and displayed. Please let me know if the new version is working.
