[FR]: I don't have a Bedrock client, rather a custom wrapper, and I call the model using a custom API wrapper #1449
Comments
Hi @SridhanyaG!

```python
from opik import track

@track
def generate_completion(...):
    ...
```

If the SDK is configured, this is enough to start logging :) Here is the doc page about it: https://www.comet.com/docs/opik/tracing/log_traces#using-function-decorators
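To make that concrete, here is a minimal sketch for the wrapper described in this issue, assuming it posts JSON to an internal completion endpoint with `requests`; `BASE_URL`, the payload fields, and the bearer token are placeholders, not part of the Opik SDK.

```python
import json
import requests
from opik import track

BASE_URL = "https://llm-gateway.example.com"  # placeholder for the internal gateway

@track
def generate_completion(payload: dict, bearer_token: str) -> dict:
    """Call the internal completion endpoint; @track records inputs and outputs as a trace."""
    headers = {
        "Authorization": bearer_token,
        "Content-Type": "application/json",
    }
    response = requests.post(
        BASE_URL + "/completion",
        data=json.dumps(payload),
        headers=headers,
    )
    return response.json()

# The model name travels inside the payload, as described in the issue.
result = generate_completion(
    {"model": "<model-name>", "prompt": "Hello"},
    bearer_token="Bearer <token>",
)
```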
Hi @alexkuzmik, thanks for the quick response.
@SridhanyaG it's technically possible to do today via our low-level API, but that API is not very convenient right now and only supports usage information in the OpenAI format. I recommend waiting a couple of days. Next week we are going to release this PR, and we will offer a nice API that lets you log usage for Anthropic easily and even see the actual LLM call cost in the UI :)
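For reference, a rough sketch of what logging usage through the low-level client might look like today, under the assumption that `opik.Opik()` exposes `trace()`/`span()` and that `usage` must use the OpenAI key names; treat it as illustrative rather than the nicer API promised above.

```python
import opik

client = opik.Opik()

# Create a trace for one call to the custom wrapper.
trace = client.trace(
    name="generate_completion",
    input={"prompt": "Hello"},
    output={"text": "Hi there!"},
)

# Attach an LLM span carrying usage in OpenAI format
# (prompt_tokens / completion_tokens / total_tokens).
trace.span(
    name="llm-call",
    type="llm",
    input={"prompt": "Hello"},
    output={"text": "Hi there!"},
    usage={
        "prompt_tokens": 12,
        "completion_tokens": 7,
        "total_tokens": 19,
    },
)

trace.end()
```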
Hi @alexkuzmik, as of now we have a hosted Postgres RDS DB in AWS. Does this tool support a Postgres DB? At the moment I am evaluating with MySQL, but it would be cheaper for us to reuse the Postgres DB rather than spin up a new RDS MySQL DB.
Hi @alexkuzmik, regarding Postgres DB support, can you let me know if it is possible?
Hi @SridhanyaG, at the moment we only support RDS MySQL.
Thanks @Nimrod007 for confirming.

```python
import json

headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer 0001YhNpN4YyzbKf'
}
```
@SridhanyaG please see the documentation: https://www.comet.com/docs/opik/evaluation/evaluate_prompt
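In case it helps, here is a hedged sketch of wiring the custom wrapper into an evaluation run via the general `evaluate` entry point (a nearby alternative to the `evaluate_prompt` flow linked above); the dataset name, item fields, metric choice, and the stubbed `generate_completion` are illustrative assumptions, not taken from this thread.

```python
import opik
from opik import track
from opik.evaluation import evaluate
from opik.evaluation.metrics import Equals

@track
def generate_completion(payload: dict, bearer_token: str) -> dict:
    # Stub standing in for the custom API wrapper from this issue.
    return {"completion": "Paris"}

client = opik.Opik()
dataset = client.get_or_create_dataset(name="custom-wrapper-eval")  # illustrative name
dataset.insert([
    {"input": "What is the capital of France?", "expected_output": "Paris"},
])

def evaluation_task(item: dict) -> dict:
    result = generate_completion(
        {"model": "<model-name>", "prompt": item["input"]},
        bearer_token="Bearer <token>",
    )
    # Keys consumed by the Equals metric below.
    return {"output": result["completion"], "reference": item["expected_output"]}

evaluate(
    dataset=dataset,
    task=evaluation_task,
    scoring_metrics=[Equals()],
)
```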
@SridhanyaG could you please share an example of the token usage dictionary that is provided by Bedrock for your model?
Hi @alexkuzmik, I have shown how the Bedrock models are invoked through our API in a previous comment: #1449 (comment)
Hi @alexkuzmik

```python
@track(type="llm")  # important to specify "llm"!
```
@SridhanyaG please try again to log the usage, we made a release. If you are using Bedrock, cost tracking in USD will not be available (Bedrock is a separate provider with its own prices, even if it hosts Anthropic models), but the token usage should be logged and displayed. Please let me know if the new version is working.
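A sketch of how the wrapper might report Bedrock's Anthropic-style token counters on its span, assuming `opik_context.update_current_span` accepts a `usage` dictionary and that mapping `input_tokens`/`output_tokens` onto the OpenAI-style keys is acceptable; `call_internal_llm_api` is a hypothetical helper standing in for the HTTP call.

```python
from opik import track, opik_context

@track(type="llm")  # mark the span as an LLM call so usage shows up in the UI
def generate_completion(payload: dict, bearer_token: str) -> dict:
    response = call_internal_llm_api(payload, bearer_token)  # hypothetical helper
    usage = response.get("usage", {})  # Bedrock/Anthropic style: input_tokens / output_tokens
    opik_context.update_current_span(
        usage={
            "prompt_tokens": usage.get("input_tokens", 0),
            "completion_tokens": usage.get("output_tokens", 0),
            "total_tokens": usage.get("input_tokens", 0) + usage.get("output_tokens", 0),
        }
    )
    return response
```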
Proposal summary
I don't access Bedrock models using a boto client; rather, we have wrappers.
We pass the model name in the request payload and get the response back.
Any workaround to initialize a client would help.
My custom API wrapper is `def generate_completion(payload, , bearer_token)`, and the payload carries the model name and params. How do I trace it, and how do I initialize Opik? I have configured Opik with local=True. Kindly guide me, I am evaluating your tool.

```python
url = + "/completion"
payload = json.dumps(payload)  # needed to convert into a string
headers = {"Authorization": bearer_token, "Content-Type": "application/json"}
```

This is the only way I can access the LLM; kindly suggest how I can get logging and traceability.
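For the local setup mentioned above, a minimal sketch of initializing the SDK, assuming `opik.configure` accepts `use_local`; once configured, decorating `generate_completion` with `@track` as shown in the comments above is what produces the traces.

```python
import opik

# Point the SDK at the locally hosted (self-hosted) Opik instance.
opik.configure(use_local=True)
```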
Motivation
No response