Inference API returns an incorrect error message when inference ID = model ID #111312
Comments
Hello, I'm looking for my first contribution. Can I work on this?
elasticsearchmachine added the Team:SearchOrg (Meta label for the Search Org (Enterprise Search)) and Team:Search - Inference labels, and removed the needs:triage (Requires assignment of a team area label) label on Jul 29, 2024.
Pinging @elastic/search-inference-team (Team:Search - Inference)
Pinging @elastic/ent-search-eng (Team:SearchOrg)
astefan added the :ml (Machine learning) label, and removed the Team:SearchOrg (Meta label for the Search Org (Enterprise Search)), :SearchOrg/Inference (Label for the Search Inference team), and Team:Search - Inference labels on Jul 29, 2024.
Pinging @elastic/ml-core (Team:ML)
@ersalazar, that would be awesome! Let me know if you need any help, and please assign me to your PR when you create it.
Elasticsearch Version
8.15
Installed Plugins
No response
Java Version
bundled
OS Version
Mac OS
Problem Description
When creating an inference endpoint using an `inference_id` that is identical to the `model_id`, the API returns a misleading error suggesting that the inference API is trying to re-deploy the model after eland has already deployed it to ML.

Steps to Reproduce

Create an inference endpoint whose `inference_id` is identical to the `model_id` of a model that eland has already deployed (see the sketch below). The request fails with a misleading error message.
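For illustration, here is a minimal sketch of the kind of request involved, assuming a local 8.15 cluster with security disabled and a model previously imported via eland under the hypothetical ID `my-text-embedding-model`; the URL, task type, and service settings below are assumptions for the sketch, not values taken from the original report:

```python
import requests

ES_URL = "http://localhost:9200"          # assumption: local dev cluster, security disabled
MODEL_ID = "my-text-embedding-model"      # hypothetical model already deployed via eland

# Reproduce the issue: reuse the model_id as the inference_id in the endpoint path.
# PUT _inference/<task_type>/<inference_id>
resp = requests.put(
    f"{ES_URL}/_inference/text_embedding/{MODEL_ID}",
    json={
        "service": "elasticsearch",
        "service_settings": {
            "model_id": MODEL_ID,   # identical to the inference_id in the path above
            "num_allocations": 1,
            "num_threads": 1,
        },
    },
)

# Instead of a clear validation error about the inference ID matching the model ID,
# the response reportedly suggests the model is being deployed a second time.
print(resp.status_code)
print(resp.json())
```

Because the path segment (the inference ID) and `service_settings.model_id` are the same string, the endpoint creation reportedly fails with an error about the model already being deployed, rather than a clear validation message such as the one suggested below.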
The error should say something like the following instead:
Inference ID [<inference_id>] must be unique and must not match the <model_id>.
Logs (if relevant)
No response